Meaning of Namibia
(noun) a republic in southwestern Africa on the south Atlantic coast (formerly called South West Africa); achieved independence from South Africa in 1990; the greater part of Namibia forms part of the high plateau of southern Africa