Definition of Germany
(noun) a republic in central Europe; split into East Germany and West Germany after World War II and reunited in 1990