Meaning of Deutschland

Definition of Deutschland

(noun) the German-language name for Germany; a republic in central Europe, split into East Germany and West Germany after World War II and reunited in 1990
