Meaning of Haiti

Definition of Haiti

(noun) a republic in the West Indies occupying the western part of the island of Hispaniola; achieved independence from France in 1804; the poorest and least literate nation in the Western Hemisphere
(noun) an island in the West Indies
