Haiti Meaning
There is 1 meaning for the word Haiti
Meaning 1: a republic in the West Indies occupying the western part of the island of Hispaniola; achieved independence from France in 1804; the poorest and most illiterate nation in the Western Hemisphere
Synonyms: Republic of Haiti
