Florida Meaning

There is 1 meaning for the word Florida

Meaning 1 : a state in the southeastern United States between the Atlantic and the Gulf of Mexico; one of the Confederate states during the American Civil War

    Synonyms : Everglade State,  FL,  Fla.,  Sunshine State