Q: When did Florida become a territory?
Best Answer

The Territory of Florida was an organized incorporated territory of the United States that existed from March 30, 1822, until March 3, 1845, when it was admitted to the Union as the State of Florida. The territory was originally the Spanish colony of La Florida, which was ceded to the United States as part of the 1819 Adams-Onís Treaty.

Wiki User ∙ 14y ago

More answers

Wiki User ∙ 10y ago

Florida became a territory on March 30, 1822. It remained a territory until 1845, when it was admitted to the Union as a state.

Anonymous ∙ 4y ago

1822

Related questions

What year did Tallahassee become capital of Florida?

Tallahassee was founded as the capital of the Florida Territory in 1824.


What happened when Florida became a US territory?

Under the 1819 Adams-Onís Treaty, Spain ceded Florida to the United States; it was then organized as the Florida Territory in 1822.


Was Florida a state or a territory?

Florida was first a territory (1822–1845) and then a state; it was admitted to the Union in 1845.


Where is the Florida territory located?

The Florida Territory is now known simply as Florida. It is located in the southern part of the United States.


What power claimed the territory of Florida?

Spain claimed the territory of Florida.


What is the difference between Florida and the Florida territory?

The Florida Territory included land that is now part of Alabama and Mississippi.


How did the U.S. acquire the territory of Florida?

The US acquired the territory of Florida when Spain ceded it under the 1819 Adams-Onís Treaty.


How did the United States gain the territory of Florida?

Spain, unable to control the territory, ceded Florida after the US made clear it would police the region itself.


What territory did the US acquire from Spain in 1819?

Florida


How do you get child support if the other party is not working in Arizona?

They may be working in another state, such as Florida; check there.


What country did the United States purchase from Florida?

The United States purchased NO country or territory from Florida (Florida is part of the United States).


Did the Louisiana Territory include all of Florida?

It included none of Florida.