Germany, Italy, and America; these countries became unified during the 19th century.
America is not imperialist. America has not claimed any land as its own since Hawaii and Alaska. Albeit that was in the mid-20th century, America did not claim any sizable amount of land for most of the 19th century either. While America was a "power," it was not imperialist. So the answer to your question is "no." Also recategorizing.
To gain respect, to expand territory, for adventure, for military bases, and to spread ideas (Christianity).
The light bulb
Sources of raw materials and markets :)
Liberia and Sudan.
1873 was in the 19th century (1801-1900).
The Ottoman Empire
Electricity/light bulb
Imperial nations, such as the United Kingdom and France, benefited the most during the 19th century because they exploited their colonies for resources. The colonies of those imperial nations benefited the least because they were the ones being exploited.
It led to nations gaining their independence from the Ottoman Empire.
The 18th century