Wednesday, March 6, 2013

Is/Has the United States Been Imperialistic?

       No, I don't believe that the United States has ever been imperialistic. Although we have pursued expansion and gained land over time, we never sought that land in order to build colonies, which is the main goal of New Imperialism. In reality, we didn't want more territory for its own sake; we wanted the economic benefits that certain lands offered. America fought wars involving other countries so that it could trade with them, not to take their land.
