How did World War 1 impact America on the social, economic, and political fronts?

I believe America's main impact on the war was that Allied morale was low and the Allies were suffering the effects of heavy losses, while America entered fresh with a very large supply of men. If the United States hadn't joined, it is possible that Germany and the Austro-Hungarian Empire could have won the war.
Wiki User

13y ago