In: Economics
How did World War II change the United States’ relationship with the rest of the world? What were the major events which prompted American involvement in the war, and what effect did the war have domestically? How did American foreign policy change after the end of World War II?
(minimum 200 words please, thank you!)
America's involvement in World War II had a significant impact on the United States economy and labor force. The country was still recovering from the Great Depression, during which unemployment had peaked at roughly 25%. That changed quickly once the United States entered the war. American factories were retooled to manufacture goods for the war effort, and the unemployment rate fell almost immediately to about 10%. Before World War II, women had generally been discouraged from working outside the home; as more men were sent off to fight, women were hired to take over their assembly-line roles.
Workers found their own lives changing as industrial America changed: fewer of them produced goods, and more provided services. By 1956, a majority held white-collar positions as managers, teachers, salespeople, and government employees. Some companies offered guaranteed annual wages, long-term employment contracts, and other benefits. With these changes, labor militancy weakened and some class distinctions began to fade. Farmers, by contrast, faced difficult times. Gains in productivity led to a restructuring of the agricultural sector as farming became big business. Family farms, in turn, found it increasingly hard to compete, and growing numbers of farmers left the land.
In the post-World War II years, US foreign policy was generally guided by containment, the policy of keeping communism from spreading beyond the countries already under its influence. That policy contributed to a world divided by the Cold War, a decades-long rivalry between the US and the Soviet Union. With the dissolution of the Soviet Union in 1991, containment no longer made sense, and the United States has been redefining its foreign policy in the years since.
American foreign policy has changed dramatically since the days of George Washington. While Americans still pay heed to their beloved founder's guidance, the world is obviously not the same. Today, the many people who shape American foreign policy accept that the US is a member of a world community and cannot afford to ignore the importance of getting along with its neighbors.