How did World War II change the United States' relationship with the rest of the world? What were the major events that prompted American involvement in the war, and what effect did the war have domestically? How did American foreign policy change after the end of World War II?
(minimum 200 words please, thank you!)