How Did the Role of Women in America Really Change After the First World War?

Before the First World War, a woman's place was in the home. Her job was to clean and look after the house, take care of the children and have a meal ready for her husband when he came home from work. Women were not considered capable of working outside the home and held a lower status than men in society; they could not even vote. During the war, women had to take over many of the men's jobs, as most able-bodied men had gone to Europe to fight. This was a chance for the women of America to prove that they could do the jobs normally associated only with men, and that they could do them just as well as their male counterparts. Commonly held assumptions about women's place were being questioned, mainly by the younger generation, around the 1920s. Although it was not seen this way at the time, the First World War changed the face of society: women had gained more freedom.

Women were starting to question some of the laws that had been imposed on them up until then. Susan B. Anthony helped found the International Woman Suffrage Alliance in the early 1900s. It campaigned for women to be given the vote as well as men, and it later joined forces with the British suffragette movement, which was led by Emmeline Pankhurst. Alice Paul and Lucy Burns formed the Congressional Union (which later became the National Woman's Party) in 1912. This group wanted an amendment that would give the right to vote to all American women. By 1919 women could vote in 29 states and ...
