Women’s roles began to change in the 1920s. Before this time, many women stayed home and took care of the children. They rarely shared their views on political or contemporary issues in public, and most were willing to assume a low-key role.
This began to change during World War I. With so many men away fighting, women stepped into jobs outside the home that men had previously filled. After the war, many women were no longer willing to play a low-key or subservient role. They began to express their views on political and contemporary issues in public. They wore shorter dresses. They began to smoke and drink in public. Women also gained the right to vote with the Nineteenth Amendment in 1920. Women were also less willing to stay in an unhappy marriage, which led to an increase in the number of divorces.
Some people felt these changes were not good for the country. They believed values were in decline, that these changes represented a movement away from a religious way of living, and they worried about the decline of the American family. Other people saw these changes as a sign of progress. They felt it was time for women to expand their roles in society and believed women should have more freedom to do what they wanted.