Before armies began accepting female soldiers, wars had a strange, double-edged impact on women. With employers scrambling to fill vacancies in their workforces, many women found their first taste of gainful employment in the shadow of the World Wars. At the same time, the mortal toll inherent to war, which could claim their husbands and sons, was always lurking in the backs of their minds. In America, World War II marks the cultural shift when women began entering the workforce in droves, but in the UK, World War I was the catalyst for women's rights. Yet even though the loosening of gender roles can be traced back to World War I, history has become rife with misconceptions about how the era shaped women's rights.
In the video linked above, the YouTube channel Imperial War Museums details women's roles in the UK during World War I.