Answer:

working outside the home

hope this helps :)

A major change women experienced during the post-World War I era was that they started:

  • After World War I, women began taking jobs and working. During the war, many men had been called away to fight, leaving the positions they held vacant.
  • With few men available to fill these positions, women were recruited and offered jobs in their place.
  • Women began working in offices and earning their own living, which raised their status and brought them greater respect.
  • Women came to be regarded as an important part of society.