For U.S. women, workplaces can be hazardous to physical and mental health

On Behalf of | Jan 6, 2015 | Workplace Illness |

Prior to the 1960s and the women’s liberation movement, the majority of U.S. women did not work outside the home. In fact, information from the Centers for Disease Control and Prevention shows that during the 1950s, only roughly 34 percent of U.S. women held a job outside the home. Today, the CDC estimates that number is around 60 percent as more women choose to pursue higher degrees and delay marriage and starting a family. For some working women, the financial gains associated with working outside the home may come at a cost to their personal health and safety.

While women today work in nearly every field and profession, the majority work in positions that increase their risk of suffering workplace illnesses and injuries such as “carpal tunnel syndrome, respiratory diseases and infectious diseases.” Working women are also more prone to developing work-related mental health problems like anxiety and depression, which may stem from stressors associated with balancing work and family life as well as possible workplace discrimination and harassment.

With more than 13 million female employees, the health care industry remains a dominant employer of women, and those who choose to work in health care are at an increased risk of suffering certain injuries and medical conditions. The CDC reports that women hold more than 90 percent of U.S. nursing and nursing aide jobs. The occupational hazards associated with the nursing profession are well documented and include musculoskeletal injuries and disorders, exposure to illnesses and diseases, and workplace violence.

These potential health and safety risks are compounded by long work hours, heavy lifting requirements, procedural and policy failures and increased contact with and limited protection from members of the public.

San Diego area women who work in the health care industry and have suffered work-related injuries may choose to discuss their case with an attorney. Employers are required to provide employees in the health care industry with certain protections, and in cases where an employer’s action or inaction may have contributed to a worker’s injuries, legal action may be warranted.

Source: “Women’s Safety and Health Issues at Work,” 2015; “Women’s Safety and Health Issues at Work: Job Area: Health Care,” 2015
