What They Can't Unsee
Scrolling through social media feeds or YouTube videos can be fun, relaxing, even an escape. But for thousands of content moderators around the world, it's a constant barrage of trauma. At every moment, they're protecting us from seeing the unthinkable: murders, suicides, rapes, torture of animals, children being harmed, things that are difficult to even mention. And these events may be unfolding in real time.
The rest of us see none of this, but for weeks and months on end, it's all moderators see.
Content moderators are tasked with reviewing flagged content and categorizing it for removal. The disturbing material ranges from live stabbings to animals being smashed against rocks. Watching even one such video can be scarring.
In 2019, The Verge interviewed content moderators working at Cognizant, a company contracted by Facebook to manage flagged content. The reporting found that many employees had developed a wide range of mental health issues: anxiety, depression, addiction, paranoia, and PTSD, among others. Strict yet ever-changing guidelines combined with impossible quotas, low pay, and inhumane working conditions to create an abusive environment, and fearing for their jobs on a daily basis added to the already overwhelming stress caused by the graphic content and its sheer volume.
Thousands of content moderators' lives have been forever altered by the mental health issues they developed while working for Facebook or for other platforms like Google and YouTube, and for years these companies tried to keep those harms out of public view. Only now are they being held (somewhat) accountable. In May 2020, Facebook agreed to pay a $52 million settlement to compensate moderators who developed mental health issues on the job. But is it enough to manage trauma after the fact?
Additional information: "The Trauma Floor" and "The Terror Queue" via The Verge