On Friday I published a report confirming what had been obvious to anyone who spends much time talking to people who work in content moderation: the job causes post-traumatic stress disorder. The report was based largely on an extraordinary document in which Accenture, which sells its content moderation services to Facebook, YouTube, and Twitter, among others, requires employees to acknowledge that their work can lead to PTSD — and to tell their managers about any negative changes to their mental health. Labor law experts told me the document could be construed as an illegal requirement to disclose a disability.
At the time, I had managed to confirm only that the document was distributed to workers in Austin, Texas, as part of Accenture’s contract with YouTube. A few hours after my report, the Financial Times reported that it had been distributed to moderators for Facebook in Europe as well. Around that time, I confirmed that workers on the Facebook project in Texas had also been asked to sign it. Facebook told me it was unaware of any documents that Accenture made its workers sign, but declined to comment further.
Throughout my reporting, I attempted to pin down Accenture on which workers, exactly, it had warned about PTSD. The company’s PR team told me it regularly asked workers to sign “these types of documents,” but wouldn’t speak to the specific risk of PTSD. Indeed, no Accenture flack would ever use the word “PTSD” in an email to me.
But with the confirmation that the document was distributed to both YouTube and Facebook workers, it seems clear that the company has acknowledged that its workplace is unsafe for some portion of its workforce. As to how many people are affected, and which roles are most likely to result in long-term mental health issues, Accenture has refused all comment.
Whenever I write about these issues, people write to me to ask what the solution is. We will clearly need human moderators for the foreseeable future. How do we create jobs that are safe for the maximum number of workers? After speaking with more than 100 moderators, academics, labor experts, and company executives, here are five things I wish companies would do.
First, invest in research. We know that content moderation leads to PTSD, but we don’t know how frequently the condition occurs, or which roles are most at risk for debilitating mental health issues. Nor have companies investigated what level of exposure to disturbing content might be considered “safe.” It seems likely that those with sustained exposure to the most disturbing kinds of photos and videos — violence and child exploitation — would be at the highest risk for PTSD. But companies ought to fund research into the issue and publish it. They’ve already confirmed that these jobs make the workforce ill — they owe it to their workers to understand how and why that happens.
Second, properly disclose the risk. Whenever I speak to a content moderator, I ask what the recruiter told them about the job. The results are all over the map. Some recruiters are quite straightforward in their explanations of how difficult the work is. Others actively lie to their recruits, telling them that they’re going to be working on marketing or some other more benign job. It’s my view that PTSD risk should be disclosed to workers in the job description. Companies should also explore suggesting that these jobs are not suitable for workers with existing mental health conditions that could be exacerbated by the work. Taking the approach that Accenture has — asking workers to acknowledge the risk only after they start the job — strikes me as completely backwards.
Third, set a lifetime cap for exposure to disturbing content. Companies should limit the amount of disturbing content a worker can view over a career in content moderation, using research-based guidelines to set safe levels of exposure. Determining those levels is likely going to be difficult — but companies owe it to their workforces to try.
Fourth, develop true career paths for content moderators. If you’re a police officer, you can be promoted from beat cop to detective to police chief. But if you’re policing the internet, you might be surprised to learn that content moderation is often a dead-end career. Maybe you’ll be promoted to “subject matter expert” and be paid a dollar more an hour. But workers rarely make the leap to other jobs they might be qualified for — particularly staff jobs at Facebook, Google, and Twitter, where they could make valuable contributions in policy, content analysis, trust and safety, customer support, and more.
If content moderation felt like the entry point to a career rather than a cul-de-sac, it would be a much better bargain for workers putting their health on the line. And every tech company would benefit from having workers at every level who have spent time on the front lines of user-generated content.
Fifth, offer mental health support to workers after they leave the job. One reason content moderation jobs offer a bad bargain to workers is that you never know when PTSD might strike. I’ve met workers who first developed symptoms after a year, and others who had their first panic attacks during training. Naturally, these employees are among the most likely to leave their jobs — either because they found other work, or because their job performance suffered and they were fired. But their symptoms will persist indefinitely — in December I profiled a former Google moderator who still had panic attacks two years after quitting. Tech companies need to treat these workers like the US government treats veterans, and offer them free (or heavily subsidized) mental health care for some extended period after they leave the job.
Not all will need or take advantage of it. But by offering post-employment support, these companies will send a powerful signal that they take the health of all their employees seriously. And given that these companies only function — and make billions — on the backs of their outsourced content moderators, taking good care of them during and after their tours of duty strikes me as the very least that their employers can do.