Special report: The secret lives of Facebook moderators in America

February 25 · Issue #294
The Interface
Each day I sign off this newsletter asking you to send me your tips. In December, a reader took me up on my request, and messaged me asking to chat. The reader is one of Facebook’s 15,000 content moderators, and had grown concerned about the working conditions at the office.
Over the next three months, I would interview a dozen current and former moderators at Facebook’s Phoenix site, which is operated by an outsourcing company named Cognizant. Today we published my report, and I wanted to bring it directly to your inbox.
Here were some of my key findings:
  • Moderators in Phoenix will make just $28,800 per year, while the median Facebook employee earns $240,000 in salary, bonuses, and stock.
  • Employees can be fired after making just a handful of errors a week, and those who remain live in fear of former colleagues returning to seek vengeance. One man we spoke with started bringing a gun to work to protect himself.
  • In stark contrast to the perks lavished on Facebook employees, team leaders micro-manage content moderators’ every bathroom break. Two Muslim employees were ordered to stop praying during their nine minutes per day of allotted “wellness time.”
  • Employees have been found having sex inside stairwells and a room reserved for lactating mothers, in what one employee describes as “trauma bonding.”
  • Moderators cope with seeing traumatic images and videos by telling dark jokes about committing suicide, then smoking weed during breaks to numb their emotions. Moderators are routinely high at work.
  • Employees are developing PTSD-like symptoms after they leave the company, but are no longer eligible for any support from Facebook or Cognizant.
  • Employees have begun to embrace the fringe viewpoints of the videos and memes that they are supposed to moderate. The Phoenix site is home to a flat Earther and a Holocaust denier. A former employee told me that he no longer believes 9/11 was a terrorist attack.
Last week, after describing my initial findings to Facebook, the company invited me to the Phoenix site to see it for myself. (I’m told it’s the first time a reporter has visited an American content moderation site since the company began expanding its community operations team globally in 2017.) There I spoke with employees who told me that, while they found the job consistently challenging, they also felt safe, supported, and confident that they would be able to advance their careers there.
Facebook did not invent the idea of using what are effectively call centers to handle content moderation. The model is also used by Twitter and by Google, and therefore YouTube, among others. It may have made more sense when the platforms were smaller and played a less vital role in global speech. But we live in a different world today, and the call center model is straining under the weight. As I put it in my conclusion:
 As first responders on platforms with billions of users, they are performing a critical function of modern civil society, while being paid less than half as much as many others who work on the front lines. They do the work as long as they can — and when they leave, an NDA ensures that they retreat even further into the shadows.
I hope you’ll read the full story in The Verge. And when you do, I’m eager to hear what you think. I view this story as a beginning, not an end, to my reporting — and I’ll be sharing additional reporting from the past three months here in the newsletter all week. As always, you can email me at casey@theverge.com, or just reply to this note.
