By now I hope you’ve read (or at least Pocketed) The Trauma Floor, my investigation into working conditions at a Facebook content moderation site in Phoenix, Arizona. Thank you to everyone who read, shared, and discussed it: more than 700,000 people read the story in its first 24 hours, and it’s still going strong over at The Verge. It generated an enormous amount of discussion around the web — you can find a good roundup at this Google News page. It also spurred a great discussion on Twitter, as academics and former moderators shared their own experiences and perspectives on the subject.
All of this was, of course, a source of pride for me and my colleagues at The Verge, who hoped that our piece would spur discussion about the lives and livelihoods of contractors at Facebook and beyond. But it may have meant even more to my sources for the story, who followed the discussion closely and were moved by the kind words and compassion that so many commentators showed for them.
As the story gathered steam, I began to be inundated by messages from contractors around the world. Some work for Facebook at Cognizant or one of the other big outsourcing firms; others work or have worked on behalf of Apple, Google, YouTube, Twitter, and SoundCloud. They poured out their hearts to me via text messages and emails, and I am still working to get back to every single one of them.
Some of the stories they told me would be familiar to you from my investigation: tales of oppressive micromanagement; of struggling to get by on $120 a day; and of the lasting psychological damage brought about by repeated exposure to the worst that humanity can offer. Other stories caught me by surprise, though, and deserve fuller investigation.
I’ve spent the past day — and will spend the rest of the week — talking to other media outlets about my reporting. Today I spoke with NBC News, BuzzFeed’s AM2DM, Slate’s If/Then podcast, Cheddar, and Vox.com’s Today Explained. Tomorrow it’s on to CNN and CNBC. On Friday I’m talking to Justin Hendrix’s students at the NYU Media Lab.
Making these rounds allows me to tell the stories of these workers to an audience I usually can’t reach here or on The Verge. That, combined with responding to new sources eager to tell their stories, has occupied my every waking hour.
As a result, I’m offering an abbreviated version of the newsletter today, and will take off Wednesday and Thursday so that I can continue this work. I started The Interface both to highlight the reporting of others and to serve my own — and this week, in response to overwhelming interest in this subject, I feel my attention is best spent exploring how the call center model of content moderation is affecting workers around the world. So many folks who worked in this industry have told me that the discussion is long overdue — including two people who did this kind of work for MySpace, and are still haunted by it.
I hope this temporarily reduced schedule is OK with you — if you’ve ever felt overwhelmed by the newsletter, you may even enjoy a couple days off! I’ll be back next week with regular editions, before taking my annual Spring Break trip to Austin for South by Southwest.
The biggest question I’ve received from colleagues in the media since I published my piece is what tech companies can do to right the imbalance of power between full-time employees and contractors. My answer is simple: pay them more. Companies that earn billions of dollars in quarterly profits can afford to double the salary of someone making $28,800 a year, and they should.
A higher salary is not, of course, a panacea for the side effects of long-term content moderation. But it would make for a very good start.
“We have investigated the specific workplace issues raised in a recent report, previously taken action where necessary and have steps in place to continue to address these concerns and any others raised by our employees,” Cognizant said.
As part of that investigation, I went to Facebook’s Menlo Park, California, headquarters for on-the-record interviews with the people in charge of the company’s sprawling content moderation operation. One of those leaders was Brian Doegan, who at the time was Facebook’s “Global Learning Leader.” Doegan was in charge of Facebook’s content moderator training practices — essentially, he helped set up the guidelines by which all moderators would be trained, and the best practices for actually training them.