Three things today — and I hope you’ll accept a truncated version of the newsletter as I flit around New York to various news outlets running my mouth about yesterday’s feature. (No one has ever complained that this newsletter was too short, so I’m optimistic.)
First, thanks to the 1.1 million people who have so far read Bodies in Seats, our investigation into working conditions at a Facebook content moderation site in Tampa, FL. The video report we made to accompany the story now has nearly 350,000 views, and I hope you’ll watch it if you haven’t already. My hope in writing this piece was to draw renewed attention to the sometimes desperate conditions of rank-and-file moderators, while also highlighting some of the people inside Facebook who are working to raise their standard of living. I’m grateful so many people have made time for it.
Discussion around the internet focused largely on the moderators’ descriptions of life working for Cognizant, the Facebook contractor that runs the site. “Throwing more moderators at the problem might help catch a few more horrifyingly violent or inappropriate videos from slipping through the cracks,” wrote Melanie Ehrenkranz in Gizmodo, “but in the absence of a fair and protected workforce, it sacrifices the well-being of these contractors for the company’s bottom line.” At Boing Boing, Xeni Jardin urged readers to delete their Facebook accounts.
I’ve received an overwhelming number of emails from current and former moderators at sites around the world, for companies including Facebook, Microsoft, and YouTube. Over the next few weeks, I’ll be following up with everyone who reached out to continue to hone my understanding of this work.
Here are just a few of the messages I received over the past day.
- "I also worked as a content moderator. Needless to say, I’ve seen some things that I can’t unsee. I also think that I may have PTSD as well. What are you guys doing about it?”
- “I’m burned down both physically and emotionally from working on such content for so long and I need to leave before my health becomes an issue.”
- “The part that I hated more was the fact they really don’t care about fake news. I would say that content moderators have more training about how much of a female breast is too much to stay on the platform than about fake news (which is zero).”
- “I worked for the Brazilian and the Portuguese markets. Just saw your new video about Facebook ex-employees and I would like to say that it’s the same here in Portugal, but the company who manages Facebook is Accenture.”
I also heard from some content moderators at YouTube, an area of intense interest to me. If you are one, or know one, and might be willing to share your experience, please get in touch.
Second, everyone is continuing to have a good time discussing the future of Facebook’s pseudo-cryptocurrency, Libra. With at least one top Democrat calling for a “moratorium” on its development, the Senate Banking Committee plans to hold a hearing July 16th in an effort to determine what, exactly, Libra is, with Facebook blockchain chief David Marcus set to testify. No other witnesses have been called, but let’s assume Diamond and Silk will be there for reasons no one can adequately explain.
Which, in turn, turns into: the problems with “open and decentralized” mean the solutions are big and closed. If Marcus and Libra are able to create a narrative of “As you know, crypto is inevitable. It’s happening. But here’s a better, friendlier, more responsible version that you can understand,” then they’ll have done their jobs.

Don’t fall for it. But do watch it. It’ll be a fantastic illustration of what’s becoming the central drama of our current age of software, and potentially of the next very long time: this new kind of scarcity that is “closed access,” which we simultaneously hate but want so very badly.
I continue to read Libra takes wherever I find them. (One hot take, from the online bank Current, is that Calibra stole its logo.)
I still cannot articulate a definition of Byzantine fault tolerance.
Third, in the Wall Street Journal, Rob Copeland reports that Google executives are considering “moving all children’s content into a separate product, the existing stand-alone YouTube Kids app, to better protect young viewers from objectionable videos,” according to “people briefed on the talks.” This option seems frankly insane, given that 500 hours of video are uploaded to the site every minute and YouTube has no way of immediately detecting which videos are kid-safe. And indeed, “a person close to the company” tells the Post that it’s “highly improbable.” (Janko Roettgers makes a good case against trying to segregate kids’ content.)
Still, that the idea is even being debated shows how fraught discussions around YouTube have become. Typically, companies want to resolve these concerns in the least disruptive way possible. I expect that YouTube will still look and feel largely the same a year from now. But I’m quite curious what concessions might now be under discussion.