Why Facebook needs a Supreme Court for content moderation

August 20 · Issue #190
The Interface
What belongs on Facebook? It’s a central question in our current reckoning over social media, and given the vastness of the company’s platform, it can be exceedingly difficult to answer. “Fake news is not your friend,” the company says — but you can still post as much of it as you want. Alex Jones’ conspiracy theories, which inspired years of harassment against the parents of Sandy Hook shooting victims, were fine until they suddenly weren’t. Everyone seems to agree that terrorism does not belong on Facebook, though there’s still more of it there than you might expect.
But imagine you could start from scratch. What would you rule in, and what would you rule out? That’s the frame of this new episode of Radiolab, which chronicles the evolution of Facebook’s content policy from a single sheet of paper into 27 pages of comically specific rules about nudity, sex, violence, and more.
The full hour-long podcast is well worth your time. It examines three moderation debates, of escalating seriousness. The first is about when it is appropriate to show breastfeeding — an area in which Facebook has gradually become more liberal.
The second is about when you can criticize what the law calls a protected class of people — a gender, or a religion, for example. This is an area where Facebook has generally gotten more conservative. At one time, criticism of “white men” was prohibited — both words there are protected categories — while criticism of “black children” was not. The reasoning was that “children” is a non-protected class, and you can say anything about a non-protected class, because Facebook has no way of knowing whether the group’s race has anything to do with your antipathy.
“If the rule is that any time a protected class is mentioned it could be hate speech, what you are doing at that point is opening up just about every comment that’s ever made about anyone on Facebook to potentially be hate speech,” producer Simon Adler says on the show.
This policy has since been changed, and black children are now protected from the worst forms of hate speech. “We know that no matter where we draw this line, there are going to be some outcomes that people don’t like,” Monika Bickert, Facebook’s head of product policy and counterterrorism, told Adler. “There are always going to be casualties. That’s why we continue to change the policies.”
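To make that reasoning concrete, here is a minimal, hypothetical sketch in Python of how a rule like the one Adler describes could be expressed. The category lists, function name, and examples are my own assumptions for illustration, not Facebook’s actual policy engine:

```python
# Hypothetical illustration of the old "protected class" rule described above.
# Category lists and logic are assumptions for explanation only.

PROTECTED_CATEGORIES = {"race", "gender", "religion"}   # examples of protected classes
NON_PROTECTED_CATEGORIES = {"age_group", "occupation"}  # e.g. "children", "drivers"

def protected_under_old_rule(term_categories: set) -> bool:
    """Return True if a targeted group kept hate-speech protection under the old rule.

    As described on the show, a group was protected only if every word describing it
    referred to a protected category; any non-protected modifier stripped protection.
    """
    return bool(term_categories) and term_categories.issubset(PROTECTED_CATEGORIES)

# "white men" -> race + gender, both protected -> still protected
print(protected_under_old_rule({"race", "gender"}))     # True
# "black children" -> race + age group; "children" is non-protected -> not protected
print(protected_under_old_rule({"race", "age_group"}))  # False
```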
The third debate is the one I found most compelling. It’s a tale of two content moderation decisions, made six months apart in 2013. The first came after the Boston Marathon bombing, when images of bombing victims were posted on Facebook. At the time, the company’s policy on carnage was “no insides on the outside” — which photos from the bombing clearly violated. Adler’s anonymous former moderators told him that after some debate, an unknown Facebook executive said the pictures should remain, because they were newsworthy.
Six months later, Facebook faced a similar dilemma in Mexico, where the government and the cartels were locked in a bloody conflict. Users began posting a video of a woman being beheaded — a particularly newsworthy video, given that the government had been publicly denying reports of cartel violence. But in this case, another unnamed executive called for the video to come down. The decision led to departures on the moderation team, a former moderator says:
I think it was a mistake. Because I felt like, why do we have these rules in place in the first place? And it’s not the only reason, but it’s decisions like that that are the thing that precipitated me leaving.
Five years later, the company has tasked itself with making decisions like these at a global scale. It vastly expanded — and this year made public — the community guidelines by which it makes these decisions. And it committed to hiring 20,000 new employees to work on safety and security. Adler puts it this way:
Essentially what Facebook is trying to do is take the First Amendment, this high-minded principle of American law, and turn it into an engineering manual that can be executed every four seconds, for any piece of content happening anywhere on the globe.
He then cuts to a former moderator in the Philippines. Her colleagues would frequently approve content without really studying it, she says, in protest of the relatively low rate of pay — about $2.50 an hour when she worked there. She also largely relied on her gut, erring on the side of removing even innocent nudity. “If it’s going to disturb the young audience, then it should not be there,” she says.
What to make of all this? Radiolab ends on an uncharacteristically bleak note: “I think they will inevitably fail, but they have to try, and I think we should all be rooting for them,” Adler says.
But this sentiment assumes Facebook’s system of content moderation will never evolve beyond its policy handbook. In fact, the company has already given us at least two ideas for how it might change.
One, Facebook could expand the avenues that users have to appeal moderation decisions. It started to do this in April, as I reported at the time:
Now users will be able to request that the company review takedowns of content they posted personally. If your post is taken down, you’ll be notified on Facebook with an option to “request review.” Facebook will review your request within 24 hours, it says, and if it decides it has made a mistake, it will restore the post and notify you. By the end of this year, if you have reported a post but been told it does not violate the community standards, you’ll be able to request a review for that as well.
That same month, Mark Zuckerberg told Ezra Klein that he could imagine Facebook one day having an independent Supreme Court to make moderation decisions:
Over the long term, what I’d really like to get to is an independent appeal. So maybe folks at Facebook make the first decision based on the community standards that are outlined, and then people can get a second opinion. You can imagine some sort of structure, almost like a Supreme Court, that is made up of independent folks who don’t work for Facebook, who ultimately make the final judgment call on what should be acceptable speech in a community that reflects the social norms and values of people all around the world.
What Facebook is describing with these ideas is something like a system of justice — and there are very few things it is working on that I find more fascinating. For all the reasons laid out by Radiolab, a perfect content moderation regime likely is too much to hope for. But Facebook could build and support institutions that help it balance competing notions of free speech and a safe community. Ultimately, the question of what belongs on Facebook can’t be decided solely by the people who work there.

Democracy
Google’s Brin Cops to Plan to Reclaim Lost Decade in China
Facebook’s encryption fight will be harder than San Bernardino
'We won't let that happen:' Trump alleges social media censorship of conservatives
Facebook opens up to researchers — but not about 2016 election
Facebook Suspended a Latin American News Network and Gave Three Different Reasons Why
How the NY Times Omitted a Woman from a Silicon Valley Story
EU considers fines for tech companies that don’t remove terrorist content within an hour
QAnon and Pinterest Is Just the Beginning
Twitter has a problem with 'toxic' content. CEO Jack Dorsey says he's trying 'everything' to fix it
Elsewhere
How TripAdvisor changed travel
Model Tinder-Scams Men for Date Competition in Union Square
Adidas is partnering with Twitter to stream high school football games
How Facebook — yes, Facebook — might make MRIs faster
LinkedIn Will Allow Economics Researchers to Mine Its Data
Jeffree Star, Laura Lee, Gabriel Zamora & YouTube’s racist tweet drama
Launches
Snapchat's long-awaited redesign is smoother, can be enabled right now with root
The SurfSafe Browser Extension Will Save You From Fake Photos
Islands app for college students adds Facebook-like user directory
Giphy is launching its own take on stories with curated GIFs throughout the day
Takes
Jack Dorsey Breathes Life into the Right’s Favorite Twitter Conspiracy
The Future of Privacy: Disinformation
And finally ...
Fake Facebook adverts are making people double take all over London
Protest Stencil
Those honest facebook ads are really getting around... https://t.co/MI4zg7TmAH
5:33 AM - 15 Aug 2018
Talk to me
Send me tips, questions, comments, moderation policies: casey@theverge.com.