A civil rights audit comes down on Facebook

July 1 · Issue #350
The Interface
As a longtime listener of her show, I was delighted to speak with Terry Gross on today’s edition of Fresh Air. The subject was my recent pieces on Facebook content moderators. Give it a listen!
By May of 2018, Facebook had received sustained criticism that the platform consistently enabled civil rights abuses. (Much of the criticism came after articles published by ProPublica demonstrating various ways that Facebook’s advertising platform could promote discrimination.) In response, the company announced that it had commissioned an independent civil rights audit — an effort to understand how Facebook promotes discrimination, and to develop recommendations for improvement.
In December, Facebook posted its first update about the audit, saying that the work had led to new efforts to fight voter suppression and encourage voter registration on the platform. And on Sunday, Facebook posted a second update. Joseph Cox summarizes it for us in Vice:
The report itself is split into four sections: content moderation and enforcement; advertising targeting practices; elections and census; and the civil rights accountability structure. With content moderation, the audit focused on harassment on Facebook; the under-enforcement of policies where hate speech is then left on the platform; and Facebook’s over-enforcement of hate speech policies where users have content removed that actually condemned or spoke out against hate speech. The audit was conducted with civil rights law firm Relman, Dane & Colfax and Megan Cacace, one of the firm’s partners. […]
There are two major developments from this update. The first is that Facebook will work to protect the integrity of the upcoming US Census just as it would a national election. “We’re building a team dedicated to these census efforts and introducing a new policy in the fall that protects against misinformation related to the census,” the company said in its blog post. “We’ll enforce it using artificial intelligence. We’ll also partner with non-partisan groups to help promote proactive participation in the census.”
The second, more consequential development is that Facebook is extending its ban on speech promoting white nationalism. Alex Hern reports in the Guardian:
White nationalism and white separatism were previously allowed on Facebook as the company considered only white “supremacy” to be in breach of its hate speech policies. However, in March 2019 it updated its rules to ban the explicit praise, support or representation of the former two ideologies as well.
Facebook’s chief operating officer, Sheryl Sandberg, said in response to the audit: “We’re addressing this by identifying hate slogans and symbols connected to white nationalism and white separatism to better enforce our policy.
“We also recently updated our policies so Facebook isn’t used to organise events that intimidate or harass people based on their race, religion or other parts of their identity. We now ban posts from people who intend to bring weapons anywhere to intimidate or harass others, or who encourage people to do the same. Civil rights leaders first flagged this trend to us, and it’s exactly the type of content our policies are meant to protect against.”
This is all fairly straightforward. The auditors found lots of hate speech on Facebook, and they want the company to eliminate more of it. But reading the report, I couldn’t help but notice one set of voices missing from the discussion: the moderators whose job it is to remove all that hate speech.
The moderators were on my mind, thanks to a message I had received over the weekend from one of them. The moderator, who identified as a queer person of color, told me that moderating hate speech is the hardest part of the job. They wrote:
I also see the recurring pattern of graphic violence being brought up, and it can be awful at times — but most of our Tampa site consists of people of color. The most draining thing for the majority of us is the extreme amount of hate speech we see every day. There is so much of it that there is even a specific queue for it. It’s extremely depressing, and most of our on-site “therapists” are older white people who are very out of touch with current times.
Graphic violence might be 5% of what we see, but at least 60% of my job consists of looking over hate speech and seeing how much people hate people of color and LGBT people. 
[The project] is a complete mess, and I have no idea how it’s still running. Facebook changes its policy every five seconds, and then my bosses get mad at me for not having a 98% [accuracy score] because I deleted something that is now an ignore. The other day I had to leave up a page with borderline child pornography because I was told we can’t assume a person saying they are 17 years old means they are a minor.
I absolutely hate that place. They promote and reward based off of a metric (scores) that can be easily manipulated. … It’s an absolute mess, and almost everyone there knows it. 
One of the ideas coming out of the civil rights audit is to create a dedicated hate speech queue. The idea, according to Sandberg, is that moderators will get better at moderating hate speech if a subset of them are tasked with moderating it continuously. This seems plausible — and so does the possibility that those moderators will be traumatized by daily exposure to discriminatory posts.
Facebook is only in the earliest stages of surveying moderators about their mental health, as part of an effort to establish a baseline it can improve over time. There are currently no caps on the amount of graphic or racist content that a moderator can be subjected to in a day, and the company says there is no research yet on what levels of exposure are safe for the human mind.
And even before its survey of moderator well-being is finished, Facebook is undertaking a new experiment on the mental health of thousands of contractors like the one who wrote me. It’s classic move-fast Facebook — placate one group of vocal critics, even if doing so puts a less vocal group at risk — and I worry about the consequences. It would be a shame if a civil rights audit of the platform led to a new mental health crisis among its contractors.

Democracy
Trump officials weigh encryption crackdown
Virginia’s ‘revenge porn’ laws now officially cover deepfakes
Inside the Secret Border Patrol Facebook Group Where Agents Joke About Migrant Deaths and Post Sexist Memes
Big Data Supercharged Gerrymandering. It Could Help Stop It Too
Twitter Conspiracy Theorist Charged With a Felony in Lynch Threat Against Muslim Candidate
Trump Consultant Is Trolling Democrats With Biden Site That Isn’t Biden’s
Operation Tripoli
Elsewhere
Facebook’s new terms of service explains how ads get targeted
TikTok’s Videos Are Goofy. Its Strategy to Dominate Social Media Is Serious.
Memes Are the New Pop Stars: How TikTok Became the Future of the Music Industry
With costs rising, Facebook traffic arbitrage strategies lose effectiveness
Creator of DeepNude, App That Undresses Photos of Women, Takes It Offline
Gay dating app Jack’d settles complaint over exposing private photos
Launches
Walmart in Mexico launches grocery orders via WhatsApp
Bumble becomes one of the first major dating platforms to introduce in-app video and voice calls
Takes
Libra and taxes
And finally ...
Sometimes, seeing one good post on a social network can make you a lifetime customer:
Daenna Marie E.
my uncle just posted this
I’M NEVER DELETING FACEBOOK LMAOOOO https://t.co/FXHX3zCokz
9:30 AM - 28 Jun 2019
Talk to me
Send me tips, comments, questions, and civil rights violations: casey@theverge.com.