How Facebook's policies protect its worst users

July 12 · Issue #163
The Interface
Misinformation on Facebook kills. Hoaxes warning of child traffickers and organ harvesting on WhatsApp have led to the deaths of a dozen people since May. In Brazil, more than 500 people have died from yellow fever, amid widespread misinformation on WhatsApp warning that the fever’s vaccine is deadly. In response, Facebook has promised to invest heavily in new product features, academic research, digital literacy campaigns, and fact-checking efforts, among other initiatives.
But what if Facebook banned purveyors of misinformation from the platform instead? That was the question asked yesterday of John Hegeman, who recently replaced Adam Mosseri as the head of News Feed, at a small gathering of reporters in New York. How can Facebook claim to be serious about fighting misinformation, CNN’s Oliver Darcy asked Hegeman, while also offering notorious conspiracy site Infowars a platform for its 1 million followers? Hegeman replied that Facebook’s policy is not to remove false news.
“I guess just for being false that doesn’t violate the community standards,” Hegeman said, explaining that Infowars has “not violated something that would result in them being taken down.”
Hegeman added, “I think part of the fundamental thing here is that we created Facebook to be a place where different people can have a voice. And different publishers have very different points of view.”
A spokeswoman followed up afterward with some additional context: “While sharing fake news doesn’t violate our Community Standards set of policies, we do have strategies in place to deal with actors who repeatedly share false news. If content from a Page or domain is repeatedly given a ‘false’ rating from our third-party fact-checkers … we remove their monetization and advertising privileges to cut off financial incentives, and dramatically reduce the distribution of all of their Page-level or domain-level content on Facebook.”
Charlie Warzel says that isn’t enough. Facebook’s efforts to fight misinformation will be for naught, he writes, if it continues to offer a platform to Infowars and other bad-faith actors:
By focusing only on egregious examples of false news, Facebook allows its biggest purveyors of disingenuous conspiracies and polarizing content to operate with impunity while growing their audiences and expanding the footprint of low-quality information on the platform. All they need to know is how to game the system.
Despite investing considerable money into national ad campaigns and expensive mini documentaries, Facebook is not yet up to the challenge of vanquishing misinformation from its platform. As its videos and reporter Q&As take pains to note, Facebook knows the truth is messy and hard, but it’s still not clear if the company is ready to make the difficult choices to protect it.
Vox.com’s David Roberts, sounding a similar note, says Facebook’s policies disproportionately advantage conservatives, who expertly manipulate platforms into promoting misinformation and outright lies. “Conservatives have forced the same choice on institution after institution, person after person,” he writes in a Twitter thread. “Pursue accuracy, and live with relentless charges of bias & partisanship, or buy some protection by pursuing ‘balance.’ One after another, they’ve made the wrong choice. And here we are.”
In February, after Infowars received its first YouTube strike for harassing a survivor of the Parkland shooting, I wrote about the big policy shift at tech platforms. After facing years of public pressure to leave controversial posts up, they suddenly found themselves beset by requests to take controversial posts down.
Then as now, Infowars was at the center of the debate. But in issuing a strike, YouTube showed a willingness to eventually ban the company’s channel. (A third strike results in account termination.) Facebook won’t go that far. The anonymous holder of Facebook’s Twitter account, gently sparring with reporters today, said so:
“We understand you strongly disagree with our position,” Facebook tweeted at the Times’ Kevin Roose. “We just don’t think banning Pages for sharing conspiracy theories or false news is the right way to go.”
The question I’m left with about these policies is: whom or what do they serve to protect? Is it the principle of free speech? Is it Alex Jones and other bottom-feeding page administrators? Is it Facebook itself?
It seems inarguable to me that Facebook’s policies around misinformation offer more protection to publishers than they do to the users those publishers seek to exploit. As a statement about policy, “we just don’t think banning Pages for sharing conspiracy theories or false news is the right way to go” offers little comfort to the victims of hate crimes fueled by WhatsApp. Or to the rural poor in Brazil being told, falsely, that the vaccine that will save their lives will actually end it.
These are difficult questions, but they’re questions Facebook invited by growing itself to 2 billion users. What’s striking in the end is how far Facebook will go to protect the publishers who use its platform most maliciously — and how poorly its policies serve those publishers’ victims.

Democracy
SEC Probes Why Facebook Didn’t Warn Sooner on Privacy Lapse
Russian Influence Campaign Sought To Exploit Americans' Trust In Local News
Facebook 'closed' groups weren't as confidential as some thought
Elsewhere
Facebook Watch Is Struggling to Win Fans
Facebook's diversity efforts are failing black and Hispanic women
Is Instagram changing the way we design the world?
Why Some of Instagram's Biggest Memers Are Locking Their Accounts
Facebook paid $88 million this year to build out its Seattle area Oculus hub
How 20-Year-Old Kylie Jenner Built A $900 Million Fortune In Less Than 3 Years
Takes
The conventional wisdom about not feeding trolls makes online abuse worse
Have the Tech Giants Grown Too Powerful? That’s an Easy One
And finally ...
This video of a lemon rolling down a hill has been watched 3 million times
Talk to me
Questions? Comments? Favorite Infowars videos? casey@theverge.com