Facebook says it limits the spread of hoaxes. But does it?

July 20 · Issue #169
The Interface
During the briefing with reporters that kicked off Facebook’s latest round of soul searching over fake news, a spokeswoman explained why the company prefers to limit the distribution of misinformation rather than remove it from the platform. 
“While sharing fake news doesn’t violate our Community Standards set of policies, we do have strategies in place to deal with actors who repeatedly share false news,” she said. “If content from a Page or domain is repeatedly given a ‘false’ rating from our third-party fact-checkers … we remove their monetization and advertising privileges to cut off financial incentives, and dramatically reduce the distribution of all of their Page-level or domain-level content on Facebook.”
The subsequent debate has taken Facebook’s stated intent to reduce the spread of misinformation at face value. But how effective is the company at achieving its desired result? Facebook says posts flagged as false by fact-checkers see their distribution reduced by up to 80 percent, an average of three days after they are posted. And yet journalists never seem to have much trouble digging up cases where pages devoted to publishing misinformation continue to thrive.
At Poynter today, Daniel Funke investigates the case of YourNewsWire, a site whose posts have been debunked by fact-checkers more than twice as often as Infowars. Profiled last year in the Hollywood Reporter, YourNewsWire is devoted to promoting Pizzagate, anti-vaccination propaganda, and other conspiracies. (Confoundingly, at least to me, it’s run by two married gay men who describe themselves as “alienated liberals.”)
Mainstream news sites, whose articles are debunked far less often than YourNewsWire’s, have reported that their Facebook traffic has declined dramatically. Last month, Slate reported that its referrals from Facebook had fallen 87 percent between January 2017 and May 2018. That figure is broadly consistent with what other publishers have told me.
With that context in mind, let’s see how YourNewsWire has fared:
While its engagement has ebbed and flowed, YourNewsWire hasn’t taken that big of a hit. In 2017, the site only saw its Facebook engagements decrease by less than 2 percent from 2016 — despite publishing about 1,600 fewer articles, according to BuzzSumo. That trend held for the first seven months of 2018 as well, during which YourNewsWire has published nearly 1,500 articles less than the same period in 2016 but only lost about 8 percent of its Facebook engagements. (All data was collected between July 12 and 20.) […]
YourNewsWire’s top story so far in 2018 had received a little more than 865,000 engagements as of publication. For comparison, The New York Times’ second most-engaging story from the same period got about 7,000 less engagements, according to BuzzSumo. YourNewsWire’s article, published in January, cites a fake quote from a Centers for Disease Control and Prevention official claiming that the flu shot is causing an outbreak of the disease.
OK. How about Infowars?
Between 2016 and 2017, the site — whose page is verified and has more than 900,000 likes on Facebook, but usually receives far less engagements than YourNewsWire — has seen its Facebook engagements decrease by more than 40 percent while publishing about 15 percent fewer articles. The change between the first seven months of 2016 and this year is less dramatic, with the site losing only about 10 percent of its engagements despite publishing more than 10,000 fewer stories.
However, InfoWars’ decline and stagnation in total Facebook engagements has been followed by more recent growth. While its best month was during the U.S. election in November 2016, it’s followed closely by June and July 2018. None of the fact-checkers Poynter analyzed debunked an InfoWars story in those months.
As Funke notes, when fact-checkers do flag these sites’ posts as false, their reach declines. But he speaks with fact-checkers who say they are overwhelmed by the sheer volume of content, and often lack the resources to check even the sites’ most popular posts. And even when they do flag a post, the process can take nearly two weeks, by which point the post will already have been seen by most of the people who will ever come across it.
In short, our present information war is asymmetric. Facebook’s reliance on fact-checkers to flag posts before reducing their distribution has had a limited effect on conspiracy-oriented sites, which remain adept at generating engagement at any cost. And this year’s big algorithm change, meant to privilege posts from friends and family, seems to have had a similarly limited effect on conspiracy sites.
In the Times, Farhad Manjoo laid out the incoherence of Facebook’s recent moves around misinformation:
So to recap: Facebook is deeply committed to free expression and will allow people to post just about anything, including even denying the Holocaust. Unless, that is, if a Holocaust denial constitutes hate speech, in which case the company may take it down. But if a post contains a factual inaccuracy, it would not be removed, but it may be shown to very few people, reducing its impact.
On the other hand, if the misinformation has been determined to be inciting imminent violence, Facebook will remove it — even if it’s not hate speech. On the other other hand, if a site lies repeatedly, spouts conspiracy theories or even incites violence, it can maintain a presence on the site, because ultimately, there’s no falsehood that will get you kicked off Facebook.
Modeling Facebook’s terms of service on the First Amendment was supposed to ensure the company would have to make the fewest possible editorial decisions. But increasingly, the company finds itself forced to make difficult calls around the globe. I can understand the company’s institutional allergy to doing more. But as pressure mounts around the world, it seems unlikely Facebook will get away with doing less. 

Democracy
WhatsApp will drastically limit forwarding to stop the spread of fake news, following violence in India and Myanmar
Facebook's plan to kill dangerous fake news is ambitious – and perhaps impossible
Twitter’s Vijaya Gadde on its approach to free speech and why it hasn’t banned Alex Jones.
The Russians Who Hacked The DNC Have Targeted At Least Three 2018 Campaigns, Microsoft Says
Elsewhere
Facebook Suspends Analytics Firm on Concerns About Sharing of Public User-Data
Zuckerberg’s sister: Banning Holocaust deniers won’t ‘make them go away'
Facebook Faces Delay to WhatsApp Payments in India
James Gunn exits Guardians of the Galaxy 3 after offensive tweets resurface
Spotify Still Won't Let Users Block Their Harassers On Its Platform
Snapchat co-founder Evan Spiegel responds to privacy, security concerns
Launches
Google, Facebook, Microsoft, and Twitter partner for ambitious new data project
Takes
The Mark Zuckerberg Facebook Holocaust denial controversy, explained
Facebook’s dilemma on Infowars and Holocaust denial: Purge hateful content or defend it?
And finally ...
Covfefe
Talk to me
Questions? Comments? Weekend plans? casey@theverge.com