Critics question the link between Facebook and refugee violence

August 22 · Issue #192
The Interface
On Tuesday, The New York Times’ investigation of a study into how Facebook promoted anti-refugee violence in Germany galvanized discussion about how even normal political speech on the platform can drive users to extremes. Today, the report was criticized on the grounds that it may have conflated correlation with causation, drawing more dramatic conclusions than the evidence can support.
The case against the piece goes like this:
  • The study, which you can read here, has not been peer-reviewed.
  • The study authors could not measure actual Facebook usage, which is private, so they relied on problematic proxies. Their proxy for average, non-ideological usage of Facebook was the Nutella Germany page, with 32 million followers — but they managed to collect data on only 21,915 users who interacted with the page, and whose German location could be verified.
  • Data from the study is charted week by week, rather than in the moment. As Ben Thompson and others have pointed out, it seems just as possible that anti-refugee violence inspired Facebook posts as that Facebook posts inspired violence. (A rough sketch of why weekly data can’t settle that question follows this list.)
  • The article reported that Facebook was linked to a 50 percent increase in attacks on refugees; an update to the study revised that figure downward, to 35 percent.
  • The article represents a case of confirmation bias. People (like me!) who shared it tend to be sympathetic to the idea that heavy usage of Facebook can be corrosive inside democracies, and so we accepted it without appropriate skepticism.
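To make the chronology point concrete, here is a minimal sketch in Python — with entirely synthetic numbers, since the study’s underlying data isn’t public — of how a one-day lag that would reveal attacks driving posts (rather than the reverse) simply disappears once both series are summed into weekly bins:

```python
# Hypothetical illustration only: column names and counts are invented.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Simulate daily counts where attacks drive NEXT-day posts (reverse causation).
days = 364
attacks = rng.poisson(2, size=days)
posts = rng.poisson(5, size=days) + 3 * np.roll(attacks, 1)  # posts trail attacks by one day

df = pd.DataFrame(
    {"attacks": attacks, "posts": posts},
    index=pd.date_range("2017-01-01", periods=days, freq="D"),
)

# At daily resolution, lagged correlations can distinguish the two directions:
print("posts lead attacks (daily): ", df["posts"].shift(1).corr(df["attacks"]))   # near zero
print("attacks lead posts (daily): ", df["attacks"].shift(1).corr(df["posts"]))   # strong

# At weekly resolution, the one-day lag collapses into the same bin, and the
# two series just look simultaneously correlated — direction is unrecoverable.
weekly = df.resample("W").sum()
print("same-week correlation (weekly):", weekly["posts"].corr(weekly["attacks"]))
```

Run it and the daily numbers point clearly in one direction while the weekly number is just a big correlation with no arrow on it — which is roughly the shape of the critics’ objection.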
Some of these criticisms seem fairer to me than others. (The week-by-week chronology issue bothers me the most; I’ve reached out to the study’s authors, and will share anything I hear back from them in this space.) But none turns the original article on its head — or acknowledges that the Times journalists, Amanda Taub and Max Fisher, bolstered the study’s findings with their own on-the-ground reporting. (Thompson did note the additional reporting.)
And while the study hasn’t been peer-reviewed, the Times authors did seek input from other experts, who called the findings “credible, rigorous — and disturbing.” It also seems worth noting that The Economist covered the study when it was first published earlier this year, and drew similar (if somewhat less agitated) conclusions.
In any case, I can’t imagine anyone reading the study and, even accounting for its flaws, not believing that further inquiry is warranted. “More study is needed” is perhaps the most common conclusion to be drawn from any study, and this one is no exception.
But as lots of folks noted online today, further study is difficult, because Facebook data is private by default. As New York’s Max Read put it: “The frustrating thing about the justified quibbles around this Facebook hate-crimes study is that Facebook itself could, in a couple hours, pull together a comprehensive data report that would answer all of the questions.”
The data in question is generally private for good reason — make it public, and you’ve got a Cambridge Analytica situation on your hands. But given the urgency of the question — does Facebook push normal political speech to extremes, inciting violence even in developed nations? — I wish Facebook would find a way.
Thompson doubts the company will:
Of course at best this sort of study will be done for internal consumption; I suspect it is more likely it won’t be done at all. Facebook has publicly buried its head in the sand about filter bubbles at least twice that I can remember, first in 2015 with a questionable study whose results were misinterpreted and last year on an earnings call.
The reason why seems clear: unlike fake news or Russian agents, which involve a bad actor the company can investigate and ban, the propagators of filter bubbles are users ourselves. To fix the problem is to eliminate the temporary emotional comfort that keeps users coming to Facebook multiple times a day, and that is if the problem can be fixed at all. Indeed, perhaps the most terrifying implication of this study is that, if true, the problem is endemic to social networks, which means to eliminate the former necessitates the elimination of the latter.
On the second point, I fear Thompson is right. And on the first — that Facebook will ignore studies like this — I can only hope he’s wrong.

Democracy
Democratic Party Says It Has Thwarted Attempted Hack of Voter Database
Taking Down More Coordinated Inauthentic Behavior
An Update on Our App Investigation
Zuckerberg and his co-founder pour millions into midterm initiatives
Facebook reinstated Crimson Hexagon, but questions linger
China shuts down blockchain news accounts, bans hotels in Beijing from hosting cryptocurrency events
Elsewhere
Nobody Trusts Facebook. Twitter Is a Hot Mess. What Is Snapchat Doing?
Facebook to Remove Data-Security App From Apple Store
The Tinder lawsuit is going to get nasty
Oculus Targeting Q1 2019 For Santa Cruz Release, Rift Ports Planned
Facebook will forego 30% share of Instant Games in-app revenue on Android
Posting Instagram Sponsored Content Is the New Summer Job
How to Share an Instagram Account With Your Significant Other
Launches
Facebook is working on mesh Wi-Fi to possibly bring to developing countries
Takes
It’s Too Late to Protect the 2018 Elections. But Here’s How the U.S. Can Prepare for 2020.
And finally ...
Patrick Gerard has your algorithmic failure of the day:
Patrick Gerard
Facebook is pushing that "share a memory" junk where they make custom videos out of your old photos to boost engagement and I just literally got shown a bunch of happy cartoon characters dancing on my mom's grave. 🤦‍♂️ https://t.co/6NKIXqqq9I
8:05 AM - 21 Aug 2018
Sorry Patrick!!
Talk to me
Send me tips, questions, comments, and alternate theories about violence against refugees: casey@theverge.com.