The easiest thing Facebook could do for Myanmar

November 9 · Issue #244
The Interface
In March, human-rights investigators from the United Nations found that Facebook had played a role in spreading hate speech in Myanmar, fueling ethnic violence that has spurred more than 650,000 Rohingya Muslims to flee Myanmar’s Rakhine state into neighboring Bangladesh. The report, which came amid growing concerns about the way that social networks can incite violence, contained some of the most grave charges leveled against Facebook to date.
Chastened by the UN’s findings, Facebook quietly commissioned a study of its own — which it then released on the evening before the US midterm elections, when very few people would be paying attention. The report, prepared by the nonprofit Business for Social Responsibility (BSR), is a 62-page document that sets out to understand the dimensions of Facebook’s challenge in Myanmar and to offer solutions for mitigating it.
After the dust from the midterms more or less cleared, I read the report. And while I spend more time reading hot takes than nonprofit takes-by-committee, I was struck by the degree to which a report that calls itself a “human rights impact assessment” does so little to assess the impact of Facebook on human rights in Myanmar.
The authors report speaking with about 60 people in Myanmar, but they fail to explore any specific instances of hate speech on the platform or the resulting harms. Their analysis is limited to high-level, who-can-really-say equivocating. The report’s approach to understanding the situation on the ground in Myanmar appears to be primarily anecdotal, and its conclusions are the same as those of anyone who read a news wire story about the issue this spring.
“Though the actual relationship between content posted on Facebook and offline harm is not fully understood, Facebook has become a means for those seeking to spread hate and cause harm, and posts have been linked to offline violence,” the authors write, in one of many cases in which the passive voice serves to paper over their refusal to investigate.
I began reading the report in the hopes that it would clarify the connection between hate speech posted on social media and real-world violence. We are starving for knowledge about how unique platform mechanics such as share buttons and encryption contribute to lynch mobs. But instead, the authors choose to explore the current political dynamics in Myanmar at great length, and ultimately offer Facebook a to-do list of tasks that will let the company continue operating with minimal disruption to its business.
Most reports generated by consultants are destined to ride out eternity inside a neglected drawer, and BSR’s contribution to the Myanmar situation deserves a similar fate. (The nonprofit did not respond to a request for comment Friday afternoon.)
Fortunately, though, this week we got a second report on Facebook and Myanmar — and this one, I thought, was much more useful. It comes from the United Nations’ Office of the High Commissioner for Human Rights. Unlike BSR, the UN report asks why Facebook would enter Myanmar — or any other country rife with conflict — without first understanding how it would moderate content on the platform. The authors write:
Before entering any new market, particularly those with volatile ethnic, religious or other social tensions, Facebook and other social media platforms, including messenger systems, should conduct in-depth human rights impact assessments for their products, policies and operations, based on the national context and take mitigating measures to reduce risks as much as possible.
Instead, Facebook launched a country-specific version of its service in Myanmar in 2015, and added the country to its since-discontinued Free Basics program a year later. Soon, the company had 20 million users there — despite the fact that, due to peculiarities of the local language and Unicode, its non-Burmese-speaking moderators had very little insight into what was happening on the platform.
The UN takes a broad view of the situation in Myanmar. The specific effect of social media is limited to a few pages toward the end of an extremely comprehensive report. And yet Facebook serves as the context for much of what the authors write: in a 444-page report, Facebook is mentioned 289 times.
Like BSR, the UN acknowledges that the free speech made possible on Facebook can contribute positively to Myanmar. But it also suggests Facebook make available examples of the hate speech it has removed from the platform, at least to a subset of researchers, so that its role can be better understood. This has privacy implications, which shouldn’t be taken lightly. But surely a middle ground can be found.
In the meantime, BSR and the UN agree on one thing, and it’s an easy one: Facebook ought to provide country-specific data on hate speech and other violations of the company’s community standards in Myanmar. We may not be able to say with certainty to what degree social networks contribute to ethnic violence — but we ought to be able to monitor flare-ups in hate speech on our largest social networks. Dehumanizing speech is so often the precursor to violence — and Facebook, if it took its role seriously, could help serve as an early-warning system.

Google in China: When ‘Don’t Be Evil’ Met the Great Firewall
Sundar Pichai of Google: ‘Technology Doesn’t Solve Humanity’s Problems’
#GoogleWalkout update: Collective action works, but we need to keep working.
The long history behind the Google Walkout
Facebook to End Forced Arbitration for Sexual-Harassment Claims
Amazon Execs Addressed Concerns About Amazon Rekognition And ICE At An All-Hands Meeting
The White House used a doctored video to tell a lie
Gab cries foul as Pennsylvania attorney general subpoenas DNS provider
PayPal is canceling accounts used by the Proud Boys and antifa groups
Former Instagram Leader Systrom Talks About ‘Unhealthy’ Internet Incentives
The most engaging Facebook publishers of October 2018
Facebook, Amazon, and Google: a pocket guide to breaking them up
Vox's Matt Yglesias got doxed on Twitter for his Tucker Carlson comments. Twitter allowed it.
With plenty of bad news, Snapchat has an early lead in wooing AR developers
In defense of TikTok, the joyful, slightly cringe-inducing spiritual successor to Vine
YouTube reverses ban for streamer who killed Red Dead Redemption 2 feminist
Fan fiction site AO3 is dealing with a free speech debate of its own
Facebook quietly launches a TikTok competitor app called Lasso
Vine’s successor Byte launches next spring
Facebook Stopped Russia. Is That Enough?
So I sent my mom that newfangled Facebook Portal
Facebook Portal Non-Review: Why I Didn’t Put Facebook’s Camera in My Home
And finally ...
Mark Zuckerberg is trolling Harvard students in a Facebook meme group
Talk to me
Send me tips, comments, questions, and Myanmar travel tips: