Twitter users protest Alex Jones with a viral block list

August 15 · Issue #187
The Interface
Last week, we talked about why Facebook banned Alex Jones — and Twitter didn’t. Facebook saw that Jones, who had already violated any number of the platform’s rules, had no intention of reforming himself. Twitter said first that Jones had not broken any rules, and then — after CNN’s Oliver Darcy showed the company a series of offending tweets — that he had, but not badly enough to get banned.
Late on Tuesday, Twitter took another half-step toward banning Jones — suspending him for a week after he posted a video on Twitter in which he encouraged his followers to get their “battle rifles” in anticipation of all-out war with his enemies.
In the mind of Jack Dorsey, Twitter’s co-founder and CEO, this suspension represented an opportunity for Jones to reflect on his bad behavior. “I feel any suspension, whether it be a permanent or a temporary one, makes someone think about their actions and their behaviors,” Dorsey told NBC News’ Lester Holt, in one of two interviews he did on Wednesday.
In the spirit of thinking about their actions and behaviors, Jones’ crew more or less immediately posted the battle-rifles video to the separate Infowars account. That earned the Infowars account a weeklong suspension of its own. Twitter being Twitter, the offending video remained viewable on Twitter-owned Periscope for nearly a day afterward. (Elsewhere in Twitter being Twitter, the Jones account continued to tweet for some time after his suspension, because it turns out that if you schedule tweets to post before you get suspended those tweets will continue to post just fine.)
After introducing this round of half measures, Dorsey sat down with the Washington Post’s Tony Romm and Elizabeth Dwoskin to announce that he was “rethinking the core of how Twitter works.”
“The most important thing that we can do is we look at the incentives that we’re building into our product,” Dorsey said. “Because they do express a point of view of what we want people to do — and I don’t think they are correct anymore.”
A now common criticism of Twitter holds that the viral mechanics through which tweets spread encourage the polarization of the audience into warring tribes. (See this Ezra Klein piece from last week.) That’s one way to explain why malicious users like Jones are able to thrive on social networks: their bombastic speech attracts a wave of initial attention, and platform algorithms help them find a much larger audience than they ever would otherwise. It’s in this sense that “incentives built into the product,” as Dorsey calls them, bear reconsideration.
Dorsey has more ideas. Labeling automated bots to distinguish them from accounts run by real people, for example. Or this one, cribbed from YouTube:
One solution Twitter is exploring is to surround false tweets with factual context, Dorsey said. Earlier this week, a tweet from an account that parodied Peter Strzok, an FBI agent fired for his anti-Trump text messages, called the president a “madman” and garnered more than 56,000 retweets. More context about a tweet, including “tweets that call it out as obviously fake,” could help people “make judgments for themselves,” Dorsey said.
This is all fine, so far as it goes. Along with other tech leaders, Dorsey is expected to testify next month at a Senate hearing about information campaigns in politics. It makes sense that the CEO of Twitter would seek to convey a sense of urgency around solving the problems that have bedeviled the platform for many years now.
And yet at the same time, Twitter has never lacked for ideas. Ask anyone who ever worked there: any feature suggestion you could offer had already been debated ad nauseam. The problem always came down to the details, to the implementation, to how you were going to ship the damned thing.
That’s why I can view Dorsey’s vague promises on Wednesday only through the prism of the Alex Jones saga. Twitter was the very last of its peers to take any action against the Infowars host, and even when it did decide to punish him, it did so in the most lenient possible terms.
It offered Jones a loophole that let him keep tweeting. It left the offending video up for many hours. And it promised Jones that he could return — and in just a week, too. Twitter knew it had to punish Jones for his behavior. The trouble, as always for this company, was in the details.
But as the company dithers, its users are organizing. This week, Grab Your Wallet founder Shannon Coulter posted a viral Twitter thread suggesting a concrete action Twitter users could take to protest Jones’ ongoing presence on the platform. Coulter compiled a list of the Twitter handles of the Fortune 500, then made it available as a collective block list. Protesters could install the block list with a couple of clicks, and once they did, ads from those companies would no longer appear in their Twitter timelines.
As of yesterday, more than 50,000 people had installed her tool. Users have previously gifted Twitter the hashtag, the @ mention, and the retweet; Coulter may have just given us the viral block list. And while Twitter talks endlessly about what it might do someday, a growing faction in its user base is taking action right now.

In March, the United Nations said Facebook is used to incite violence against the Rohingya, a Muslim minority group. Ever since, regular reports have explored how Facebook failed to hire native-language speakers who could have identified hate speech on the platform as it began to spread, and ignored warnings from local groups and regional experts that the situation was getting out of hand.
Reuters’ Steve Stecklow today delivered the most comprehensive account yet of Facebook’s misadventure in Myanmar. His piece reveals the existence of Operation Honey Badger, a content moderation shop focused on Asia that is run by Accenture on Facebook’s behalf. Despite the efforts of its 60 or so moderators, Reuters easily found 1,000 pieces of anti-Rohingya hate speech on Facebook.
In part, that’s because Facebook’s vaunted artificial intelligence systems are failing.
In Burmese, the post says: “Kill all the kalars that you see in Myanmar; none of them should be left alive.”
Facebook’s translation into English: “I shouldn’t have a rainbow in Myanmar.”
So what happens next? Vice‘s David Gilbert reports that Facebook is conducting a human rights audit “to assess its role in enabling ethnic violence and hate speech against its Rohingya Muslim minority.”
The audit, which Facebook confirmed, will be conducted by the San Francisco firm Business for Social Responsibility. Gilbert says the report could be finished by the end of this month. The company is also hiring for a variety of policy roles specific to Myanmar, a first for Facebook.
These are important steps, and while it’s unclear what action they might result in, they convey the appropriate degree of seriousness. Facebook — and the wider world — has a lot riding on whether the company gets it right. Activists have described similarly violent outbreaks of hate speech in countries including Vietnam, India, Cambodia, and Sri Lanka. The conflict in Myanmar is bloody, but it is by no means unique.
How social media took us from Tahrir Square to Donald Trump
How a Fake Group on Facebook Created Real Protests
Americans don't think the platforms are doing enough to fight fake news
Transgender Girl, 12, Is Violently Threatened After Facebook Post by Classmate’s Parent
WhatsApp Co-Founder’s ‘Rest and Vest’ Reward From Facebook: $450 Million
Meet The People Who Spend Their Free Time Removing Fake Accounts From Facebook
Google-Facebook Dominance Hurts Ad Tech Firms, Speeding Consolidation
Instagram users are reporting the same bizarre hack
Amazon Has YouTube Envy
People Raise $300M Through Birthday Fundraisers in First Year
Twitter’s Misguided Quest to Become a Forum for Everything
And former (I think?) Twitter employee Jared Gaut has a thread worth reading on why he’s taking a break from the service in the wake of the company’s Alex Jones-related inaction:
I’ve been a daily user of @twitter for the last 11 years - long before I started working at Twitter. I won’t be a “monthly active user” during the rest of Q3. I’m deleting the @twitter app from all my devices and signing out of all browsers. @jack, thread 👇
And finally ...
Jerry Seinfeld Says Jokes Are Not Real Life
Talk to me
Send me tips, questions, comments, human rights audits:
Powered by Revue