How Infowars keeps escaping Facebook's ban hammer

The Interface · July 27 · Issue #174
Two weeks ago, CNN’s Oliver Darcy put a question to Facebook executives during an event in New York: how can Facebook say it’s serious about fighting misinformation while also allowing the notorious conspiracy site Infowars to host a page there, where it has collected nearly 1 million followers? The tension Darcy identified wasn’t new. But it crystallized a contradiction at the heart of Facebook’s efforts to clean up its platform. And we’ve been talking about it ever since.
Late Thursday night, Facebook took its first enforcement action against Jones since the current debate started. The company removed four videos that were posted to the pages of Alex Jones, Infowars’ founder and chief personality, and Infowars itself. Jones, who had violated Facebook’s community guidelines before, received a 30-day ban. Infowars’ page got off with a warning, although Facebook took the unusual step of saying the page is “close” to being unpublished.
The move came a day after YouTube issued a strike against Jones’ channel, after removing four videos itself. (Facebook won’t say which videos it removed, but the rationale it used to remove them — attacks on someone based on their religious affiliation or gender identity, and showing physical violence — suggests they are the same ones YouTube removed.)
These posts were removed for hate speech and violence, not misinformation. It’s likely Facebook would have removed them even without the extra attention on Infowars. But Jones’ behavior in the wake of recent enforcement actions shows how easily bad actors can skirt rules that were designed in the belief that most users will generally stick to them.
YouTube, for example, has a “three-strikes policy.” Post three bad videos and your channel gets banned. But there’s a huge loophole, and Jones exploited it. As I reported earlier this week, users must log in to YouTube and view the strike against them before it is counted. And if they posted multiple offending videos before logging in, those videos are “bundled” into a single strike.
The idea is that the disciplinary process should educate a first-time offender. If someone posted three videos that violated copyright, for example, they might not understand what they did until YouTube notifies them. Better to give them a second chance, the thinking goes, than to ban their account instantly for three simultaneous violations.
Similarly, YouTube allows strikes to expire three months after they are issued. The idea is to give users a chance to rehabilitate themselves after they make a mistake. But viewed through the lens of Infowars, the policy begins to look like a free pass to post hate speech every 90 days or so.
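To make the loophole concrete, here is a minimal sketch of the policy as described above. It is a hypothetical model, not YouTube's actual implementation: strikes only count once the channel owner acknowledges them, every violation accumulated before that acknowledgment is bundled into a single strike, and strikes expire after 90 days.

```python
from datetime import datetime, timedelta

STRIKE_LIFETIME = timedelta(days=90)   # assumption: strikes expire after 90 days
STRIKE_LIMIT = 3                       # three active strikes triggers a ban

class Channel:
    def __init__(self):
        self.pending_violations = []   # removed videos not yet seen by the owner
        self.strikes = []              # timestamps of counted strikes
        self.banned = False

    def record_violation(self, video_id, now):
        # A removed video is logged, but no strike is counted until the
        # owner logs in and acknowledges it.
        self.pending_violations.append((video_id, now))

    def acknowledge(self, now):
        # Every violation accumulated since the last login is "bundled"
        # into a single strike -- the loophole described above.
        if self.pending_violations:
            self.strikes.append(now)
            self.pending_violations.clear()
        self._evaluate(now)

    def _evaluate(self, now):
        # Strikes older than 90 days expire and no longer count.
        self.strikes = [t for t in self.strikes if now - t < STRIKE_LIFETIME]
        if len(self.strikes) >= STRIKE_LIMIT:
            self.banned = True

# Four violations posted before a single login still yield only one strike.
channel = Channel()
t0 = datetime(2018, 7, 1)
for i in range(4):
    channel.record_violation(f"video-{i}", t0)
channel.acknowledge(t0 + timedelta(days=1))
print(len(channel.strikes), channel.banned)   # 1 False
```

Under this model, a channel that times its acknowledgments and spaces its violations roughly 90 days apart never accumulates three active strikes, which is the free pass described above.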
Jones has proven himself adept at evading platforms’ well-intentioned policies. The YouTube strike came with a ban on using the platform’s live-streaming features for 90 days — so Jones simply began appearing on the live streams of his associates, such as Ron Gibson. Here’s Sean Hollister at CNET:
YouTube is removing these streams and revoking livestreaming access to channels that host them, but it hasn’t stopped Infowars yet. Though YouTube shut down a livestream at Ron Gibson’s primary YouTube channel, he merely set up a second YouTube channel and is pointing people there. 
Meanwhile, Facebook’s profile-specific discipline similarly ignores Jones’ ability to roam across pages. Jones is banned from accessing his personal profile, but he still gets to appear on his daily live show, which is broadcast on Infowars and “The Alex Jones Channel.” The solution to being banned from one profile is simply to broadcast yourself from another one.
There were good reasons for tech platforms to set up disciplinary policies that strive to forgive their users. But given how easily those policies can be gamed, they look ripe for reconsideration.

Democracy
Britain's Fake News Inquiry Says Facebook And Google's Algorithms Should Be Audited By UK Regulators
Facebook deletes hundreds of posts under German hate-speech law
Trump appointee condemns Mark Zuckerberg's comments on Holocaust deniers
Setting the record straight on shadow banning
Mueller Examining Trump’s Tweets in Wide-Ranging Obstruction Inquiry
Elsewhere
Twitter warns fake account purge to keep erasing users, shares drop 19 percent
Instagram not an instant fix for ailing Facebook
Hard Questions: Who Reviews Objectionable Content on Facebook — And Is the Company Doing Enough to Support Them?
Takes
Twitter’s Algorithm Problem Is Not a Bug
Why unskippable Stories ads could revive Facebook
Talk to me
Questions? Comments? Weekend plans? casey@theverge.com