
How white supremacists are thriving on YouTube

September 18 · Issue #208
The Interface
What role do major institutions play in the promotion of extremism? Two days into this week, we’ve already gotten two important looks at the issue.
On Monday I told you about a report from danah boyd about the media’s role in amplifying “digital martyrs” like Alex Jones. (I pasted the wrong link into the newsletter yesterday — come on, Newton! — and so if you haven’t read it yet, there you go.) 
Today comes a report from Rebecca Lewis looking at another kind of amplification: the closely linked network of conservative YouTube personalities who collaborate in videos and advance an extremist ideology. (Both reports, incidentally, come from the New York-based nonprofit Data & Society.)
Lewis set out to understand how YouTube in particular has become a thriving hub of far-right content. Starting with a handful of well-known conservative personalities, she began tracking their appearances on one another’s channels. When another personality popped up on one of these channels, she began charting that person’s path through YouTube as well. Eventually, she had watched hundreds of hours of video from 65 influencers across more than 80 channels, a web she calls the Alternative Influence Network, or AIN.
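For the curious, here is a minimal sketch of that kind of snowball-sampling crawl, written in Python. It is purely illustrative and not drawn from Lewis’s report: get_guests is a hypothetical stub standing in for however you would detect cross-channel appearances (video metadata, manual review, and so on).

    from collections import deque

    def get_guests(channel):
        """Hypothetical stub: return the set of other channels whose hosts
        appear as guests in this channel's videos. In practice this meant
        watching the videos and reviewing their metadata, as Lewis did."""
        return set()

    def map_influence_network(seed_channels, max_channels=80):
        """Snowball-sample outward from a few seed channels: whenever a new
        personality shows up as a guest, add their channel to the crawl queue."""
        seen = set(seed_channels)
        queue = deque(seed_channels)
        edges = []  # (host, guest) collaboration pairs

        while queue:
            host = queue.popleft()
            for guest in get_guests(host):
                edges.append((host, guest))
                if guest not in seen and len(seen) < max_channels:
                    seen.add(guest)
                    queue.append(guest)

        return seen, edges

    # Example: start from two seed channels and report the network size.
    channels, collaborations = map_influence_network({"channel_a", "channel_b"})
    print(len(channels), "channels,", len(collaborations), "collaboration edges")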
After mapping the network, Lewis presents three main findings:
  • These influencers built an alternative media network by emphasizing their relatability, “authenticity,” and accessibility to their fans. They portray themselves as social underdogs, outcasts, and victims, giving them a countercultural ethos that can be attractive to younger viewers.
  • The influencers have effectively promoted themselves using tactics including “ideological testimonials,” in which they recount their conversion from wayward leftists into right-thinking conservatives; search engine optimization, in which they use keywords common in more neutral and liberal-oriented videos to attract viewers; and “strategic controversy,” which is to say stunts.
  • The influencers encourage people to adopt a more radical set of views over time by first encouraging them to reject all non-ideological media, and then introducing them to extremist figures who offer alternative worldviews.
Lewis notes that she is not the first scholar to examine radicalization on YouTube; she cites Zeynep Tufekci’s New York Times piece and ex-YouTube employee Guillaume Chaslot’s work on the subject. Where she differs from her predecessors is in moving away from the now-standard critique that YouTube’s core problem is technological in nature. Previous work has focused on how quickly recommendation algorithms push viewers to extremist content; Lewis says the problem lies in the content itself. She writes:
While these articles identify a real problem, they treat radicalization as a fundamentally technical problem. What the section below showcases is that radicalization on YouTube is also a fundamentally social problem. Thus, even if YouTube altered or fully removed its content recommendation algorithms, the AIN would still provide a pathway for radicalization.
Lewis’s proposed solution is that YouTube should develop a strict value-based code of behavior, actively monitor the content of influencers’ videos, and discipline violators accordingly:
There is an undercurrent to this report that is worth making explicit: in many ways, YouTube is built to incentivize the behavior of these political influencers. YouTube monetizes influence for everyone, regardless of how harmful their belief systems are. The platform, and its parent company, have allowed racist, misogynist, and harassing content to remain online – and in many cases, to generate advertising revenue – as long as it does not explicitly include slurs. YouTube also profits directly from features like Super Chat which often incentivizes “shocking” content. In other words, the type of content and engagement created by the AIN fits neatly into YouTube’s business model.
The website similarly seeks policies that offer it protection for hosting user-generated content while simultaneously facing minimal liability for what those users say. This report has shown how these attempts at objectivity are being exploited by users who fundamentally reject objectivity as a valid stance. As a result, platforms like YouTube have an imperative to govern content and behavior for explicit values, such as the rejection of content that promotes white supremacy, regardless of whether it includes slurs.
It seems fair to assume that YouTube would reject this notion out of hand. (The criticism would start with “it doesn’t scale” and go from there.) But there are certainly smaller steps YouTube could take in the meantime. Lewis notes the glee with which one conservative provocateur received his plaque for attracting 1 million subscribers; surely, she writes, the company could choose to withhold trophies from people arguing against equality or targeting others with harassment.
At the very least, I hope YouTube employees will read this report, if only to understand how some of the platform’s most influential users are exploiting its viral mechanics to promote white supremacy and other noxious views.

Democracy
Polarization in Poland: A Warning From Europe
Bertelsmann to Merge Unit That Moderates for Facebook With a Competitor
Instagram will use ads to help users register to vote
Mark Zuckerberg on Why We Should Support the Dreamers
Elsewhere
ACLU says Facebook allowed discriminatory job ads that didn’t appear to women - The Verge
Facebook and Financial Firms Tussled for Years Over Access to User Data
Facebook Seeks Engineers for Custom AR Chips ($)
Instagram could still develop a new shopping app — but here’s how it’s trying to woo window-shoppers in its current one
Why PayPal’s crackdown on ASMR creators should worry you
A Viral Tweet Stole Fetish Model's Photos, Told Fake Domestic Abuse Story to Sell Skinny Tea
Palmer Luckey Is Just Getting Started
Jack Dorsey on ProPublica's Experimental Journalism
Launches
Twitter will soon let you switch between chronological and ranked feeds
If You See Disinformation Ahead of the Midterms, We Want to Hear From You
HQ Trivia looks to expand with HQ Words, a new Wheel of Fortune-style game
YouTube is offering its membership benefits to smaller creators
Pinterest launches API that lets brands find and track influencers
iPhone XS review: the XS and XS Max are solid updates to a winning formula
Takes
Infowarzel
A New Twitter Feature: Smart Accounts
Updates
Media Manipulation, Strategic Amplification, and Responsible Journalism
Facebook Says This Post About A Firing Squad For A Philippine Senator Doesn't Violate Its Rules
And finally ...
Elon Musk recruits Dogecoin creator to fight cryptocurrency scambots
Talk to me
Send me tips, questions, comments, corrections, and radicalizing videos: casey@theverge.com.