
How social networks turn teens into extremists

May 14 · Issue #329
The Interface
In March, in the aftermath of the Christchurch shooting, I tried to distinguish between internet problems and platform problems. Internet problems arise from the existence of a free and open network that connects most of the world; platform problems arise from features native to the platform. The fact that anti-vaccination zealots can meet online is an internet problem; the fact that Facebook recommended that new mothers join anti-vaccination groups is a platform problem.
The recent rise in white supremacist violence around the world has given us fresh reason to ask which aspects of the problem belong to the entire internet, and which belong to our biggest social networks. It seems apparent that the internet is cultivating loose but potent networks of extremists. But what are the mechanics of this radicalization? And what role could platforms play in discouraging it?
I thought about that question while reading Joe Bernstein’s unsettling piece about Soph, a 14-year-old YouTuber who has gained a measure of fame (and 800,000 subscribers) by preaching a slur-laden gospel of homophobia, Islamophobia, and racism.
Bernstein’s profile paints a picture of a child who gained notoriety as a foul-mouthed 9-year-old broadcasting herself playing video games. The more outrageous her behavior, the more YouTube’s algorithms rewarded her with attention, until she was making $1,700 a month from Patreon subscribers and feeling comfortable enough to make death threats against YouTube’s CEO:
Last month, after YouTube deactivated comments on her videos — the platform disabled comments on all videos with children in response to an outcry over the aforementioned network of exploitation — Soph uploaded a 12-minute video in which she seemed to be daring the platform to suspend her, knowing full well that it wouldn’t.
“Susan, I’ve known your address since last summer,” Soph said, directly addressing YouTube CEO Susan Wojcicki. “I’ve got a Luger and a mitochondrial disease. I don’t care if I live. Why should I care if you live or your children? I just called an Uber. You’ve got about seven minutes to draft up a will. … I’m coming for you, and it ain’t gonna be pretty.”
By mid-afternoon on Tuesday, the video had been removed, and Soph’s channel suspended. In part, channels like Soph’s seem inevitable — offer everyone in the world a microphone, and offer the biggest rewards to those who get our attention in the most novel ways, and some of your creators are going to break bad.
On the other hand, Bernstein’s profile carries with it a poignant sense that Soph is not well. She has an illness, she’s unhappy in school, and she feels alone. There’s nothing wrong with feeling that way, or with seeking comfort in an online audience of friends and strangers. And it’s fair to ask where Soph’s parents are in all this.
But YouTube allows children to start channels as young as 13 — and Soph was apparently active on her channel at the age of 9. As Bernstein writes: “Soph’s popularity raises another, perhaps more difficult question, about whether YouTube has an obligation to protect such users from themselves — and one another.” Put another way: a child who becomes a hero to bigots because of her performances on the platform, and who is recommended to other users by its algorithm, is a platform problem.
Bernstein’s profile of Soph recalls another recent story about a teenager’s embrace of the alt-right. In this month’s issue of the Washingtonian, an anonymous parent recalls the experience of their child’s gradual radicalization after being falsely accused of sexual harassment. Reddit and 4Chan were happy to tell 13-year-old Sam what he wanted to hear, the author writes:
Those online pals were happy to explain that all girls lie—especially about rape. And they had lots more knowledge to impart. They told Sam that Islam is an inherently violent religion and that Jews run global financial networks. (We’re Jewish and don’t know anyone who runs anything, but I guess the evidence was convincing.) They insisted that the wage gap is a fallacy, that feminazis are destroying families, that people need guns to protect themselves from government incursions onto private property. They declared that women who abort their babies should be jailed.
Sam prides himself on questioning conventional wisdom and subjecting claims to intellectual scrutiny. For kids today, that means Googling stuff. One might think these searches would turn up a variety of perspectives, including at least a few compelling counterarguments. One would be wrong. The Google searches flooded his developing brain with endless bias-confirming “proof” to back up whichever specious alt-right standard was being hoisted that week. Each set of results acted like fertilizer sprinkled on weeds: A forest of distortion flourished.
His parents attempt to reason with him, to no avail. Sam becomes a moderator of a right-wing Reddit forum, and his parents begin questioning their own reality:
One weekend morning as we were folding laundry in our room, Sam sat on the edge of our bed and instructed us on how to behave if the FBI ever appeared at our door.
What was posturing and what was real? We suspected the former and doubted the latter, but we had no way to be sure. The situation evolved faster than we could frame the questions, much less figure out the answers. When we did confront Sam—say, if we caught a glimpse of a vile meme on his phone—he assured us that it was meant to be funny and that we didn’t get it. It was either “post-ironic” or referenced multiple other events that created a maze-like series of in-jokes impossible for us to follow.
What finally snaps Sam out of it, in his parent’s telling, is visiting a pro-Trump rally in 2017. There he sees a lone counter-protester holding up a picture of Heather Heyer, the demonstrator murdered at a white supremacist rally in Charlottesville, and Sam marvels at the counter-protester’s bravery. He later tells his parents he feels as if he had been hostage to a cult.
Sam’s story is ultimately a hopeful one, because it shows a path away from right-wing radicalization. It takes time, skillful parenting, and a capacity for self-reflection on the part of young people like Sam. But young people can and very much do shed old identities as they grow up.
That said: not everyone has skillful parents or the capacity for self-reflection. Not everyone is 14 years old and in the middle of an experimental phase — the alleged Christchurch shooter was 28. Some of the extremists who are nurtured on these platforms never come back.
The project of depolarizing the globe, and pushing extremists back to the margins, will require far more than software fixes and policy updates. But as I think about YouTubers like Soph, I hope those platforms are doing the same kind of self-reflection that Sam did. Coming in after the bigot gets 800,000 followers, and removing her hate videos after they have gone viral, should only be the first step. The more urgent question is why Soph found such a rapt audience — and how YouTube helped her build it.

Democracy
DeSantis: Russians accessed 2 Florida voting databases
Facebook facing 20-year consent agreement after privacy lapses: source
40% of Americans support antitrust action against Facebook after a cofounder called for it to be broken up
Google’s Censored Search Would Help China “Be More Open,” Said Ex-CEO Eric Schmidt
Elizabeth Warren Turns Down Fox News Town Hall, Calling the Network a ‘Hate-for-Profit Racket’
After year-long lockout, Twitter is finally giving people their accounts back
California is bringing law and order to big data. It could change the internet in the U.S.
How Silicon Valley’s successes are fueled by an underclass of ‘ghost workers’
Elsewhere
Update WhatsApp now to avoid spyware installation from a single missed call
Neighborhood social network Nextdoor raises $123 million at $2.1 billion valuation
Facebook Warns Advertisers ‘Clear History’ May Hurt Targeting
One video led to YouTuber James Charles losing two million subscribers
People say they care about privacy but buy devices that can spy on them anyway
Why People Fake Cancer Online
Teens Smuggle Burner Phones to Defy Parents
TikTok and Tariffs
Launches
Twitter fights vaccine misinformation with new search tool
Facebook reenables ‘View as Public’ feature following 2018 security issue
Twitter’s new Developer Labs offers beta access to rebuilt APIs
Match app adds an offline dating coach for your online dating woes
Takes
Let’s Not Put the Government in Charge of Moderating Facebook
Group Chats Are Making the Internet Fun Again
And finally ...
It’s Time to Break up Pinterest
Talk to me
Send me tips, comments, questions, and de-radicalization strategies: casey@theverge.com.