YouTube brings new life to Flat Earthers

February 19 · Issue #292
The Interface
It’s been a big month for conspiracy theories. Last week, Rep. Adam Schiff sent a strongly worded letter to Google and Facebook about the way their platforms recommend anti-vaccination content to parents, potentially putting healthy populations at risk. This week, new reports are taking a deeper look at the unintended consequences of YouTube recommendations, starting with the conspiracy theory that the earth is flat.
In the Guardian, Ian Sample has news of a recent presentation at the American Association for the Advancement of Science, in which a researcher from Texas Tech University discussed her findings from interviewing 30 participants in a recent convention of Flat Earthers. The takeaway: YouTube brought them there, Sample says:
Of the 30, all but one said they had not considered the Earth to be flat two years ago but changed their minds after watching videos promoting conspiracy theories on YouTube. “The only person who didn’t say this was there with his daughter and his son-in-law and they had seen it on YouTube and told him about it,” said Asheley Landrum, who led the research at Texas Tech University.
The interviews revealed that most had been watching videos about other conspiracies, with alternative takes on 9/11, the Sandy Hook school shooting and whether Nasa really went to the moon, when YouTube offered up Flat Earth videos for them to watch next.
Flat Earth videos could fit the definition of “borderline content,” which YouTube said last month it would stop recommending to users as suggested next videos to watch. (YouTube also rolled out new disciplinary procedures today.) But as Kevin Roose writes in a sharp column for the New York Times, YouTube’s efforts may be thwarted by the fact that many of its most popular creators are thriving precisely because they create borderline content. Now that they have tens of millions of followers, how much does changing the recommendation algorithm really matter?
Roose focuses on star YouTuber Shane Dawson, who has 20 million followers and recently posted a smash hit, 104-minute documentary promoting various conspiracy theories. (Dawson has previously said, of the Flat Earth theory, that it “kind of makes sense.”)
Innocent or not, Mr. Dawson’s videos contain precisely the type of viral misinformation that YouTube now says it wants to limit. And its effort raises an uncomfortable question: What if stemming the tide of misinformation on YouTube means punishing some of the platform’s biggest stars? […]
Part of the problem for platforms like YouTube and Facebook — which has also pledged to clean up misinformation that could lead to real-world harm — is that the definition of “harmful” misinformation is circular. There is no inherent reason that a video questioning the official 9/11 narrative is more dangerous than a video asserting the existence of U.F.O.s or Bigfoot. A conspiracy theory is harmful if it results in harm — at which point it’s often too late for platforms to act.
What makes this phenomenon insidious is that it can become dangerous even when no one believes the conspiracy theory being floated, at least not initially. In 2015, The Onion published a satirical op-ed by an infant who claimed he wanted to eat “one of those multicolored detergent pods.” This was followed in 2017 by a satirical video from CollegeHumor titled “Don’t Eat The Laundry Pods. Seriously (They’re Poison.)”
If you were online at all last year, you probably know what’s coming next. From the Washington Post’s history of the Tide Pods challenge:
Last year, U.S. poison control centers received reports of more than 10,500 children younger than 5 who were exposed to the capsules. The same year, nearly 220 teens were reportedly exposed, and about 25 percent of those cases were intentional, according to data from the American Association of Poison Control Centers.
So far in 2018, there have been 37 reported cases among teenagers — half of them intentional, according to the data.
The Tide Pods challenge was just a joke, until it wasn’t. Up to a certain point, videos about it were pure entertainment. So when did that change? Imagine that you’re working at YouTube. When do you flip the switch declaring the whole subject to be “borderline content”?
I don’t think this question is unanswerable. There was almost certainly a moment in the evolution of the Tide Pods story where it became clear that it had taken on a life of its own. But determining that moment in real time would require platforms to take on more of an editorial role than they have historically been comfortable with. (Forcing platforms to take such a role is, incidentally, one of the chief recommendations in the UK Parliament committee report I covered here yesterday.)
Generally I find it tedious when reporters slag platforms for “not admitting they’re a media company.” “Media company” is not a legal definition, after all, and Facebook has already acknowledged that it bears responsibility for what users post.
And so I don’t care whether tech platforms identify as media companies. But when it comes to policing the conspiracy theories that they help flourish, I do wish that they would act like media companies. When the next Tide Pods challenge arrives — and it will — a little editorial intervention could go a long way.

Democracy
FTC complaint accuses Facebook of revealing sensitive health data in groups
Expanding transparency around political ads on Twitter
What Happens When Techno-Utopians Actually Run a Country
Russia’s Network of Millennial Media
Emoji are showing up in court cases exponentially, and courts aren’t prepared
Elsewhere
Even Without Amazon, Tech Could Keep Gaining Ground in New York
Even years later, Twitter doesn’t delete your direct messages
How loot boxes hooked gamers and left regulators spinning
Launches
Slack off. Send videos instead with $11M-funded Loom
Takes
Global Britain can lead the world in confronting the dark side of big tech
The Pentagon Needs to Woo AI Experts Away From Big Tech
And finally ...
Instagram posts land former Trump confidant into deeper legal trouble
Talk to me
Send me tips, comments, questions, and evidence that the earth is round: casey@theverge.com.