The Interface

By Casey Newton

What YouTube could teach Facebook about conspiracies




July 23 · Issue #170
Months before critics revisited Facebook’s embrace of Holocaust deniers and other conspiracy peddlers, YouTube faced similar pressures. In February, a Wall Street Journal investigation found that Google’s video-sharing site routinely pushed users toward misinformation or hyper-partisan content through its automated recommendations. In a widely read follow-up in the New York Times, Zeynep Tufekci called the platform “the great radicalizer.”
Like Facebook, Google is loath to declare any topic off-limits to its user base. And so at South By Southwest, YouTube CEO Susan Wojcicki unveiled a potential solution: “information cues,” a companion product for conspiracy videos that offers users additional, non-crazy viewpoints about subjects like the moon landing and chemtrails. It began rolling out within the past two weeks, and the company does not yet have data to share about how it’s working, a YouTube spokeswoman said.
Could a similar approach work for Facebook? Writing in The Atlantic, Yair Rosenberg suggests that the company try it.
Take the Facebook page of the “Committee for Open Debate on the Holocaust,” a long-standing Holocaust-denial front. For years, the page has operated without any objection from Facebook, just as Zuckerberg acknowledged in his interview. Now, imagine if instead of taking it down, Facebook appended a prominent disclaimer atop the page: “This page promotes the denial of the Holocaust, the systematic 20th-century attempt to exterminate the Jewish people which left 6 million of them dead, alongside millions of political dissidents, LGBT people, and others the Nazis considered undesirable. To learn more about this history and not be misled by propaganda, visit these links to our partners at the United States Holocaust Museum and Israel’s Yad Vashem.”
Obviously, this intervention would not deter a hardened Holocaust denier, but it would prevent the vast majority of normal readers who might stumble across the page and its innocuous name from being taken in. A page meant to promote anti-Semitism and misinformation would be turned into an educational tool against both. The same could easily be done for pages and posts promoting conspiracy theories ranging from 9/11 trutherism to Islamophobic obsessions with impending Sharia law, working with partners ranging from the NAACP to the Anti-Defamation League to craft relevant responses and source materials.
It’s an appealing pitch, one that seeks a middle ground between free-speech absolutism and passive promotion of calls for violence. But measuring how effective it is will be difficult.
Platforms can’t read users’ minds, and it’s impossible to determine whether truthful context added to conspiracy content limits the spread of noxious ideas. In cases where a platform links to external sources, as YouTube does to Encyclopedia Britannica and Wikipedia, the company can measure the percentage of viewers who click those links. An earlier set of information cues that YouTube introduced to highlight when a channel is paid for by a government has seen high clickthrough rates, a YouTube spokeswoman said.
In the meantime, we can simply examine the text shared with viewers in these information cues. Logged out of YouTube, I ran a search for “moon landing faked.” The first video was from — disappointingly — BuzzFeed’s Blue channel, which features an otherwise unidentified guy named Matt confessing his “unpopular opinion” that the moon landing never happened. The video was terrible, and after it finished, auto-play led me directly to a video alleging that NASA “admitted” the landing was faked. (NASA has never done any such thing.)
Both the search results and individual videos came with prominent information cues: in the former case, above the results; and in the latter, directly under the video player. Here’s the text that appears:
Apollo, Moon-landing project conducted by the U.S. National Aeronautics and Space Administration in the 1960s and ’70s. The Apollo program was announced in May 1961, but the choice among competing techniques for achieving a Moon landing and return was not resolved until considerable further study. In the method ultimately employed, a powerful launch vehicle (Saturn V rocket) placed a 50-ton spacecraft in a lunar trajectory. Several Saturn launch vehicles and accompanying spacecraft were built. The Apollo spacecraft were supplied …
Notably, the relevant information here — that the moon landing actually happened — is not contained in the information cue itself. The gray box offers no hint that it is designed to serve as a counterweight to the content of the video. It feels tentative — halfhearted, even. The only thing you learn from reading the text in the box is that there was a research program aimed at landing on the moon, something even the conspiracists don’t deny.
Clicking anywhere on the text takes you to a full page about the Apollo program on Encyclopedia Britannica, which briefly describes the Apollo 11 moon landing at the end of its second paragraph. Conspiracy theories about the landing are not addressed. The presentation highlights a limitation of using third parties to rebut conspiracy theories: encyclopedia articles generally are not written, first and foremost, to rebut other writing. Their plainspoken style can often bury the lede.
In his piece on Facebook and Holocaust denial, Rosenberg calls for more explicit editorializing. (“This page promotes the denial of the Holocaust, the systematic 20th-century attempt to exterminate the Jewish people which left 6 million of them dead.”) That would likely represent an uncomfortable degree of editorial intervention at Facebook. Certainly, it would be unprecedented. But if Facebook wants to both host misinformation and prevent it from spreading, the company may have no other choice.

The Fact-Checkers Who Want to Save the World
Russia, Accused of Faking News, Unfurls Its Own ‘Fake News’ Bill - The New York Times
Egypt targets social media with new law
How did Russian agents use Twitter at key moments in the 2016 election?
A global guide to state-sponsored trolling
Snapchat will shut down Snapcash, forfeiting to Venmo
Pinterest nearing $1 billion in ad revenue as it plans to IPO mid-2019
St. Louis Uber driver has put video of hundreds of passengers online. Most have no idea.
Maggie Haberman: Why I Needed to Pull Back From Twitter
YouTube is testing its own ‘Explore’ tab on iPhone
Mark Zuckerberg is a horror show. But there’s a glimmer of truth hidden in his latest blunder. - The Washington Post
The AskHistorians subreddit banned Holocaust deniers, and Facebook should too.
And finally ...
Alex Jones compares himself to Woodward and Bernstein in move to dismiss Sandy Hook lawsuit
Talk to me
Questions? Comments? Fact checks?