July 23 · Issue #170
Months before critics revisited Facebook’s embrace of Holocaust deniers and other conspiracy peddlers, YouTube faced similar pressures. In February, a Wall Street Journal investigation found that Google’s video-sharing site routinely pushed users to misinformation or hyper-partisan content through its automated recommendations. In a widely read follow-up in the New York Times, Zeynep Tufekci called the site “the great radicalizer.” Like Facebook, Google is loath to declare any topic off-limits to its user base. And so at South by Southwest, YouTube CEO Susan Wojcicki unveiled a potential solution: “information cues,” a companion product for conspiracy videos that offers users additional, non-crazy viewpoints about subjects like the moon landing and chemtrails. The feature began rolling out within the past two weeks, and the company does not yet have data to share about how it’s working, a YouTube spokeswoman said.

Could a similar approach work for Facebook? Writing in The Atlantic, Yair Rosenberg suggests that the company try it. Take the Facebook page of the “Committee for Open Debate on the Holocaust,” a long-standing Holocaust-denial front. For years, the page has operated without any objection from Facebook, just as Zuckerberg acknowledged in his interview. Now, imagine if instead of taking it down, Facebook appended a prominent disclaimer atop the page: “This page promotes the denial of the Holocaust, the systematic 20th-century attempt to exterminate the Jewish people which left 6 million of them dead, alongside millions of political dissidents, LGBT people, and others the Nazis considered undesirable.
To learn more about this history and not be misled by propaganda, visit these links to our partners at the United States Holocaust Museum and Israel’s Yad Vashem.” Obviously, this intervention would not deter a hardened Holocaust denier, but it would prevent the vast majority of normal readers who might stumble across the page and its innocuous name from being taken in. A page meant to promote anti-Semitism and misinformation would be turned into an educational tool against both. The same could easily be done for pages and posts promoting conspiracy theories ranging from 9/11 trutherism to Islamophobic obsessions with impending Sharia law, working with partners ranging from the NAACP to the Anti-Defamation League to craft relevant responses and source materials.

It’s an appealing pitch, one that seeks a middle ground between free-speech absolutism and the passive promotion of calls for violence. But measuring its effectiveness will be difficult. Platforms can’t read users’ minds, and it’s impossible to determine directly whether truthful context added to conspiracy content limits the spread of noxious ideas. In cases where platforms link to external sources, as YouTube does to Encyclopedia Britannica and Wikipedia, YouTube measures the percentage of viewers who click those links. An earlier set of information cues that YouTube introduced to highlight when a channel is paid for by a government has seen high clickthrough rates, a YouTube spokeswoman said.

In the meantime, we can simply examine the text shared with viewers in these information cues. Logged out of YouTube, I ran a search for “moon landing faked.” The first video was from — disappointingly — BuzzFeed’s Blue channel, which features an otherwise unidentified guy named Matt confessing his “unpopular opinion” that the moon landing never happened. The video was terrible, and after it finished, auto-play led me directly to a video alleging that NASA “admitted” the landing was faked. (NASA has never done any such thing.)
Both the search results and individual videos came with prominent information cues: in the former case, above the results; and in the latter, directly under the video player. Here’s the text that appears:
Apollo, Moon-landing project conducted by the U.S. National Aeronautics and Space Administration in the 1960s and ’70s. The Apollo program was announced in May 1961, but the choice among competing techniques for achieving a Moon landing and return was not resolved until considerable further study. In the method ultimately employed, a powerful launch vehicle (Saturn V rocket) placed a 50-ton spacecraft in a lunar trajectory. Several Saturn launch vehicles and accompanying spacecraft were built. The Apollo spacecraft were supplied …

Notably, the relevant information here — that the moon landing actually happened — is not contained in the information cue itself. The gray box offers no hint that it is designed to serve as a counterweight to the content of the video. It feels tentative — halfhearted, even. The only thing you learn from reading the text in the box is that there was a research program aimed at landing on the moon, something even the conspiracists don’t deny. Clicking anywhere on the text takes you to a full page about the Apollo program on Encyclopedia Britannica, which briefly describes the Apollo 11 moon landing at the end of its second paragraph. Conspiracy theories about the landing are not addressed.

The presentation highlights a limitation of using third parties to rebut conspiracy theories: encyclopedia articles generally are not written, first and foremost, to rebut other writing. Their plainspoken style can often bury the lede. In his piece on Facebook and Holocaust denial, Rosenberg calls for more explicit editorializing. (“This page promotes the denial of the Holocaust, the systematic 20th-century attempt to exterminate the Jewish people which left 6 million of them dead.”) That would likely represent an uncomfortable degree of editorial intervention at Facebook. Certainly, it would be unprecedented. But if Facebook wants to both host misinformation and prevent it from spreading, the company may have no other choice.
The Fact-Checkers Who Want to Save the World
Kate Knibbs writes that the fact-checking industry is growing but still inadequate to the task of policing digital media: While fact-checking organizations originally sprang up as attempted antidotes to political misinformation and hoaxes, their role has ballooned into ad hoc and woefully incomplete corrections departments for the digital world. Some major fact-checking organizations have entered into asymmetrical relationships with big platforms, which means their efforts at debunking misinformation rely on the same social networks responsible for spreading misinformation. The end result is maddening for anyone trying to figure out where to find trustworthy information. The rise of fact-checking has not resulted in a more orderly or easy-to-understand internet. Right now, fact-checkers fighting lies online resemble volunteer firefighters equipped with pails of water to fight a five-alarm blaze.
Russia, Accused of Faking News, Unfurls Its Own ‘Fake News’ Bill - The New York Times
Repressive state using “fake news” as an excuse to suppress an independent media, part one: The bill, submitted by lawmakers from the governing party, United Russia, proposes holding social networks accountable for “inaccurate” comments users post. Under existing Russian law, social media users can be punished for content deemed to promote homosexuality, to threaten public order or to be “extremist” in nature, with fines as well as prison time. Under the proposed rule, part of a creeping crackdown on digital rights under President Vladimir V. Putin, websites with more than 100,000 daily visitors and a commenting feature must take down factually inaccurate posts or face a fine of up to 50 million rubles, about $800,000.
Egypt targets social media with new law
Repressive state using “fake news” as an excuse to suppress an independent media, part two: Under the law passed on Monday, social media accounts and blogs with more than 5,000 followers on sites such as Twitter and Facebook will be treated as media outlets, which makes them subject to prosecution for publishing false news or incitement to break the law. The Supreme Council for the Administration of the Media, headed by an official appointed by President Abdel Fattah al-Sisi, will supervise the law and take action against violations.
How did Russian agents use Twitter at key moments in the 2016 election?
Craig Timberg and Shane Harris report on a Clemson University project to tie flurries of Russian agent tweets to specific events during the 2016 campaign: These questions flow from the work of a pair of Clemson University researchers who have assembled the largest trove of Russian disinformation tweets available so far. The database includes tweets between February 2014 and May 2018, all from accounts that Twitter has identified as part of the disinformation campaign waged by the Internet Research Agency, based in St. Petersburg and owned by an associate of Russian President Vladimir Putin. Collectively the new data offer yet more evidence of the coordinated nature of Russia’s attempt to manipulate the American election. The Clemson researchers dubbed it “state-sponsored agenda building.”
A global guide to state-sponsored trolling
Michael Riley, Lauren Etter, and Bibhudatta Pradhan take a comprehensive and disturbing look at how states are using platforms to crack down on dissent and protect their own power. Here’s an amazing sentence: “In Venezuela, prospective trolls sign up for Twitter and Instagram accounts at government-sanctioned kiosks in town squares and are rewarded for their participation with access to scarce food coupons.” Only a few years after Twitter and Facebook were celebrated as the spark for democratic movements worldwide, states and their proxies are hatching new forms of digitally enabled suppression that were unthinkable before the age of the social media giants, according to evidence collected from computer sleuths, researchers and documents across more than a dozen countries. Combining virtual hate mobs, surveillance, misinformation, anonymous threats, and the invasion of victims’ privacy, states and political parties around the globe have created an increasingly aggressive online playbook that is difficult for the platforms to detect or counter.
Snapchat will shut down Snapcash, forfeiting to Venmo
On one hand, it seems strange to kill a payments product amidst Snap’s broader expansion into e-commerce. On the other, I assume Snap wants to own the entire payments flow, and getting rid of Square as a partner could be a first step toward that.
Pinterest nearing $1 billion in ad revenue as it plans to IPO mid-2019
Pinterest — or Pinterest investors — leak some good news about the company on its way to an IPO:
Pinterest has taken a long time to justify its monstrous private market valuation. However the social media company is finally approaching $1 billion in ad revenue as it pushes toward an IPO in mid-2019, according to people familiar with the matter. After hitting $500 million in sales in 2017, Pinterest is on pace to almost double that this year, said the people, who asked not to be named because the company’s financials are private. Pinterest is having particular success with mobile ads, as the site becomes a more popular place for big fashion and beauty brands to get in front of the company’s 200 million monthly active users.
St. Louis Uber driver has put video of hundreds of passengers online. Most have no idea.
Here is a dystopian story about an Uber driver who made extra money by broadcasting his passengers’ rides on Twitch without their consent. He was banned from every relevant service the next day.
Maggie Haberman: Why I Needed to Pull Back From Twitter
Twitter is a video game where you collect as many followers as possible. Collecting too many breaks the game, though. So a lot of Twitter winners wind up quitting: In the more than 20 months since President Trump was elected, I have gained close to 700,000 Twitter followers. I consider myself fortunate to have had such a broad audience. I mostly enjoyed being able to interact with readers and suspect I will again someday. It just won’t be soon.
YouTube is testing its own ‘Explore’ tab on iPhone
I read these paragraphs and still have basically no idea what the Explore tab is: The idea behind Explore is to offer YouTube viewers a wider variety of what-to-watch suggestions than they receive today. Currently, personalized video recommendations are very much influenced by past viewing activity and other behavior, which can then create a sort of homogenous selection of recommended content. “Explore is designed to help you be exposed to different kinds of topics, videos or channels that you might not otherwise encounter, but they’re still personalized,” said Tom Leung, Director of Product Management, in a YouTube video.
Mark Zuckerberg is a horror show. But there’s a glimmer of truth hidden in his latest blunder. - The Washington Post
Margaret Sullivan tries to find a middle path through Facebook’s stance on Holocaust denialism: More simply, Facebook could recognize that Holocaust denialism is hate speech, and forbid it on those grounds. But widespread censorship and chasing down every falsehood ought to be acknowledged as bad ideas. Vera Eidelman, an American Civil Liberties Union fellow, rightly noted last week that already marginalized voices — activists of color, for example — are likely to be silenced first if Facebook expands its censorship powers. Zuckerberg got that much right, though in a remarkably misguided way.
The AskHistorians subreddit banned Holocaust deniers, and Facebook should too.
Johannes Breit on how Reddit’s AskHistorians deals with Holocaust deniers — and why all such denialism represents an incitement to violence: Conversation is impossible if one side refuses to acknowledge the basic premise that facts are facts. This is why engaging deniers in such an effort means having already lost. And it is why AskHistorians, where I am one of the volunteer moderators, takes a strict stance on Holocaust denial: We ban it immediately. Deniers need a public forum to spread their lies and to sow doubt among readers not well-informed about history. By convincing people that they might have a point or two, they open the door for further radicalization in pursuit of their ultimate goal: to rehabilitate Nazism as an ideology in public discourse by distancing it from the key elements that make it so rightfully reviled—the genocide against Jews, Roma, Sinti, and others. Clarifying, as Zuckerberg later did, that Facebook would remove posts for “advocating violence” will never be effective for a simple reason. Any attempt to make Nazism palatable again is a call for violence. More than 11 million victims prove that. Because Holocaust deniers want and need a platform to reach this goal, it is imperative to deny it to them, as an institution, a newspaper, or a social media forum.
Alex Jones compares himself to Woodward and Bernstein in move to dismiss Sandy Hook lawsuit
Infowars’ Alex Jones has claimed the mantle of Watergate heroes Woodward and Bernstein in defending himself against a defamation case. But even more surprising is the nugget presented at the end here: Jones now believes Sandy Hook happened. Right-wing radio host and conspiracy theorist Alex Jones argued he was acting as a journalist, comparing himself to the Washington Post reporters who uncovered the Watergate scandal, when he questioned on his talk show “Infowars” the official narrative given by officials in the 2012 Sandy Hook school shooting. Presumably he will now convey that message to his followers and actively monitor them in hopes that they will stop harassing families of the victims.
Questions? Comments? Fact checks? casey@theverge.com