What to do about the fake Pelosi video

May 28 · Issue #333
The Interface
I.
On Friday, the Washington Post reported that a video purporting to show House Speaker Nancy Pelosi slurring her words was racking up millions of views and shares on social networks, with Facebook leading the way on engagement. In reality, the (still unknown) creator of the video had slowed footage of Pelosi to 75 percent of its original speed, while adjusting the pitch of her voice to make it sound more natural. The result was catnip for conservative partisans eager to paint the congresswoman as a drunken buffoon.
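To get a sense of how little expertise this kind of fake requires: the general effect the Post described can be reproduced with a few lines of Python driving the free ffmpeg tool. To be clear, this is a minimal sketch of the technique, not the creator’s actual method; the file names and the use of ffmpeg are my assumptions.

    # Slow a clip to 75 percent speed while keeping the voice's pitch
    # roughly natural. Requires ffmpeg to be installed; "input.mp4" and
    # "output.mp4" are hypothetical file names.
    import subprocess

    SPEED = 0.75  # play back at 75 percent of the original speed

    subprocess.run(
        [
            "ffmpeg",
            "-i", "input.mp4",
            "-filter_complex",
            # setpts stretches the video's timestamps; atempo slows the
            # audio by the same factor without lowering its pitch, so the
            # speech sounds slurred rather than artificially deepened.
            f"[0:v]setpts=PTS/{SPEED}[v];[0:a]atempo={SPEED}[a]",
            "-map", "[v]",
            "-map", "[a]",
            "output.mp4",
        ],
        check=True,
    )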
The video’s rapid spread around the internet sparked new fears that our politics were on the cusp of being radically and irreversibly changed by the introduction of digitally altered propaganda. Over the weekend, the situation generated an extraordinary amount of commentary — on what it suggests about our future, and on what social networks should do about it.
Facebook ran its standard misinformation playbook: it labeled the video as false, and showed anyone who tried to share it an opaque pseudo-warning noting that there is “additional reporting available.” Monika Bickert, Facebook’s head of global policy management, went on Anderson Cooper 360 to defend this approach.
Cooper asked Bickert why Facebook kept the video up. As Ian Bogost recounts in The Atlantic:
This line of thinking seemed to perplex Cooper, and rightly so. Why would an immediate impact, such as inciting violence in an acute conflict, be wrong, but a deferred impact, such as harming the reputation of the woman who’s third in line for the presidency, be okay?
Once the content exists, Bickert implied, the company supports it as a tool to engender more content. “The conversation on Facebook, on Twitter, offline as well, is about the video being manipulated,” Bickert responded, “as evidenced by my appearance today. This is the conversation.” The purpose of content is not to be true or false, wrong or right, virtuous or wicked, ugly or beautiful. No, content’s purpose is to exist, and in so doing, to inspire “conversation”—that is, ever more content.
Meanwhile, Axios said the video had ushered in “our sad, new, distorted reality.” Charlie Warzel said Facebook had become a perfect machine for hijacking our attention. Kara Swisher said the incident shows “how expert Facebook has become at blurring the lines between simple mistakes and deliberate deception, thereby abrogating its responsibility as the key distributor of news on the planet.” Joshua Topolsky encouraged people to delete Facebook until it becomes willing to make editorial judgment calls.
And on the opposite side, commentators worried about a world in which platforms make editorial decisions with no recourse available to those whose speech is deemed out of bounds. “A lot of the commentary about the Pelosi video is ‘not even wrong,’ as it does not put forward any consistent or realistic enforcement standard other than ‘take down stuff I don’t like,’” said Alex Stamos.
II.
While all of this was playing out, Bay Area TV station KTVU reported on the story of Kate Kretz, an artist who sews Make America Great Again hats into hate speech symbols, such as a Ku Klux Klan hood or a Nazi armband. Kretz’s work is intended as a protest of the Trump administration’s racist policies, but earlier this month, Facebook removed her work for violating its community guidelines against hate speech:
In early May, Facebook removed Kretz’ images of her latest work for violating community standards. The artist protested, re-uploaded her images, but this time with a disclaimer stating that her art was not hate speech, and in fact was commentary on hate speech, much like a political cartoon.
Then Facebook disabled her account. 
In both the Pelosi and the Kretz cases, we find people altering artifacts of political speech in an effort to influence our politics. Both are protected under the First Amendment. Whether they are protected under Facebook’s community guidelines is more debatable. The spirit of Facebook’s rules would seem to prohibit a distorted propaganda video, and to permit photos of some fairly literal political art. But in practice, Facebook made the opposite judgment.
The reason is that, for all of the consequences it has on politics, Facebook is determined to stay above the fray. (Or maybe right next to the fray, where people might more easily post about it on Facebook.) The company doesn’t understand the difference between a propaganda video and a piece of art because, in a very serious way, it does not want to. To understand would be to take on expensive new responsibilities, and open itself up to new lines of political attack, at a time when it faces significant new regulatory threats around the world.
Among top Facebook executives, this posture of strained neutrality is the only one that feels possible, whatever brickbats it may face in the press as a result. A policy that enables the maximum amount of political speech, save for a small number of exemptions outlined in a publicly posted document, has a logical coherence that “take down stuff I don’t like” does not.
That’s one reason why the take-it-down brigade might consider developing an alternate set of Facebook community standards for public consideration. I have no doubt that there are better ways to draw the boundaries here — to swiftly purge malicious propaganda, while promoting what is plainly art. But someone has to draw those boundaries, and defend them.
Alternatively, you could break Facebook up into its constituent parts, and let the resulting Baby Books experiment with standards of their own. Perhaps WhatsApp, stripped of all viral forwarding mechanics, would find a slowed-down Pelosi video acceptable when shared from one friend to another. Meanwhile Instagram would rapidly detect the video’s surging popularity and ensure that nothing like it appeared on the app’s Explore page, where the company could unwittingly aid in its distribution the way Facebook’s News Feed algorithm did this time around. Making communities smaller can make it easier to craft rules that fit them.
In the meantime, TED’s Alexios Mantzarlis offers four good suggestions for Facebook to implement, which I’d like to echo here in my own words. One, it should act faster: if centralization is the company’s big virtue, it should use that power to detect videos like these and apply fact-checking resources before they rack up millions of views. Two, it should write its warning pop-ups in plain English. Say goodbye to “additional reporting is available,” and hello to “this video has been distorted to change its meaning.” Three, it should follow up with users who shared the video before it was identified as fake, and offer them the chance to un-share it. And four, it should share more data with the public and with researchers on the effectiveness of fact-checking.
I don’t think the Pelosi video heralds the end times for our information sphere. But I do think that debates like this, over what Facebook leaves up and what it takes down, are only going to grow more fractious as bad actors find new ways to hijack our attention. I understand why Facebook wants to avoid making editorial judgments on political videos. But doing nothing is an editorial judgment, too — and one that social platforms are increasingly going to be held to account for.

Democracy
Facebook and Twitter disable new disinformation campaign with ties to Iran
Facebook facing most probes by Irish data regulator
GDPR After One Year: Costs and Unintended Consequences
Facebook's Zuckerberg ignores subpoena from Canadian parliament, risks being held in contempt
What I Learned Trying To Secure Congressional Campaigns (Idle Words)
The Video Game PUBG Went Viral Across India. Then Police Started Arresting Its Young Players.
How China Uses High-Tech Surveillance to Subdue Minorities
China's robot censors crank up as Tiananmen anniversary nears
Behind Grindr's doomed hookup in China, a data misstep and scramble to make up
Elsewhere
Google’s Shadow Work Force: Temps Who Outnumber Full-Time Employees
Germany’s biggest publisher sales houses unite to fight Google, Facebook and Amazon
China’s ByteDance plans to develop its own smartphone
Apple promises privacy, but iPhone apps share your data with trackers, ad companies and research firms
Two days with Curvy Wife Guy, the most controversial man in body positivity
'I replied to a genuine bank tweet and lost £9,200 to a fraudster'
Launches
To Fight Deepfakes, Researchers Built a Smarter Camera
Takes
Why Fiction Trumps Truth
Harvard Professor Falls Victim to Group Outrage
Mark Zuckerberg should hire Microsoft's Brad Smith as CEO, says former Facebook security chief
Pl@ntNet is the world's best social network
And finally ...
David Masad
1999: there are millions of websites all hyperlinked together
2019: there are four websites, each filled with screenshots of the other three.
6:26 AM - 28 May 2019
Talk to me
Send me tips, comments, questions, and videos in which my speech has been slowed down to make me seem drunk: casey@theverge.com.