
From 'leave it up' to 'take it down'

February 23 · Issue #90
The Interface
Recently I was talking with someone who works at one of the big platforms we like to talk about in this space. The conversation turned to content moderation, and the increasingly thorny questions over which things to leave up and which to take down. Something I failed to appreciate, this person told me, was how fast the pendulum of public opinion had swung. These days companies face public pressure to take things down on a near-daily basis. But until very recently, the pressure companies faced was to leave things up.
Take Twitter. In 2012, emboldened by the way pro-democracy protesters had used its platform during the Arab Spring, the company famously declared itself “the free speech wing of the free speech party.” Its free-speech attitudes were so liberal that the company ignored years of targeted harassment of its own user base, and only belatedly began taking steps to crack down on abuse.
Or take Facebook. In 2016, the company removed an iconic Vietnam War photo from the service because it showed a naked child. After an outcry, Facebook reversed its position.
“Because of its status as an iconic image of historical importance, the value of permitting sharing outweighs the value of protecting the community by removal, so we have decided to reinstate the image on Facebook where we are aware it has been removed,” the company said.
Or take YouTube. Efforts to get videos removed from the platform — whether for copyright or First Amendment reasons — are the subject of a long, and impressively international, Wikipedia page. For more than a decade, the company has worked to build an organizational capability to defend against the removal of videos. 
When the company agreed to block access to an anti-Muslim video in 2012, it was a major international news story. A Times report from that year captured the increasing pressure YouTube faced to remove content — and how hard the company worked to avoid doing so, with seemingly little outcry here in the United States.
Requests for content removal from United States governments and courts doubled over the course of last year to 279 requests to remove 6,949 items, according to Google. Members of Congress have publicly requested that YouTube take down jihadist videos they say incite terrorism, and in some cases YouTube has agreed.
Google has continually fallen back on its guidelines to remove only content that breaks laws or its terms of service, at the request of users, governments or courts, which is why blocking the anti-Islam video was exceptional.
Then came the 2016 election, and our frame flipped. Revelations about the extent, and sophistication, of Russia's disinformation war changed the way we talked about the content on publicly hosted platforms. What were once abstract conversations about free-speech absolutism suddenly looked like urgent questions about the future of democracy. Leave it up gave way to take it down.
That’s one way to think about today’s news that InfoWars, a far-right media organization that exists to peddle noxious conspiracy theories, has received its first “strike” from YouTube. Paul P. Murphy had the scoop:
The Alex Jones Channel, Infowar’s biggest YouTube account, received one strike for that video, a source with knowledge of the account told CNN. YouTube’s community guidelines say if an account receives three strikes in three months, the account is terminated.
That video focused on David Hogg, a strong voice among survivors of the mass shooting at Marjory Stoneman Douglas High School. … On Wednesday, YouTube removed the video from InfoWars’ page for violating its policies on harassment and bullying. The video was titled, “David Hogg Can’t Remember His Lines In TV Interview.”
Jones expert Charlie Warzel points out the seeming absurdity of this being InfoWars’ first strike. A previous obsession with the ludicrous Pizzagate conspiracy would seemingly have tripped the same triggers for harassment. (A deranged man who had read about the conspiracies, which held that a Washington pizzeria was the site of a child trafficking ring, was sentenced to four years in prison after firing an assault rifle inside.)
That sentencing happened less than a year ago. One reason I’ve been writing this newsletter five times a week is that we are watching the big platforms’ views on these subjects change in real time — a lesser frequency couldn’t capture it.
The “leave it up” era is over. The “take it down” era is upon us.

Report: There are 149 fact-checking projects in 53 countries. That’s a new high.
YouTube excels at recommending videos — but not at detecting hoaxes
The science of conspiracies: Where Flat Earth meets Pizzagate
RIP Facebook Live: As subsidies end, so does publisher participation
Amazon Targeted in Calls to Drop NRA TV App, Which Is Also on Apple TV, Roku, Chromecast
Meanwhile, the optics of this aren’t great:
Sean Morrow
Facebook is at CPAC and they have a VR shooting game
Colorado police turned to Snapchat to solve a drug murder
Snap is giving some small startups free ads on Snapchat
House of Highlights is turning Instagram into must-watch TV for young sports fans
YouTube Holds Spending for TV, Films While Rivals Bulk Up
Alto’s Odyssey and the art of the perfect sequel
What counterterrorism can teach us about thwarting Russian disinformation
How Trump Conquered Facebook Without Russian Ads
Can Kardashians Trade on Tweets?
And finally ...
How Manafort’s inability to convert a PDF file to Word helped prosecutors
Talk to me
Thanks to the unusually large number of people who took time this week to write in and say they were enjoying the newsletter. I love bringing it to you every day, and your nice notes keep me going! Keep those questions, comments, and weekend plans coming: