3 unanswered questions about Facebook hoaxes and threats

July 24 · Issue #171
The Interface
Another day, another high-profile outrage spreading virally on Facebook. This time around it’s our frequent subject here Alex Jones, of Infowars, who yesterday went on a rant in which he tiptoed very close to the line of calling for violence against special counsel Robert Mueller. Charlie Warzel has the details in BuzzFeed:
On his Monday afternoon show, Jones issued a prolonged rant against special counsel Robert Mueller, accusing him of raping children and overseeing their rape, and then pantomiming shooting the former FBI director. The show was streamed live on Jones’ personal, verified Facebook page, which has nearly 1.7 million likes.
In the clip, Jones baselessly accused Mueller of having sex with children. “They’d let Mueller rape kids in front of people, which he did,” he said on the show.
Facebook told Warzel the rant did not amount to a credible threat of violence, and left the post up. It had about 46,000 views as of this morning.
Later in the day, Facebook held a previously scheduled conference call with reporters to discuss its work on misinformation and elections. Five executives who work on issues including News Feed integrity, security policy, and elections laid out what they’re doing to improve the service. There were no major new announcements, but the question-and-answer period that followed gave reporters a chance to ask about the Infowars issue.
“We know people don’t want to see false information at the top of their News Feed,” said Tessa Lyons, the head of News Feed Integrity. Lyons went on to say that the company believes it has a responsibility to limit the distribution of hoaxes. And, in cases where those hoaxes have created an imminent threat of harm, Facebook — as of last week, in just two countries — will remove it from the platform.
The current debate over Infowars on Facebook, which is now in its third week, has hit a bit of an impasse. Axios tried to move it forward today with two pieces — one, by Ina Fried, surveying media types about what Facebook should do; and another, by Sara Fischer, offering a broader range of solutions for all of Facebook’s news-related problems.
Both pieces are worth reading, even if Fischer’s in particular comes across as rather pessimistic. (“Facebook may not be able to do much more than it has already tried, unless it makes a drastic change that would impact its business and long-term vision.”)
While we wait for a more comprehensive solution, I’d settle for Facebook answering some questions that never quite found answers on today’s call:
  • What data can Facebook share about misinformation seeing reduced distribution after being labeled as false? The company likes to say that such posts get 80 percent fewer views on average, but it would be helpful to see numbers for specific pages. Infowars, for example.
  • Fact-checkers say it takes an average of three days before they are able to label a Facebook post as false. Haven’t most posts already gotten the majority of their lifetime views at that point? Doesn’t that make the strategy of “reduced distribution” significantly less effective?
  • Finally, a question from my boss, Nilay Patel: by what standard does Facebook say Jones’ rant against Mueller did not represent a “credible threat of violence”? When courts make such judgments, Nilay notes, they do so by outlining their reasoning and citing the relevant precedents.
“If Facebook wants to run a legal system,” he says, “it should do that too.”

Democracy
Facebook, Trying to Move Forward in China, Registers a Chinese Subsidiary
Senator Ron Wyden reckons with the internet he helped shape
WhatsApp: WhatsApp races against time to fix fake news mess ahead of 2019 general elections
On WhatsApp, fake news is fast — and can be fatal
Russian Hackers Reach U.S. Utility Control Rooms, Homeland Security Officials Say
Elsewhere
Facebook signs agreement saying it won’t let housing advertisers exclude users by race
Why Do People Share Fake News? A Sociotechnical Model of Media Effects
Twitter is banning users who created their accounts while underage
Snap Spectacles Chief Leaves Company
How Snap Made Direct Response Ads a Big Business
Fake #WalkAway Ads Feature Images Of People From Shutterstock
Mountain View’s unusual rule for Facebook: No free food
Pinterest's head of engineering Li Fan leaves for scooter company Lime
Facebook is succeeding in spite of itself
Launches
Twitter Tightens Process for App Developers to Clean Up Site
Takes
Deepfakes, false memories, and the Mandela effect: AI is coming for our past
Were We Destined to Live in Facebook’s World?
And finally ...
A million Facebook users watched a video that blurs the line between bad satire and ‘fake news’
Talk to me
Questions? Comments? Mueller rants? casey@theverge.com