Mark Zuckerberg tries to contain a crisis

November 15 · Issue #248
The Interface
Facebook’s day was consumed with the fallout from Wednesday’s New York Times story about its slow response to Russian interference, which generated a furor greater than anything the company has seen since the Cambridge Analytica data privacy scandal. The company answered its critics, put Mark Zuckerberg on the phone with reporters for an extended question-and-answer session, and moved to shift the conversation to some important moves it is making around content moderation. It’s not yet clear whether the moves will cause the outrage to subside — or whether, as happened with Cambridge Analytica, it will metastasize over the coming days and weeks.
Let’s take a look at the day’s most important developments, in chronological order.
First, Facebook responded to the Times in a point-by-point rebuttal. You can read the blog post here. The company’s main objection to the Times piece is the suggestion that it sought to downplay or cover up Russian interference on the platform before the election. Facebook also says no one discouraged its chief security officer, Alex Stamos, from investigating the Russia problem. (It did not dispute the story’s assertion that Sheryl Sandberg, the chief operating officer, had criticized Stamos for going somewhat rogue with his investigation and possibly leaving the company exposed legally.) The board had Zuckerberg’s back, issuing a statement touting the company’s progress in fighting misinformation.
Second, Facebook held a press call to discuss its second community guidelines enforcement report. The report, which is new as of this year, measures the amount of content policing that Facebook does across its network. The company now plans to release such a report quarterly; you can read the new one here, or read Adi Robertson’s helpful gloss here. Big takeaways: governments continue to ask Facebook to take down more and more information; Facebook is reporting levels of bullying, harassment, and child exploitation for the first time; and the company deleted 1.5 billion fake accounts in the past six months.
Third, Mark Zuckerberg posted a 4,500-word “blueprint” on the future of content moderation on Facebook. You can read that post here. The post had at least two highly consequential announcements. One, Facebook will once again move to reduce the spread of sensationalist content in the News Feed. What struck me was the language Zuckerberg used to discuss this issue — it’s different from anything he has said before. And it goes to the heart of social networks’ role in creating a polarized, destabilized electorate:
One of the biggest issues social networks face is that, when left unchecked, people will engage disproportionately with more sensationalist and provocative content. This is not a new phenomenon. It is widespread on cable news today and has been a staple of tabloids for more than a century. At scale it can undermine the quality of public discourse and lead to polarization. In our case, it can also degrade the quality of our services.
Our research suggests that no matter where we draw the lines for what is allowed, as a piece of content gets close to that line, people will engage with it more on average — even when they tell us afterwards they don’t like the content.
Zuckerberg says Facebook will “train AI systems to detect borderline content so we can distribute that content less.” It remains to be seen how effective AI will be at that task, or what tradeoffs are involved. But it may be among the most important things Facebook does in the next year.
The other major announcement: an independent oversight body to review appeals for content removals. Zuckerberg first discussed the idea of a “Facebook Supreme Court” with Ezra Klein in April; I wrote about why such a body was necessary in this space in August, during the Alex Jones imbroglio. I asked Zuckerberg today whether he thought the body should publish its opinions, creating a kind of case law; he told me that he did. The body won’t be up and running until the end of 2019 at the earliest, but when it arrives we can expect a growing body of social network jurisprudence, and it’s going to be fascinating to watch.
Fourth, Zuckerberg answered questions about the most damning elements of the Times’ report. Reporters focused on the company’s decision to hire Definers Public Affairs, a Washington, DC-based public relations and opposition research firm. Facebook fired the firm Wednesday night. Zuckerberg said that neither he nor Sandberg knew that Definers even worked for them. This strained credulity, as Facebook’s own blog post noted that “our relationship with Definers was well known by the media — not least because they have on several occasions sent out invitations to hundreds of journalists about important press calls on our behalf.” (I had an item about Definers here in February.)
The Definers issue went nuclear for two reasons. One, the company circulated a document that attempted to link criticism of Facebook — wrongly, it turns out! — to George Soros. Linking events to Soros, a liberal philanthropist who escaped the Holocaust, is a well-worn tactic of anti-Semites. And Zuckerberg and Sandberg, of course, are Jewish.
Two, Definers employed what one former employee told NBC News was “an in-house fake news shop” to push messages into the broader media ecosystem. Michael Cappetta, Ben Collins, and Jo Ling Kent report:
Definers runs a website called NTK Network, which has a verified page on Facebook with more than 120,000 followers that publishes and promotes articles about the firm’s clients as well as their competitors.
A former employee of Definers, who asked not to be identified in order to protect professional relationships, told NBC News that NTK Network was “our in-house fake news shop.” Some clients would actively pay for NTK Network’s positive coverage, which the ex-employee said would then be pushed out through Facebook in the hopes of being picked up by larger conservative media outlets such as Breitbart.
And indeed, the Times found that NTK pushed dozens of pro-Facebook messages and attacks on Facebook’s competitors during the firm’s engagement with the company, some of which were picked up by Breitbart. For a company that has spoken loudly and often over the past year about its commitment to reducing the spread of misinformation, the fact that it had hired a crisis communications agency to actively spread misinformation was hypocrisy of the rankest sort. Definers had to go.
Zuckerberg suggested this was some sort of rogue operation:
“We certainly never asked them to spread anything that is not true. That’s not how we want to operate. In general, I think a lot of DC-type firms might do this kind of work. I understand why other companies might want to work with them, but that’s not the way I want to run this company.”
It’s a line that would have been more credible had Zuckerberg not run the same play in the past. In 2011, Facebook hired Burson-Marsteller to plant scaremongering stories about Google’s privacy policies. (Microsoft had hired the firm to do the same thing.) Incredibly, Facebook got away with a “no comment” at the time.
The common thread in both episodes, beyond Facebook’s CEO and COO, is Elliot Schrage, the company’s now-former head of communications, who would have been responsible for both. Schrage stepped down in June. The next time he sits down with a reporter, I hope he’ll be asked how he views the role of firms like Definers and Burson-Marsteller in promoting a company’s interests.
Fifth and finally, everyone is mad. George Soros called for an investigation. Sen. Richard Blumenthal is mad. Sen. Mark Warner is mad. Sen. Ben Sasse is mad. Sen. Ron Wyden is mad. The comptroller of New York is mad. Sen. Amy Klobuchar, who the Times story suggested had eased up on her criticisms after being personally lobbied by Sandberg, said she planned to ask the Justice Department to investigate potential violations of campaign finance laws.
In Silicon Valley, Kurt Wagner wonders who is going to be fired over this. (Zuckerberg was asked this question several times on the call, and demurred, other than to say it won’t be Sandberg.) Berkeley students say they won’t even consider working for Facebook. Alex Stamos is mad at the mass media for failing to examine its own role in the Russia story.
But I’ll end where I started: this particular Facebook scandal has gotten the attention of regular people. It’s the sort of scandal that has led friends from high school and college to text me asking what’s going on. Three of them in recent days have either deleted or deactivated their Facebook accounts. After two years of final straws, the events of this week have offered them another. “Facebook just filled with crazy idiots now,” The Onion said.
The headline was the whole story, in both senses of the phrase.

The Racist Backlash To The Migrant Caravan Is Building In WhatsApp Groups In Mexico
Nigerian police say “fake news” on Facebook is killing people
Human rights groups want Facebook to offer ‘due process’ for takedowns
A $2 Billion Question: Did New York and Virginia Overpay for Amazon?
Ramzan Kadyrov Got His Instagram Back For A Few Hours After Being Kicked Off Last Year
Facebook’s top lawyer — who said he was leaving the company — isn’t leaving after all, because Facebook is still in crisis
TikTok surges past 6M downloads in the US as celebrities join the app
From QAnon to Pizzagate, When Online Conspiracies Form Cults
Why Fox News Hasn’t Tweeted for a Week
A Facebook patent would use your family photos to target ads
Twitter's Explore tab starts sorting stories into sections
Instagram starts rolling out dashboard that shows how much time you spend on it
Instagram will now let users shop items from video posts
Amazon’s Golden Fleecing
New York’s Amazon Deal Is a Bad Bargain
And finally ...
We sang Backstreet Boys’ “I Want It That Way”
My most sincere condolences to BuzzFeed reporter Ryan Mac, who had hoped to ask Zuckerberg about this (incredibly timed) Kanye tweet from Wednesday evening. Zuckerberg extended the questioning period twice on Thursday, to his great credit, but sadly Mac was never called on.
Ain’t nothin’ but a heartache.
Talk to me
Send me tips, comments, questions, and your nomination to Facebook’s independent oversight body: