Mark Zuckerberg asks: "Who chose me?"

March 22 · Issue #103
The Interface
Until 9PM ET last night, the world thought we would be getting one interview with Mark Zuckerberg. And then the clock struck 9, and three other outlets published extensive interviews of their own: Recode, The New York Times, and Wired.
The journalists granted interviews with Facebook’s CEO did themselves proud. They asked about the full range of subjects we cover here each day, breaking new ground in several areas:

  • He expressed confidence Facebook could be protected from bad actors ahead of the midterm elections. “This isn’t rocket science. There’s a lot of hard work we have to do to make it harder for nation states like Russia to do election interference,” he told CNN. “But we can get in front of this.”
  • He told the Times that the company had deployed unspecified new artificial intelligence to fight bad actors in the recent Alabama Senate election: “In last year, in 2017 with the special election in Alabama, we deployed some new A.I. tools to identify fake accounts and false news, and we found a significant number of Macedonian accounts that were trying to spread false news, and were able to eliminate those. And that, actually, is something I haven’t talked about publicly before, so you’re the first people I’m telling about that.”
  • He further told the Times that a “meaningful” number of people had not deleted their accounts in the wake of the controversy: “I don’t think we’ve seen a meaningful number of people act on that, but, you know, it’s not good. I think it’s a clear signal that this is a major trust issue for people, and I understand that. And whether people delete their app over it or just don’t feel good about using Facebook, that’s a big issue that I think we have a responsibility to rectify.”
The response that struck me the most, though, was this exchange with Recode, which Kurt Wagner broke out into a separate article:

“What I would really like to do is find a way to get our policies set in a way that reflects the values of the community, so I am not the one making those decisions,” Zuckerberg said. “I feel fundamentally uncomfortable sitting here in California in an office making content policy decisions for people around the world.”
But then Zuckerberg said something else we haven’t heard before, which is that even though making these kinds of policy decisions makes him uncomfortable, he may no longer have a choice.
“Things like, ‘Where’s the line on hate speech?’ I mean, who chose me to be the person that did that?” Zuckerberg continued. “I guess I have to, because we’re here now, but I’d rather not.”
On one hand, this position is consistent with Facebook’s general aversion to editorial decision-making. On the other, it’s striking to hear Zuckerberg reckoning with the full weight of the responsibility that he has accepted. The answer to “who chose me” is of course Zuckerberg himself, who has ruthlessly pursued growth around the world, bought up or attempted to strangle every competitor, and worked to cement his total and individual control over the company.
Given these facts, it’s perhaps comforting that Zuckerberg says he hopes to avoid “making content policy decisions for people around the world.” But the question of who can be trusted to make those decisions, and how to ensure they are enforced evenly, remains a significant challenge.
At the Times, Kevin Roose asked Zuckerberg how he felt about the situation in Myanmar, where the United Nations has blamed Facebook for spreading hate speech that has incited violence against the minority Rohingya population. Elsewhere in his interviews, the CEO came across as warm and empathetic. But his response here struck me as rather cold:
It’s certainly true that, over the course of Facebook, I’ve made all kinds of different mistakes, whether that’s technical mistakes or business mistakes or hiring mistakes. We’ve launched product after product that didn’t work. I spend most of my time looking forward, trying to figure out how to solve the issues that people are having today, because I think that’s what people in our community would want.
Technical mistakes, business mistakes, hiring mistakes: errors of this kind are far less consequential, in the long run, than the murders of innocent people caught up in the spread of viral hoaxes and calls for violence.

Zuckerberg’s interview tour strikes me as a net positive for the company: it seemed to slow the bleeding around the Cambridge Analytica scandal, without quite stopping it.
But as I noted here yesterday, Facebook is currently confronting multiple crises. Today, data privacy is on everyone’s mind. But once that fades, the Rohingya won’t be better off. Nor will the Muslim minority in Sri Lanka, where the government recently blocked Facebook temporarily in response to a wave of violence organized in part through Facebook-owned apps.
It feels odd to describe challenges like these in terms of “what the community wants.” “I spend most of my time looking forward,” Zuckerberg said, and I wonder whether he would be better served in this moment by simply looking around.

Democracy
A key congressional committee has asked Facebook CEO Mark Zuckerberg to testify about Cambridge Analytica
Here’s the transcript of Recode’s interview with Facebook CEO Mark Zuckerberg about the Cambridge Analytica controversy and more - Recode
Zuckerberg’s Crisis Response Fails to Quiet Critics
Facebook Exec: ‘We’ve Been Caught Flat-Footed’ on Data Scandal
Mark Zuckerberg and Facebook's battle to kill the Honest Ads Act
The Cambridge Analytica Whistleblower Said He Wanted To Create “The NSA’s Wet Dream”
Facebook gave data about 57bn friendships to academic
Read the full Kogan email: Researcher says Facebook is scapegoating him
Kenyans face a fake news epidemic. They want to know just how much Cambridge Analytica and Facebook are to blame.
Ivanka Trump visit to Salon Spa W in Des Moines, Iowa sparks social media backlash
Under Fire and Losing Trust, Facebook Plays the Victim
My Cow Game Extracted Your Facebook Data
Elsewhere
Facebook scandal could push other tech companies to tighten data sharing
Mozilla Presses Pause on Facebook Advertising
A Brief History Of Mark Zuckerberg Apologizing (Or Not Apologizing) For Stuff
‘The data hasn’t gone away’: How Facebook opened Pandora’s box of user data and has struggled to shut it
Mark Zuckerberg Promises That Misuse Of Facebook User Data Will Happen Again And Again 
How the Crowd Led Us to Investigate
How the Internet Breaks Your Brain
Launches
Instagram will be better about showing you new pictures first
Some cruel pranksters are trying to trick Dave Morin into relaunching his failed Facebook clone, Path. If this works it will be absolutely amazing!!
DAVE MORIN
Overwhelmed by requests to rebuild a better @Path. Considering doing it. If you are interested in working on such an idea, DM me. Let's see if a passionate team forms. If so, we'll do it.
5:59 PM - 21 Mar 2018
Takes
As part of my extreme thirst to experiment with new media platforms, I worked with The Verge’s ace video crew to put together a video explainer of the Cambridge Analytica story for YouTube. Smash that like button fam!
Do not @ me about this thumbnail, I did not choose it
Cambridge Analytica and Our Lives Inside the Surveillance Machine
Mark Zuckerberg Finally Speaks About Cambridge Analytica; It Won't Be Enough
Cigarettes are the vice America needs
Facebook Needs a 'Seinfeld' Lesson on Crisis Management
And finally ...
Massive Attack Leave Facebook Due to Cambridge Analytica Scandal
You might say that Facebook is shedding a Teardrop!
What do you know that I don't?
casey@theverge.com or DM me for my Signal!