Zuckerberg delivers a progress report on election security

September 13 · Issue #205
The Interface
Since the aftermath of the 2016 election, Facebook has invested millions of dollars in an effort to shore up the platform against future attacks. Late Wednesday night, Mark Zuckerberg published a 3,300-word progress report on how the company has been doing.
The report contained little in the way of news. The steps that Zuckerberg outlined have been announced publicly every step of the way. They include:
  • Removing fake accounts.
  • Removing posts that use hoaxes to incite violence.
  • Preventing publishers of hoaxes from selling ads against their content.
  • Forming partnerships with nonpartisan fact checkers to rate disputed posts.
  • Requiring advertisers to verify their identities and allowing the public to see relevant information about all ad campaigns on Facebook.
  • Setting up an independent election research commission to let outside academics examine the influence of social networks on democracy.
  • Coordinating with other social platforms and the government to identify and remove influence campaigns.
Zuckerberg concludes:
In 2016, we were not prepared for the coordinated information operations we now regularly face. But we have learned a lot since then and have developed sophisticated systems that combine technology and people to prevent election interference on our services.
This effort is part of a broader challenge to rework much of how Facebook operates to be more proactive about protecting our community from harm and taking a broader view of our responsibility overall.
Given that we don’t fully understand the nature of the threats on the platform — many are uncovered only after the fact — it’s impossible to say with any certainty how effective these measures have been. Still, Zuckerberg’s post highlights a series of earnest, good-faith efforts at Facebook to prevent the problems that marred the 2016 election from happening again. This list compares favorably to efforts at YouTube and Twitter, which generally have been slower to act and less forthcoming about what they’re doing.
At the same time, a post like Zuckerberg’s can encourage us to assess the company’s efforts by how hard Facebook is trying. Because it is written by the founder, Zuckerberg’s note has the feel of a quarterly self-evaluation. You read all 3,300 words and think, gosh, he’s working hard on the problem! Which, of course, he is.
But I think this is the wrong way to think about things.
Earlier this week I wrote about the limits of CEO interviews, which center the feelings of the founder rather than the consequences of their actions. Then the Guardian’s Julia Carrie Wong came along and put it better than I did: “It’s time for tech journalism to move away from the idea that we can understand this industry by understanding the great men who built it,” she tweeted. “What does it matter that we understand Zuck when Zuck himself so clearly doesn’t even understand Facebook?”
The indispensable Matt Levine picks up on Wong’s tweet in his newsletter today and extends the argument:
No one at Facebook sat down to build an election interference function. They sat down to build a system for purposes that they thought were good, and are happy to brag to you about: sharing baby pictures, connecting the world, making piles of money by showing you ads, that sort of thing. All — most, anyway — of the bad effects of Facebook are emergent features of the system that they built for the good effects; that system itself, and its messy interactions with billions of people out in the real world, creates the bad effects.
I don’t mean to claim that Zuckerberg, or anyone else at Facebook, is or is not responsible in some moral or legal sense for the bad effects of Facebook, or that those effects could or could not or should or should not have been predicted, or that they can or can’t be fixed, or whatever. I just mean to endorse Wong’s claim that if you want to understand Facebook, the main thing you have to understand is Facebook, the product and architecture and algorithms and effects and interactions, the system of it. Understanding the people who built it is not a substitute for that, because the system has moved beyond their conscious control. Facebook does things in the world that are not directly willed by the people who built it; to understand and predict those things, you don’t interview its founder, you examine its workings.
This is why I reject Zuckerberg’s idea that the fight against bad actors on Facebook is an “arms race.” The military metaphor is helpful to Facebook in part because it’s so easy to visualize. The Kremlin builds one missile; Facebook builds a bigger one. This metaphor suggests that the sides are of equal power: the good guys and the bad guys are fighting neck and neck, with the lead swinging back and forth depending on the day. Facebook uses the “arms race” language, in other words, because it flatters Facebook.
But the other view, Levine’s emergent-systems view, doesn’t allow for such a rosy assessment. Building a bigger arsenal, whether of artificial-intelligence tools or advertiser requirements, won’t necessarily meet the challenge ahead. This isn’t conventional warfare; it’s guerrilla warfare. It’s not the Cold War, where “arms race” first entered our vocabulary; it’s the Vietnam War.
And I probably don’t have to tell you how the imperial power fared in that one.

Senior Google Scientist Resigns Over “Forfeiture of Our Values” in China
U.S. lawmakers ask Google if it will rejoin Chinese market
Facebook ramps up effort to combat fake images, video
Everything you need to know about Europe’s new copyright directive
Trump’s Tweets Pivot, Loudly, to Video
Is it time for a ‘slow food’ movement for the internet?
Where in the World Is Larry Page?
Fans Are Spoofing Spotify With "Fake Plays," And That's A Problem For Music Charts
From “uncool uncle” to “fun” “best friend”: Why people are turning from Facebook to…other Facebook-owned things for news
Snap opens up crowdsourced ‘Our Story’ content to news organizations like CNN and NBC News - The Verge
New tools for parents and content for older kids in the YouTube Kids app
Facebook’s new ‘SapFix’ AI automatically debugs your code – TechCrunch
The Real Google Censorship Scandal
America’s always-on partisan goggles hurt meaningful evaluation of fact-checking on Facebook
And finally ...
The Colin Kaepernick Nike Protests Have Made Idiots Of Us All
Talk to me
Send me tips, comments, questions, corrections, and what you are doing to protect the midterm elections: