More Democrats consider breaking up Facebook

May 13 · Issue #328
The Interface
I.
On Thursday, Facebook co-founder Chris Hughes said the company should be broken up, and required to spin off WhatsApp and Instagram. Increasingly, Democratic presidential candidates agree with him. Sen. Elizabeth Warren had already issued her call to break up big tech companies; yesterday, Sen. Kamala Harris said “we have to seriously take a look” at breaking up Facebook. Today, Joe Biden said a Big Tech breakup is “something we should take a really hard look at.”
And Facebook, for its part, finally responded at length. First, Nick Clegg, the company’s head of communications and policy, had an op-ed in the New York Times. He argued that the company has plenty of meaningful competition, and that breaking it up would only worsen problems around unchecked free speech and data protection.
Big in itself isn’t bad. Success should not be penalized. Our success has given billions of people around the globe access to new ways of communicating with one another. Earning money from ads means we can provide those tools to people for free. Facebook shouldn’t be broken up — but it does need to be held to account. Anyone worried about the challenges we face in an online world should look at getting the rules of the internet right, not dismantling successful American companies.
Mark Zuckerberg struck a similar note in an interview with reporters in France, where he had traveled to meet with President Emmanuel Macron. “When I read what he wrote, my main reaction was that what he’s proposing that we do isn’t going to do anything to help solve those issues. So I think that if what you care about is democracy and elections, then you want a company like us to be able to invest billions of dollars per year like we are in building up really advanced tools to fight election interference.”
In other words, only a company of Facebook’s size can afford to address the problems caused by a platform of Facebook’s size.
A counter-argument might be that the negative externalities of platforms scale along with their size. A web forum where a few dozen people meet to espouse white supremacist ideals might offend us, but if the mechanics of the platform do not allow it to recruit others, it’s arguably just free speech. It’s hard to see how the government could justify shutting it down, although the forum’s web hosting provider would retain the right to do so.
On the other hand, a platform with billions of people that steers people into fringe conspiracy groups via its recommendation algorithms arguably poses a different set of problems to the world. The platform’s viral sharing mechanics, combined with its scale, makes it complicit in the spread of misinformation, terrorism and hate speech.
I understand that Facebook has more money to spend on this problem than the average web forum. But I don’t understand how Facebook can credibly divorce the scale of its user base from the scale of its consequences.
A fair question to ask is what we might expect if Facebook no longer funded platform integrity efforts on WhatsApp and Instagram (and no longer had revenues from those platforms to fund its own work). If the separated platforms had 1.5 billion monthly users apiece, would they be able to protect them effectively from bad actors?
Facebook says no, but then Facebook has historically been bad at predicting how people will abuse the service. To me it seems just as possible that the separate companies will compete on platform integrity, generating useful new solutions for the others to shamelessly copy. Facebook has been slow and reactive when it comes to security and data protection efforts; it seems possible another company would act more nimbly.
In any case, Facebook has another approach here that it prefers: France’s. After six months in which French regulators worked inside the company “monitoring its policies,” which I would like someone to please make a multi-camera sitcom about, the government released a 33-page report with recommendations for how regulation should work. The report “recommends that French authorities should have more access to Facebook’s algorithms and greater scope to audit the company’s internal policies against hate speech,” Mathieu Rosemain and Gwénaëlle Barzic report. And Zuckerberg likes it:
“If more countries can follow the lead of what your government has done here, that will likely end up being a more positive outcome for the world in my view than some of the alternatives,” Zuckerberg told reporters at Facebook’s Paris office after the meeting at the Elysee palace.
“We need new rules for the internet that will spell out the responsibilities of companies and those of governments,” he told France 2 television in an interview. “That is why we want to work with the team of President Macron. We need a public process.”
Will Facebook be broken up, or will it simply submit to some new monitoring regime? (The company is lately very happy to submit to monitoring regimes.) The latter seems more likely to me, but I’ve been somewhat taken aback at how quickly the former has become a litmus test for Democratic presidential candidates. If we’re all still talking about this when the primary debates start, I’d say the prospect of a breakup has become a bit more likely.
II.
Hughes’s op-ed kicked off a media tour that included, among other things, a campaign-ad-style YouTube video, an appearance on The Daily, and an interview with Kara Swisher. I was surprised at how like a politician Hughes sounded on YouTube, and how squirmy and uncomfortable he sounded on podcasts.
One thing that became clear to me on Thursday is that lots of current and former Facebook employees don’t think very much of Hughes. An important reason why can be found in his interview with Swisher:
Did you miss doing that, leaving Facebook?
You know, I had mixed feelings about it, but my experience was really different than Mark’s and Dustin’s. I mean, Facebook was a mission in and of itself for Mark, and for me it was a company that I enjoyed being a part of, growing. I learned a lot, it was exciting, there were all kinds of challenges, but it was clear to me early on that Facebook was not my life’s work.
Hughes doesn’t believe in Facebook’s mission today — but he didn’t really believe in it then, either. Externally, this doesn’t much matter — he’s still a co-founder of Facebook, and his opinion will be duly noted whenever we write about the case for breaking up the company. But internally, Hughes will continue to be dismissed as a historical footnote.
And speaking of co-founders: In my Thursday newsletter on the subject, I noted that another co-founder of Facebook, Dustin Moskovitz, had donated to Color of Change, which is trying to persuade Facebook shareholders to vote against Zuckerberg being re-nominated to the company’s board. To me the donation was telling, given that Moskovitz has said almost nothing about Facebook publicly in the past couple of years. But on Twitter, Moskovitz told me that the donation was intended only to help Democrats in the 2016 election, and that he should not be characterized as a Facebook critic.
When I asked what his argument against a breakup was, he said: “If the goal is to improve democracy we should break up Fox and Sinclair first.” He later deleted the tweet.

The Trauma Floor
Three months after I reported on the working conditions at Facebook’s content moderation facilities, and with multiple class-action lawsuits brewing, the company said today it would increase the minimum pay by 20 percent and take new steps to monitor and improve workers’ mental health:
In February, The Verge reported that Facebook contractors in Phoenix are suffering from long-term mental health issues after working as content moderators. Their jobs require them to view a steady stream of violent and disturbing content, and several moderators told us they continued to struggle with PTSD-like symptoms. Other moderators told us that the work had made them more likely to believe in the fringe conspiracy theories that they encountered each day at work.
In response, Facebook said it would now require its vendors to provide on-site counseling during all hours of operation, rather than only during the day shift. It will also begin surveying contractors about their mental health twice a year “and use the results to shape our programs and practices,” the company said.
This is great news.
Democracy
Supreme Court says Apple will have to face App Store monopoly lawsuit
Exclusive: India orders anti-trust probe of Google for alleged Android abuse - sources
Russia Is Targeting Europe’s Elections. So Are Far-Right Copycats.
Your 5G Phone Won’t Hurt You. But Russia Wants You to Think Otherwise.
YouTube Has Downgraded Carl Benjamin's Sargon Of Akkad Account After He Talked About Raping A British MP
Turkish watchdog says it fines Facebook $271,000 for data breach
Elsewhere
Fear-based social media Nextdoor, Citizen, Amazon’s Neighbors is getting more popular
Swatting Attacks Increase Security Concerns Across Silicon Valley
How Money Flows From Amazon to Racist Troll Haven 8chan
'Fake News Victims' Meet With Twitter and Facebook
Twitter bug disclosed some users’ location data to an unnamed partner
This doctor posted online in favor of immunization. Then vaccine opponents targeted her
Discord, Slack for gamers, tops 250 million registered users
Launches
Spotify is testing its own version of Stories called ‘Storyline’
Takes
Friend portability is the must-have Facebook regulation
Facebook Algorithms Make It Harder to Catch Extremists
Instagram Is Trying to Curb Bullying. First, It Needs to Define Bullying.
And finally ...
‘Old Town Road’: See How Memes and Controversy Took Lil Nas X to No. 1
Talk to me
Send me tips, comments, questions, and your opinion of French internet regulations: casey@theverge.com