I. A dilemma
On Sunday night, after being encouraged by friends and family, I hit play on a new documentary about our digital lives. Directed by Jeff Orlowski, The Social Dilemma explores the effect of smartphones and social networks on human behavior. Blending talking-head interviews with some well-known Silicon Valley apostates and fictional, after-school-special-style dramatizations of what happens when Johnny and Janey scroll through feeds all day, the film presents itself as an urgent warning about our modern condition.
I'm more than a little sympathetic to these concerns. I started writing this newsletter in 2017 after coming to the belated realization that social networks really did have an outsized impact on modern life, and deserved to be taken just as seriously. My thinking has benefited tremendously from speaking over the years with some of the interview subjects in the film, including Tristan Harris, Renee DiResta, Tim Kendall, Jeff Seibert, and Justin Rosenstein. In particular, Harris' work on screen time triggered a powerful sea change in the industry, and DiResta's explorations of misinformation have been essential to helping social networks understand themselves.
And yet despite all that … the film is ridiculous? The dramatized segments include a fictional trio of sociopaths working inside an unnamed social network to design bespoke push notifications to distract their users. They show an anguished family struggling to get the children to put their phones away during dinner. And the ominous piano score that pervades every scene, rather than ratcheting up the tension, gives it all the feeling of camp. If someone asked me to reimagine this newsletter as a drag show, I would start where The Social Dilemma leaves off.
And as Adi Robertson points out at The Verge, the idea that algorithmic recommendation engines are at the heart of our troubles leaves out vast swathes of the internet that are arguably just as important as the big social networks, and perhaps in some cases even more so.
She writes:
Propaganda, bullying, and misinformation are actually far bigger and more complicated. The film briefly mentions, for instance, that Facebook-owned WhatsApp has spread misinformation that inspired grotesque lynchings in India. The film doesn't mention, however, that WhatsApp works almost nothing like Facebook. It's a highly private, encrypted messaging service with no algorithmic interference, and it's still fertile ground for false narratives. As Alexis Madrigal notes, condemning the platforms together comes "uncomfortably close to admitting that mobile communications pose fundamental challenges to societies across the world." There's a fair case for that, he argues, but a case with much more alarming implications.
Radicalization doesn't just happen on Facebook and YouTube either. Many of the deadliest far-right killers were apparently incubated on small forums: Christchurch mosque killer Brenton Tarrant on 8chan; Oregon mass shooter Chris Harper-Mercer on 4chan; Tree of Life Synagogue killer Robert Bowers on Gab; and Norwegian terrorist Anders Breivik on white supremacist sites including Stormfront, a 23-year-old hate site credited with inspiring scores of murders.
These sites aren't primarily driven by algorithms or profit motives. Instead, they twist and exploit the open internet's positive ability to connect like-minded people. When harmful content surfaces on them, it raises complex moderation questions for domain hosts and web infrastructure providers, a separate set of powerful companies that have completely different business models from Facebook.
This isn't to let social networks off the hook. Nor is it an effort to make the problem feel so complicated that everyone just throws their hands up and walks away from it. But I'm shocked at how appealing so many people find the idea that social networks are uniquely responsible for all of society's ills. (The Social Dilemma has been among the 10 most-watched programs on Netflix all week.)
This cartoon-supervillain view of the world strikes me as a kind of mirror image of the right-wing conspiracy theories that hold that a cabal of elites is manipulating every world event in secret. It is more than a little ironic that a film that warns incessantly about platforms using misinformation to stoke fear and outrage seems to exist only to stoke fear and outrage, while promoting a distorted view of how those platforms work along the way.
Some folks who worked on the film told me that this kind of approach is necessary to "communicate in a way that appeals to a broad audience." But I say that's a cop-out. If you're going to argue that social platforms are uniquely responsible for the fraying of society, you have to show your work.
II. A memo
On the other hand, meet Sophie Zhang. She was a data scientist who was fired in August and left this month in the fashion increasingly popular among departing Facebook employees, which is to say, quite dramatically.
"In the three years I've spent at Facebook, I've found multiple blatant attempts by foreign national governments to abuse our platform on vast scales to mislead their own citizenry, and caused international news on multiple occasions," wrote Zhang, who declined to talk to BuzzFeed News. Her LinkedIn profile said she "worked as the data scientist for the Facebook Site Integrity fake engagement team" and dealt with "bots influencing elections and the like."
"I have personally made decisions that affected national presidents without oversight, and taken action to enforce against so many prominent politicians globally that I've lost count," she wrote.
She added: "I know that I have blood on my hands by now."
Unlike the Social Dilemma filmmakers, Zhang showed her work, first to her bosses, and then, inadvertently, to the world. She describes governments in Azerbaijan and Honduras using Facebook against their own citizens, employing large numbers of fake accounts to promote their own interests and attack critics. And she found what she described as coordinated influence campaigns in countries including India, Ukraine, and Bolivia.
Zhang's official job was to police Facebook for "fake engagement": people buying inauthentic likes, comments, and shares. From this perch she continually wandered into an adjacent realm that Facebook calls "civic integrity," to the apparent frustration of her bosses. It's a higher-stakes realm that works on some of the most pressing issues a social platform will face, including foreign influence operations and election integrity. It's also famed for its difficulty; academics tell me that unearthing these operations and properly attributing them requires significant domain expertise. Many of the people who do this at Facebook and other networks previously worked for US intelligence agencies.
Zhang, by contrast, was a relatively junior employee who was essentially moonlighting on civic integrity issues. That may have been one reason why she struggled to get her colleagues' attention, I'm told. Everyone I've spoken to at Facebook over the past day says Zhang was bright and dedicated to her work. But navigating large organizations can be a challenge even for the most senior employees, and it seems like Facebook's sheer size often prevented Zhang's findings from getting prompt attention.
After BuzzFeed's story ran, some people who work on the company's integrity team (there are more than 200 of them) were frustrated at the implication that they are sitting on their hands all day, or otherwise bad at their jobs. (I don't think Zhang meant to imply this, but that was certainly the tenor of the discussion about BuzzFeed's story on Twitter.) Many of them had worked with Zhang on the takedown efforts she described, and felt undermined by her memo, I'm told. Sometimes team leaders set priorities differently than their own employees would, and Facebook's efforts, which focus on the largest and most active networks (particularly during elections), sometimes set aside other legitimate threats, like the ones Zhang had found.
Ultimately, that's the aspect of Zhang's memo that sticks. Facebook mostly doesn't deny that her findings were accurate and significant, or that responses to them were sometimes delayed. The company says only that the issues she found, however significant, were less pressing than the many other issues the civic integrity team was policing at the time, in other countries around the world.
Mastering the geopolitics of each country and rooting out every influence operation that pops up, all while policing hate speech and misinformation and promoting free speech and interpersonal connection, is a mind-bendingly enormous task. But it's also the task that Facebook, by virtue of its huge investment in growth and in fighting off competitors over the years, has signed up for.
I can't take seriously a film like The Social Dilemma, which seemingly wants to hold one company accountable for every change society has undergone since the company was founded. But when someone takes her employer to task for the things she found on its service, and leaves with a feeling of blood on her hands, that's something different.
Not every issue raised by an employee will get immediate attention. But Zhang's memo raises questions about Facebook's size, power, and accountability to its users, particularly its non-Western users, that outsiders have been poking at for years. Increasingly, as we have learned from a summer rife with Facebook leaks, those calls are coming from inside the house. And they deserve better answers than Sophie Zhang has gotten to date.