The data, obtained through The Markup’s Citizen Browser project in partnership with Germany’s Süddeutsche Zeitung, came from a diverse panel of 473 German Facebook users who agreed to share their Facebook feeds with us.
Facebook objected that our sample was too small to be accurate. “Given its very limited number of participants, data from The Markup’s ‘Citizen Browser’ is simply not an accurate reflection of the content people see on Facebook,” Facebook spokesperson Basak Tezcan said in an emailed statement.
But German political researchers said our findings do reflect just how well the AfD has mastered the type of sensational content that performs well in Facebook’s algorithmic feeds. Consider an AfD post displayed in our dataset that complained about “climate hysteria” and led to more than 5,000 “angry” reactions on Facebook. “They [AfD] trigger anger, fear—I would say anarchic or basic emotions,” Isabelle Borucki, an interim professor at the University of Siegen who studies German political parties, told The Markup.
Meanwhile, Facebook also took steps that effectively blocked research like our Citizen Browser panel. The Markup’s Corin Faife reported this week that Facebook has rolled out changes to its website code that foil automated data collection
by jumbling some code in accessibility features used by visually impaired users.
Facebook responded that it did not make the code changes to thwart researchers and that it was looking into whether the changes affect visually impaired users.
Laura Edelson of the NYU project said it’s “unfortunate that Facebook is continuing to fight with researchers rather than work with them.” And U.S. Senator Ron Wyden told The Markup, “It is contemptible that Facebook would misuse accessibility features for users with disabilities just to foil legitimate research and journalism.”
And one way that it stays so profitable is by under-investing in harm mitigation. At the end of 2020, Facebook said it had 2.8 billion monthly active users and a global workforce of 58,604. That means that even if each Facebook worker were engaged in policing harm (which they are not, since many are focused on building technology or other things), each worker would be responsible for monitoring 47,778 Facebook user accounts, not to mention all of the comments and groups and marketplace postings associated with those users.
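For readers who want to check the back-of-envelope math, here is a minimal sketch using only the two figures Facebook itself reported for the end of 2020; the resulting ratio is an upper bound on per-worker coverage, since most employees do not work on content moderation at all.

```python
# Back-of-envelope check of the moderation ratio cited above,
# using Facebook's own end-of-2020 figures from the text.
monthly_active_users = 2_800_000_000  # reported monthly active users
employees = 58_604                    # reported global workforce

# Even if every single employee policed harm full time (they don't),
# this is how many accounts each one would have to monitor.
users_per_employee = monthly_active_users / employees
print(round(users_per_employee))  # roughly 47,778 accounts per worker
```

The real ratio of users to people actually doing harm mitigation is far higher, which is the article’s point.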
In other words, the mathematics of running a wildly profitable social network loved by Wall Street is not particularly conducive to running a police force that keeps activities on the platform in check.
And so, until that calculation changes, journalists like us at The Markup and The Wall Street Journal will keep doing our best to point out the public harms of this under-policed global speech platform.
As always, thanks for reading.