This week, I’m excited to share with you our latest ambitious project—a democracy diagnostic tool focused on Facebook and YouTube. Before I dive into the details, however, I want to share a little of the philosophy behind why we developed it.
Social media platforms are the broadcasting networks of the 21st century. Like traditional broadcasters, social media platforms choose—through their algorithms—whether to amplify or suppress the propaganda and disinformation increasingly being pumped through their systems.
Certainly the journalists who curated the news back in the analog era—the TV executives and front-page editors—were not infallible. Far from it. They were primarily White men who had a narrow view of what constituted news. The #MeToo and Black Lives Matter movements were born of a moment when White men no longer exclusively define what is news.
And most important, no one can even see what choices the platforms’ algorithms are making across a community, because no two people see exactly the same content. Users see only what gets amplified in their individual algorithmically curated news feeds.
I believe that the public deserves to know what content the black box algorithms choose to promote. That is why we have been working for months, and are excited to finally announce publicly, The Citizen Browser Project—an initiative designed to audit the algorithms that social media platforms use to distribute news and narratives to different communities.
At the center of our project is a nationally representative panel of users, picked with the help of a survey research firm, whom we will pay to install a custom web browser. The browser allows us to monitor what content is being promoted to them in their social media feeds.
We have built this custom web browser with privacy protections. It will not collect user logins and passwords, and it will automatically strip out personally identifiable information before we analyze the data—meaning we’ll never see those identifiers.
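To make the “strip before analyze” step concrete, here is a minimal sketch of what automated redaction of personally identifiable information can look like. This is purely illustrative: the project’s actual pipeline is not described here, and the pattern choices and the helper name `strip_pii` are my own assumptions, not the real implementation.

```python
import re

# Hypothetical illustration only -- the real Citizen Browser pipeline
# is not public. Replace likely PII (email addresses, U.S.-style phone
# numbers) with placeholder tokens before any analysis happens.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\b(?:\+?1[-.\s]?)?\(?\d{3}\)?[-.\s]?\d{3}[-.\s]?\d{4}\b")

def strip_pii(text: str) -> str:
    """Return text with common identifiers replaced by placeholders."""
    text = EMAIL_RE.sub("[EMAIL]", text)
    text = PHONE_RE.sub("[PHONE]", text)
    return text

print(strip_pii("Contact jane.doe@example.com or 555-123-4567"))
# → Contact [EMAIL] or [PHONE]
```

In a real system this pass would run on the collection side, so that identifiers never reach the analysts at all—which is the guarantee described above.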
This is a unique way to collect data about social media companies that allows us to ask questions that cannot otherwise be answered: What political stories are being pushed to Black voters in swing states? What kind of health information is being targeted at senior women? What kind of groups are being recommended for younger voters in Texas?
We will be working with reporters at The New York Times to analyze the data and track propaganda and disinformation.
We hope that this ambitious (and hugely expensive, I might add!) project will be a key tool in allowing the public to audit the algorithms that dominate our civic discourse. We are grateful to the Trusted Elections Fund for seed funding for our pilot project. If you think we should know more about how these platforms work, we would welcome your support.
Thank you, as always, for reading.