
Introducing Split Screen

The latest findings from The Markup's Citizen Browser project
This Week
In May 2016, the U.S. presidential election was in full swing. Bernie Sanders was battling Hillary Clinton for the Democratic nomination, and the Republican candidates were dropping like flies; by the end of the month, Donald Trump had secured enough delegates to lock up his party's nomination. Political pundits declared that America was as divided as it had ever been.
Those were the days. 
That month, I published an interactive graphic in The Wall Street Journal called “Blue Feed, Red Feed” that aimed to illustrate the depths of the political divide as it appeared on Facebook.
We all knew that everyone’s Facebook feed was different—but I wanted to let readers actually have the emotional experience of seeing what a person from the “other side” was reading—to participate in a kind of ideological tourism.
"Blue Feed, Red Feed," WSJ.com
"Blue Feed, Red Feed," WSJ.com
Using replication data from a study released by Facebook's own researchers, I was able to build a simulation of two partisan feeds. Focusing on the news sources most shared by the study's most partisan users, we showed a keyword-filtered view of what a feed populated by those outlets would look like. But it was only a simulation.
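For the technically curious, that approach can be sketched in a few lines of code. Here is a minimal illustration, assuming invented post data, source lists, and a hypothetical simulated_feed function; it is not the Journal's actual code.

```python
# Hypothetical sketch of the simulation behind "Blue Feed, Red Feed":
# take posts shared by very partisan sources, then filter to one topic.
# The domains and posts below are invented for illustration.

RED_SOURCES = {"example-right-outlet.com"}
BLUE_SOURCES = {"example-left-outlet.com"}

posts = [
    {"domain": "example-right-outlet.com", "text": "New poll on immigration..."},
    {"domain": "example-left-outlet.com", "text": "Immigration bill stalls..."},
    {"domain": "example-left-outlet.com", "text": "Local sports roundup..."},
]

def simulated_feed(posts, sources, keyword):
    """Return posts from the given partisan sources that mention the keyword."""
    return [
        p for p in posts
        if p["domain"] in sources and keyword.lower() in p["text"].lower()
    ]

red_feed = simulated_feed(posts, RED_SOURCES, "immigration")
blue_feed = simulated_feed(posts, BLUE_SOURCES, "immigration")
```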
Fast-forward to the present day. A lot has changed. 
After the Cambridge Analytica scandal, Facebook locked down many of the developer tools that I had used to build things like “Blue Feed, Red Feed.” Facebook does grant researchers and journalists access to study its internal data, but that usually involves nondisclosure agreements, and it hasn't always gone smoothly.
And now here I am working at The Markup, and we have Citizen Browser, a unique data collection project that lets us, for the first time, actually see what the feeds of people from different demographic groups and political affiliations look like.
As our editor-in-chief, Julia Angwin, likes to say, “Do every story again.” So the Citizen Browser team set out to update “Blue Feed, Red Feed” using real instead of simulated data.
The resulting tool, Split Screen, shows news articles, hashtags, and recommended groups from more than 2,500 actual people's feeds. (Read the methodology here.)
Split Screen provides one-of-a-kind insight into the so-called “filter bubbles” or “echo chambers” that social media users can find themselves in, where all the news they see is tailored to reinforce their existing beliefs.
These divided worlds constitute one of the most perplexing phenomena of social media today. Researchers (including Facebook's own) have looked for evidence that this phenomenon is caused directly by the platforms' algorithms, but the evidence is elusive and sometimes contradictory. Most studies identify user behavior and biases as the main drivers of such polarization, especially among users at the fringes of the ideological spectrum. Whatever the cause, the effect carries huge ramifications for civil discourse and even democracy itself.
Americans today are increasingly divided, from our politics to the news sources we trust to the communities we seek to join online. Research has shown this divide is growing over time and leading to something more like tribalism. Social media platforms are increasingly the main source of people's news and the place where much of this division occurs.
On Facebook, the “news feed” isn't the only place where these divisions are growing. In 2019, Facebook launched a splashy advertising campaign for its Groups, including star-studded TV spots during the Super Bowl and the Grammy Awards that showed how Groups can help real people connect over shared interests. But those groups, which corral users into concentrated subcommunities, have been shown to radicalize users and amplify calls for violence, according to the company's own internal research.
Facebook’s algorithms are secretive and their decisions largely inscrutable. From the outside, the only way to start to understand the decisions these algorithms make is to observe the output: real people’s Facebook feeds. We think this tool will offer an interesting lens to examine the data we have been collecting as part of our Citizen Browser project. 
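As a rough illustration of what observing that output can look like, here is a minimal sketch that compares which news domains two groups of panelists saw. The data, field names, and domain_shares helper are invented for illustration; this is not Citizen Browser's actual pipeline or schema.

```python
from collections import Counter

# Invented example: domains observed in panelists' feeds, grouped by
# self-reported party affiliation.
feeds = {
    "democrat": ["nytimes.com", "cnn.com", "nytimes.com"],
    "republican": ["foxnews.com", "nytimes.com", "breitbart.com"],
}

def domain_shares(domains):
    """Fraction of observed feed items pointing to each domain."""
    counts = Counter(domains)
    total = sum(counts.values())
    return {d: n / total for d, n in counts.items()}

dem = domain_shares(feeds["democrat"])
rep = domain_shares(feeds["republican"])

# Domains one group saw that the other never did: the raw material
# of a side-by-side comparison.
print(sorted(set(dem) - set(rep)))  # seen only by Democratic panelists
print(sorted(set(rep) - set(dem)))  # seen only by Republican panelists
```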
Facebook declined to comment for this story.
Check back often to see what is being shown to Citizen Browser panelists as the news unfolds across our divided feeds. 
—Jon Keegan, The Markup investigative data journalist
 
More from Citizen Browser
 
Split Screen: How Different Are Americans’ Facebook Feeds?
Official Information About COVID-19 Is Reaching Fewer Black People on Facebook
How We Built a Facebook Inspector
This email doesn't track you when you open it or click on any links. To learn more, read our Privacy Policy.