Given the phenomenon it describes, it’s perhaps appropriate that the concept of “filter bubbles” has turned out to be so polarizing. To believers, it’s self-evident that social feeds mostly show people news that confirms users’ prior beliefs, encouraging partisanship and tribalism. To skeptics, the phenomenon describes behavior that has little to do with tech and algorithms — and, they say, there’s evidence that platforms like Facebook and Twitter introduce people to a broader set of views than they might otherwise encounter.
To internet activist Eli Pariser, who coined the term and wrote a book on the subject, questions about how tech platforms are reshaping public life remain as relevant as ever. In a new TED talk, Pariser says social platforms should be rebuilt to serve the greater good, drawing on principles from urban planning. (Civic Signals, a new organization he co-founded with University of Texas at Austin professor Talia Stroud, aims to build new models that would do just that.)
With these ideas all very much in the news, The Interface’s Zoe Schiffer caught up with Pariser to talk about his new project, whether filter bubbles are real, and why banning political ads could have unintended consequences.
The interview has been lightly edited for clarity and length.
Zoe Schiffer: Your new project, Civic Signals, is based on the idea that there’s a lot to learn by thinking about social media platforms as physical spaces. Can you talk about that a little bit?
Eli Pariser: Our starting point was trying to think about what we want platforms to be like — not just what we want them to stop doing. We realized one of the problems in how people think about platforms in digital space is that they think about them as places where rational people exchange information. When we think about them as physical spaces, it brings alive how human beings actually relate to one another. When you think about how people relate in a physical space, you think about nonverbal cues and signals and different places to relate in different ways, which are part of what’s missing in how people conceptualize digital public squares.
How does the design of a space shape people’s behavior, either offline or online?
You can watch the same group of people walk into a library and they quiet their voices and their posture changes, and then watch them walk into a bar and see how their behavior shifts. Creating expectations for how people ought to behave is important, as is putting constraints on how we relate to each other.
William Whyte has this great extended rant about park benches. He hated them, because when you’re sitting with someone you’re always either too close or too far away. He compares that to those little metal chairs that some cities now have in public spaces. When people sit in those, they typically shift them a couple inches. He took that as this statement of ownership and dignity, ‘I get to adjust this space to myself and with it the nature of the relationship to whoever I’m talking to.’ I think that’s another piece that a lot of digital platforms lack by being — by necessity — very one-size-fits-all.
How does that translate to the digital world?
Well, spaces not only shape how we act individually, but they shape how groups of people interact. There’s a certain type of conversation you can have in a small cozy room that’s not as possible in a large crowded environment.
I think it’s really important because there’s this naive view of freedom, that freedom means having the most options at your disposal. But what we know is, it’s impossible to choose between a million different options. To actually make choices and have agency, it’s important to have some structure and be able to see what the options are, and that’s not possible when it’s a free-for-all.
Part of what I’m trying to argue for isn’t one structure that serves everyone. There are lots of different types of buildings and rooms that serve different purposes. But these vast open expanses have limited value. People react to them in antisocial ways because of the noise and the sense of overwhelm they feel.
There’s been a lot of research on filter bubbles — the term you coined almost 10 years ago — and how algorithms impact us. I know some people have questioned whether they really have as big an effect as you say. Has it changed your thinking on that early work at all?
We know that there is an effect. What we are learning is some people are super bubbled, and some people are not. It’s not a political statement, either — it’s across the spectrum.
Think about people who have fewer friends on Facebook: they tend to be older, which perhaps correlates with being more conservative and following pages of conservative media outlets. They might be getting a more lightly algorithmically filtered feed, because there’s just less stuff to filter than for people who have thousands of friends. But on the other hand, a lot of what they’re seeing are page posts from outlets that reinforce what they believe. Now think about that, versus someone who is a news junkie but has a much more algorithmically filtered feed. The effects vary by where you stand in the system, but they’re almost impossible to assess because people can’t research Facebook. We need to be able to research the biggest and most powerful platform in human history.
This work has led to new legislation — The Filter Bubble Transparency Act — that’s aimed at forcing big tech companies to disclose how their algorithms work (although it’s questionable whether it would actually do that). Do you think that’s going to be effective?
I read [Adi Robertson’s] piece on The Verge, but I support what the senators are doing — in the sense that any effort to get Americans to think about and understand the basics of how algorithms work is really important, at least as a first step.
All of us that are online all the time can forget that most people haven’t gotten their heads around the basic mechanics of these platforms. The law is necessary but not sufficient. As a public education effort, I think it’s a good step.
One of the assumptions I had when I wrote The Filter Bubble was that some of the problems we’re seeing in civic discourse in society are really issues of exposure. But I’ve come to believe that’s not true. As a liberal, when I read Fox News, it confirms my bad opinions of Fox News. The research shows that it’s not just about whether we come into contact, it’s about how we come into contact. It makes all the difference. It’s all about the design.
So how do we build healthier spaces — online and in the real world?
One way that people think about building healthy places is the built environment: what exists where, and what the design of the space is. But then they also think about what people are doing in those spaces: the programming, and who’s leading and taking responsibility for what takes place.
When I think about how platforms are structured, there’s a lot of focus on code and design and what’s physically possible. In the real world, there’s a difference between the law and physics. You can throw a brick through a window, even though it’s illegal to do so. But in the digital space, those things converge. If I say you can’t throw a brick through the window, you actually can’t. But there’s a lot less focus on soft social infrastructure in digital spaces, and that’s really important. The questions of what people are doing here, who is leading, and what behavior is invited really matter.
I do think Reddit has approached this in a more thoughtful way than many platforms. Subreddits have clear sets of rules and moderation. That makes for some better conversational spaces than a similar-sized Facebook group or Twitter community. It’s not surprising that coders want to code, and don’t want to think about human social organizing. But it’s a really important part of how we move forward from where we are now.
What’s the business model for a healthier digital space?
I think we need both private platforms that are more public friendly, but also platforms that are publicly owned where people feel like they have real ownership. Because people behave really differently when they own something. They take better care of it. Right now, nobody feels like they’re responsible for picking up the trash, so there’s a lot of trash around.
My hope is really just to start a conversation about what our aspirations are for our digital life and how to build spaces that more accurately embody them, and then also inspire people who are building things to build those things a little different. We’ll see if that happens, but it would be exciting if it does.
If you were going to extend the cities metaphor a little further, how does it relate to the debate that’s been raging on Facebook and Twitter about letting politicians lie in political ads on these platforms?
So if we think about ads like an amplifier on a stage in a public space that you can plug your mic into, then you would want to think about who has access to that mic. If everyone can bring their own amp and turn it up as high as they want, it drowns out the ability to have a thoughtful conversation. Part of that points me toward a personal view: I worry about some of the consequences of turning off political ads entirely. The ability to reach the public is especially important if you don’t already have channels to do that. But that stage is not being managed well. So I’m sympathetic to the notion that if people are running onstage and yelling profanities, we need to deal with that problem before we open it back up again.