
How Privacy and Disinformation Are Related

Dispatches from Editor-in-Chief Julia Angwin
This Week
Hello, friends,
Thank you so much to those of you who filled out last week’s survey. (And for those of you who didn’t, I would be hugely grateful if you took a few minutes to fill out our multiple choice form.)
It was a real pleasure to learn about you all by hearing from you directly, rather than tracking when you open your email or where you open it from, which is the current standard for how other email newsletters assess their readership.
At The Markup, we do not track readers. And this week, our anti-surveillance approach got a huge boost when Apple said it would enable a new feature—Mail Privacy Protection—that prevents email senders from tracking readers through invisible pixels. As Casey Newton wrote in his newsletter, Platformer, “Mail Privacy Protection is likely to spur publishers to find other ways to understand their audiences.”
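For readers curious about the mechanics: an invisible tracking pixel is just a tiny image whose URL encodes the recipient, so when the email client fetches the image, the sender's server logs who opened the message and when. Here is a minimal sketch of how a sender might generate such a pixel (all names and URLs are hypothetical, for illustration only):

```python
import uuid

def make_pixel_tag(recipient_email, base_url="https://tracker.example.com/open"):
    # Derive a stable, unique token for each recipient.
    token = uuid.uuid5(uuid.NAMESPACE_URL, recipient_email)
    # A 1x1 transparent image: invisible to the reader, but fetching it
    # tells the server which token (i.e., which reader) opened the email,
    # along with the reader's IP address and mail client.
    return f'<img src="{base_url}?id={token}" width="1" height="1" alt="" />'

print(make_pixel_tag("reader@example.com"))
```

Apple's Mail Privacy Protection blunts this technique by pre-fetching images through proxy servers, so the sender can no longer tell when, where, or whether a particular reader opened the message.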
For us, those “new ways” are pretty old-fashioned: connecting directly with our readers through surveys, focus groups, and direct conversation. I’m still digesting all the great information that you provided in the survey, but one thing that I took away loud and clear was that many of you are very interested in the topic of privacy.
Luckily, I happen to have recently had a great conversation with Vivian Schiller, executive director of Aspen Digital at the Aspen Institute, about how privacy and disinformation are related—and I thought it would be interesting to share it with all of you.
Schiller is a longtime media and technology executive who has served as president and CEO of NPR and global chair of news at Twitter, among many other positions. At the Aspen Institute, she leads the Commission on Information Disorder, which is studying how to combat mis- and disinformation.
Here is our conversation, edited for brevity and accuracy. (You can listen to the full podcast on Spotify.)
Dana Amihere and Getty Images
Schiller: Privacy is a really complicated topic, especially when it intersects with technology. You’ve been reporting on this for probably longer than you would care to admit, so can you just start with a 50,000-foot overview of the landscape of digital privacy and how it’s evolved over the last 20 years?
Angwin: Privacy is such a confusing word because it feels like we’re talking about, “I just want to go into my room and close my door and be alone.” But what most of us are talking about when we’re talking about privacy is the increasing use of technology to surveil parts of our lives that were previously not witnessable.
Twenty years ago, when you walked down the street or browsed the web, you really had no expectation that anyone else was going to see what was happening. But technological advances have meant that there are cameras and other technologies that collect data about human behavior at a scale that was previously impossible. 
And then on top of that, you have two different trends: governments using that data to control and surveil their citizens, and a whole new commercial data exploitation market that buys and sells personal data.
So suddenly we’re in a world where our expectations of privacy and anonymity in public spaces have been heavily eroded. 
Schiller: You often hear people saying, “Oh, somebody is listening to my conversation when I have my iPhone nearby because I saw an ad about shrimp after I was talking about making shrimp for dinner tonight.” 
But that is probably not the case, and it’s really more complicated than that. It’s about our behaviors and our movements around the web and the kinds of things that are being collected about us.
Angwin: Right. One thing that is important to know is that our human behavior is more predictable than we think. So they may be able to tell from your grocery orders that an ad for shrimp is going to be appropriate for you. Perhaps you actually eat shrimp every Tuesday. So sometimes these models are way smarter than we think.
And then sometimes they are of course way dumber. Everyone has probably had the experience of having the item that you just bought follow you around in an ad for two weeks. And you’re like, “Dude, I already bought you. Go away.”  
There’s a whole economy around predicting human behavior. We are the subject of thousands of predictions all the time. 
Schiller: So let me, let me put up a straw man that you’ve heard before, and I’ll let you knock it down. Why do I care? I haven’t done anything wrong. I’m not a criminal. I’m not a drug dealer. I have nothing to hide. 
Angwin: The truth is that the answer is twofold. One is that you have done something wrong. It's illegal, for instance, in Maine, to possess lobsters that are mutilated. There are actually a lot of laws that you're not aware of.
One of the things that we rely on in society is that although there are a lot of laws, they’re not all enforced. Many rules are discretionarily enforced. And as we know from our policing environment, those laws are often enforced against the most vulnerable communities. People of color and poor people are often the ones who bear the brunt of prosecutorial discretion. 
And so when we say, “I haven’t done anything wrong,” what we really mean is, “I’m in a group of people who are privileged enough to not be subject to prosecutorial discretion.”
The other thing is that this type of data has been used historically in really quite terrible ways. After 9/11, the FBI went to the Census Bureau and illegally obtained lists of where every Muslim in the U.S. lived and then heavily surveilled them for a decade, causing a lot of trauma and false accusations in that community. 
Data about groups of people have always been abused by powerful people. And so it’s incumbent upon us as citizens to constantly rein in those authorities and prevent them from abusing their powers.
Schiller: Now, can you help me connect the dots between this data collection and privacy concerns and mis- and disinformation? What do the two issues have to do with each other?
Angwin: Yeah, it’s such a good question, because they don’t seem immediately related.
I’m going to come at it from two different ways. The first way is to recall that when I wrote my book, Dragnet Nation, I tried every way to get out of what I called dragnet surveillance. In other words, I wasn’t trying to evade the FBI, I was just trying to get out of indiscriminate tracking that is everywhere.
I did all sorts of things like getting a burner phone and fake identities and different types of accounts. And what I found was that I was really engaging in quite a bit of disinformation. I learned some of these techniques from my teenage daughter, who, like every teenager, has multiple Instagram accounts. They are called finstas, as in fake Instagram accounts. Teenagers have different ones for different groups of friends and different personas in each one.
This is a coping technique for a world of relentless surveillance. So it’s worth noting that the world of surveillance creates a need and a requirement for everyone to engage in a bit of disinformation.
Then you basically have to look at the fact that digital technology has allowed the creation of propaganda on a mass scale. In the past, governments were the only ones who could afford to do really aggressive propaganda. The U.S. used to fly over other countries and drop leaflets promoting democracy from an airplane—and little pieces of paper would rain down.
Now, of course, anyone can make propaganda, and so it’s become a bit of an industry. The industry needs to know who to distribute propaganda to—and that’s where the data exploitation market comes in.
You can find vulnerable people because you can buy lists of them. When people used to send junk mail, these lists were called "sucker lists." These were the people who were going to fall for your scam. Now, of course, you can buy any kind of sucker list on Facebook or Instagram.
So it weaponizes disinformation and lies, because they can be sent directly to the people who are most vulnerable to them.
Last year we wrote a story at The Markup revealing that Facebook had an ad targeting category for people "interested in pseudoscience." It was literally a sucker list.
Schiller: Explain how that list comes to be. It's extracted from their behavior, right?
Angwin: This is the problem. We don’t really know how Facebook decided that there were 78 million people interested in pseudoscience. Maybe they clicked on something. After all, Facebook not only tracks what you do on Facebook, but they also track what you do across the web and on your phone.
Honestly, if they’re wrong, it doesn’t really matter. They have an interest in over-categorizing people in the hopes of giving advertisers what they want. 
Schiller: O.K. So what do we do about it?
Angwin: Most of the platforms offer you what I call an illusion of control. You go into some sort of privacy menu, and you can turn all these knobs and dials. Studies have shown that the more knobs and dials there are, the more you feel you're in control and the more willing you are to accept privacy violations.
So they offer you these very confusing settings, and you move them around and you think, “Oh, I’ve really solved this.” The truth is, not really, you haven’t. 
Schiller: There seems to be growing awareness and under the Biden administration, a greater interest and willingness to take aggressive action. What remedies are on the table that you find promising? 
Angwin: One thing that I think is really important to think about is whether the benefits of microtargeting really outweigh the risks. The idea of categorizing all human behavior and then allowing commercial interests to use those to target individuals: Are the risks to society higher than the benefits to advertisers?
Consider that the Russians could come in—which literally happened in 2016—and buy ads to target vulnerable people, to convince them not to vote or to vote for Trump. Is that risk worth it just to have the right sneakers follow you around the internet?
As always, thanks for reading.
Julia Angwin
The Markup
From The Markup
Can Schools Police What Students Say on Social Media?
Dark Patterns that Mislead Consumers Are All Over the Internet
Uber and Lyft Experiment with Labor Practices Amid Driver Shortage
P.S. To receive the latest from our Citizen Browser project, sign up here. And so you can keep up on all the news from The Markup, sign up here, and we’ll email you every time we publish about the ways powerful actors are using technology to change society, usually two to three times a week.
This email doesn't track you when you open it or click on any links. To learn more read our Privacy Policy.
If you don't want these updates anymore, please unsubscribe here.
If you were forwarded this newsletter and you like it, you can subscribe here.
The Markup, P.O. Box 1103, N.Y., N.Y. 10159