Angwin: So let’s start with the 2020 U.S. presidential election. Two years ago you blew the whistle on how Cambridge Analytica harvested data from 50 million Facebook users and used it to boost the Trump campaign in the 2016 U.S. presidential election. How do you think it went this time around?
Wylie: This time around was different. In round one, people weren’t paying attention to the problem. When I first came out as a whistleblower, disinformation wasn’t a word people talked about. I think we spent the next four years trying to unpack what happened in 2016.
In 2020, although we are paying more attention to it and we have a more mainstream vocabulary for it, the fundamentals are the same. There was no national response to scaled disinformation. There were no regulatory or legal changes. So we have a lot of the same players involved, namely Facebook, which we are currently relying on to police our elections. I’m very uncomfortable with a private company taking on what I think should be a public role.
What we are also seeing is that disinformation does not have to be about politics. You see that with COVID. People are susceptible to misinformation, and that can cause real harm. We’re more aware of what is happening, but we haven’t had many substantive changes.
Angwin: This report contains 250 recommendations (!) on how to stop disinformation, so we won’t get to all of them. But I’d like to start with some of the recommendations about how to build better tech. Can you explain what you mean by a “digital building code”?
Wylie: This comes from my experience dealing with regulatory agencies, and also from talking to members of Congress and parliaments and noticing the difference in the language being used. When I would speak to members of Congress, they were using the language of Silicon Valley, describing everything as a service.
But if you are actually working in tech, you don’t think of yourself as building a service. You think of yourself as constructing things. And construction is often governed by things like the precautionary principle and risk mitigation. So if we think of these technologies in terms of architecture and engineering, the question becomes: Why are you allowed to release something without testing it for safety and assessing what the harm could be?
Currently, if I wanted to release a toaster, I would face a greater regulatory burden to prove it is safe than I would to release a digital platform.
It’s interesting that the arguments automakers made against air bags and other safety measures were similar to those Silicon Valley makes today: that consumers are opting in and that regulation would inhibit innovation.
But the difference with autos was that there was a countervailing force: insurance companies, which had to pay for mangled bodies. There is no one who has to pay for mangled elections. You pay as a citizen.
Angwin: Ah, yes, let’s talk about legal liability. It is another one of my favorite topics. Cybersecurity experts such as Bruce Schneier have suggested that liability could lead to safer software. Tell me about your plans for creating liability for software.
Wylie: There is a section in the report about introducing the concept of software malpractice: if you don’t live up to a minimum set of standards articulated in the law, whether for safety or quality or what have you, you could be liable for malpractice.
I like to think about it using the logic of construction or architecture. Imagine you had a building with an arsonist in it. If you are the architect, you are not liable for the actions of the arsonist. But you would be liable if you didn’t put enough fire exits in the building or used highly flammable paint.
What is interesting about looking at it through professional standards is that you can sidestep the debate about Section 230 and censorship. Instead, we can just say that if the way you designed a platform causes harm, and that design was unreasonable or unsafe, you should be liable, even if the harm originated with a user.

If you look at what happened in Myanmar, the fact that there were virtually no consequences for a platform used in crimes against humanity is outrageous. Imagine if we could give a voice to people around the world to help create minimum standards.