Angwin: Sometimes it feels like journalists like myself and my colleagues have just become content moderators for the big tech platforms—alerting them to content that breaks their own rules. How do you view this outsourcing of content moderation?
Noble: Journalists and scholars are doing the moral and ethical work for these companies for zero compensation and a complete lack of acknowledgment. In fact, in some cases there is a total denial that the problems we are articulating and pointing to are real. But we know they are responding to our critiques. Google opening an AI ethics consulting division two weeks ago is evidence of its responding to its critics.
Angwin: And it’s not easy work. Search results, Facebook posts, all seem to just exist for a moment in time. It’s ephemeral. How did you start looking into this issue?
Noble: When I first looked at “Black girls” in Google search, it was late 2009. At that time, the top result was hotblackpussy.com, and I remember thinking, “This is wrong. This is terrible.” And then I kept an eye on it over the course of a couple years. By 2011 hotblackpussy.com had gone out of business and sugaryblackpussy.com had replaced it. That’s when I started getting systematic about how I was going to research it.
The challenge then was not only that the results were changing but I also needed to figure out how to have multiple ways of recognizing whether this was a steady and persistent representation or whether it was specific to my locale. At that time, the majority of my professors were telling me, “That’s not a thing.” They were saying that it was “user error”—which is of course what most people think when they come across something wrong. That’s also by design.
I knew that as my work started getting out there, the results would change. It’s a constantly moving landscape—the earth is shifting under our feet, but that doesn’t mean we are not on the earth.
That was a decade ago. At that time, people's ears couldn't hear and process what I was saying when I said these algorithms are discriminatory. People were adamant that search only represented what was out there. The primary response from professors was that there was no way this was happening at the level of code, because code is just math and math can't discriminate. Now, fast-forward 10 years, I find that we have a totally different ear—we understand that programming is a language and languages are subjective, and we are able to talk about these things with more ease.
Angwin: You have written powerfully about Google’s ability to shape our idea of the truth. But many people still think of Facebook when they think about misinformation.
Noble: People use social media for their news, but they use Google for their facts. It’s interesting to watch the public become more aware of social media manipulation, but many forget that as people are trying to test the veracity of things they find in social media, they are going to Google like it’s the objective truth fact-checker, which couldn’t be further from the truth.
Angwin: You have advocated for a public interest non-commercial search engine that could be administered by librarians. Assuming the funding was there for that, can you describe what you are envisioning?
Noble: Librarianship has a long and storied history of narrating, collecting, and curating history's winners: the colonizers, the imperial powers, the people invested in framing the world through their own eyes. Having said that, we understand that libraries hold the things they collect to a higher standard, if only because, practically speaking, a brick-and-mortar building can only hold so much, unlike the vastly expansive internet.
I’ve often tried to convince large libraries to think about what their role could be in curating the open web and differentiating knowledge from advertising from propaganda. The challenge is that Silicon Valley has given us the vocabulary for how to think about information—and the prevailing word is “content.” But “content” flattens the distinctions between propaganda, evidence-based research, and many other kinds of knowledge or information.
What we need are counterweights—resistance to that kind of instant-gratification model. I liken it to the slow food versus industrialized fast food models. There is something healthier, better for your community, better for your body and mind when you take a slower approach to learning, knowledge, and information gathering compared with mass-produced content that is cutting every kind of corner.
Angwin: Would breaking up Big Tech improve things?
Noble: Where my work is going now is, I’m really trying to write and research how to shift the paradigm around Big Tech. I’m using two other eras of history—the breakup of Big Cotton, which was predicated on the slave trade, and the era of Big Tobacco. I’m looking at Big Tech through a similar lens.
We have a narrative that every dimension of our economy is propped up by technology, so we could never roll it back. I think of myself as a tech abolitionist working in the tradition of previous generations of abolitionists. I am trying to do culture-making and narrative work that will make these connections, so people don’t have to feel totally dominated by the idea that there could be no other way.
Angwin: What does tech abolition mean?
Noble: I think that the tech sector owes trillions of dollars to publics all around the world for its extractive and harmful business practices. There would certainly have to be a strong element of not only breaking up these large monopolies but also transferring the wealth that has been extracted back to the public.
Noble: Seeing the duchess and Gloria Steinem talk about what’s happening in society and linking my work to their concerns was deeply touching because I respect the work of both of them. Scholars want nothing more than having our work reach broader publics and not just having our work stay sequestered in the academy.
In many ways I relate so much to Meghan Markle’s personal story of being a working-class woman of color from California. I also had a Black parent and a white parent. I also married a prince [laughs heartily] who adores and supports me in my work on gender and racial equity.
In the interview she said it hadn’t dawned on her that this level of misinformation about Black girls was happening, but then she realized that of course it was happening. That is what happens when people come across the feminist research—it validates our own lived experience. There’s really nothing more exciting than when we all link up our advocacy work—including how I feel about your work, Julia—to effect change. It’s an honor to be in the company of other women trying to do their part to make the world better.