
The Precarious Life of Working for an Algorithm

Dispatches from our founder
This Week
Hello, friends,
Remember when we were all so worried about robots taking all our jobs? Well, they did take some of the jobs, but contrary to predictions, more in the managerial space. Algorithms have taken over the functions of a boss for everything from hiring and firing decisions to day-to-day management of workers’ routines.
The rise of algorithmic oversight has profoundly changed the workplace, which is why we launched a series called “Working for an Algorithm” earlier this year. In this series, reporter Dara Kerr is chronicling many aspects of the challenges workers face when their boss is a robot.
Her first story, in February, revealed how Postmates delivery workers were being scammed by phone callers who pretended to be Postmates employees. Since the workers had no relationship with anyone at the company beyond the instructions on their app, they had no way of knowing that the person they were speaking to was a fraud.
Several Postmates workers told Dara that their accounts were drained after they handed over their usernames and passwords to the scammers. And then they couldn’t find a human to talk to when they needed help. “It sucks to go in there blind like that, and there’s nobody to help you,” Shaleece Green told Dara.
After the article was published, Postmates reimbursed the workers whom Dara interviewed. In the course of her reporting, the company also put in place a two-factor authentication system that makes it harder for scammers to log in to couriers’ accounts and steal their pay.
This month, Dara turned her attention to another kind of algorithmic worker: TikTok creators, whose ability to make money depends on whether the social media platform’s inscrutable algorithm recommends their content. The TikTokers told Dara about their struggles to understand how the algorithm worked and how that affected their ability to gain followers.
“The growing industry around TikTok resembles the promise and callousness of early Hollywood—burgeoning creativity, swift fame, and little by way of worker protections—except that instead of studios creating stars, it’s a faceless, inscrutable machine,” Dara wrote. 
To understand the challenges faced by content creators whose livelihoods are in the hands of black box algorithms, I interviewed Brooke Erin Duffy, an associate professor at Cornell University, where she holds appointments in the Department of Communication and the program in Feminist, Gender & Sexuality Studies. She has written extensively on gender and cultural production. Her most recent book, “(Not) Getting Paid to Do What You Love: Gender, Social Media, and Aspirational Work,” was published in 2017 by Yale University Press. 
The interview, below, is edited for brevity.
Brooke Erin Duffy
Angwin: We’ve been doing this series, “Working for an Algorithm.” It is about the challenge of today’s world, where either people’s boss is actually an algorithm or, in the case of content creators, their livelihood is governed by an algorithm.
What is a good way to think about the world we live in with algorithms governing everything? And do you see parallels to earlier times?
Duffy: Absolutely. I think it’s critical to keep in mind the continuity with earlier mechanisms of worker surveillance and efforts to track workers’ processes and products. There are precursors to algorithms in earlier systems of capitalist production, and especially those in the creative industries, such as metrification. 
But what’s unique and different in the context of digital media is the automation of these systems. And I think “automation” needs to be in kind of scare quotes because I don’t want to suggest that humans don’t play a role. Humans are essential to the design and maintenance of these technical systems. 
We’re seeing algorithms govern all forms of work, from the media and creative industries to what we consider the “gig economy” and even to academia, where researchers are attuned to keywords that “do well” and the importance of search engine optimization (i.e., Google Scholar). 
The automation, the ubiquity, and the pervasiveness have exacerbated the systems of monitoring of workers. 
Angwin: Your recent work is about the difficulty of being an algorithmic worker, and you describe something called “algorithmic precarity” faced by content creators. What does that mean?
Duffy: Precarity is a long-standing feature of media and creative careers, in the sense that there’s a mentality that “you’re only as good as your last job”—or your last product, or whatever it is. The work has long been considered contingent, with unpredictable schedules and the gradual loss of benefits.
But, the precarity of work is exacerbated in the social media age because workers are now beholden to the algorithms that govern the social media platforms on which they either work or use to garner work. 
If you use Instagram, you may recall that a few years ago the company changed its algorithm and suddenly people’s content was not being seen. Users were being shown week-old posts from friends and people in their networks. This was all because Instagram replaced its chronological feed with an algorithmically curated one.
It was frustrating for us as users, but our livelihoods are not dependent on it. For media and creative workers, their entire job is structured by the command to be visible—and the algorithm comes in and suddenly renders their content invisible.
That impacts their economic stability and thus also makes them carefully attuned to how to win over the algorithm: What types of content are going to be rewarded by the algorithm, or alternatively not be punished by the algorithm?
It’s an added layer of labor that workers have to shoulder as they’re continuing to think about how to appeal to audiences and advertisers. Now they have to add: How do I appeal to the algorithm? 
Angwin: You talk about how one way that people deal with this precarity is by developing what you call “folk theories” about why the algorithm behaves the way it does. What did you find out about folk theories: Are they true or not?
Duffy: That’s the million-dollar question. I didn’t come up with the term “folk theories”; a number of academics draw on the folk theory literature, and a group of Northwestern University researchers published an interesting study called “ ‘Algorithms ruin everything’: #RIPTwitter, Folk Theories, and Resistance to Algorithmic Change in Social Media.” “Folk theories” are a useful framework because they capture the fact that workers don’t really know how algorithms work, but they need to develop some sense of understanding to do their jobs. Again, though, they don’t really know.
This is a fundamental difference between traditional work and work now. If you worked at a newspaper or a magazine or a radio station, there was always a little bit of confusion or inscrutability surrounding metrics, but there was an accountability system—someone you could bring your questions to. For creators now, when the algorithm changes, there’s no one they can talk to. If your content is hidden, or if you feel like you are being treated unfairly by the algorithm, you have no recourse. There’s no person you can talk to. 
All of this uncertainty and speculation is what leads to the creation and circulation of folk theories. This is a way for creators to try to wrest back control over their work conditions to the best of their abilities.
The problem is, even if you find the best way to trick the algorithm, it can change in an instant. These algorithms are constantly adapting and responding to user behavior. So what works today may not work tomorrow. 
And what renders this whole situation even more precarious is we’re talking about platforms that may be here one day and not the next. Vine is probably the best example of this. I mean, just under a year ago, TikTok was something that very few of us knew anything about, unless you were under 13 years old. It’s not the same level of stability that we saw in earlier creative industries.
Angwin: Right. It also seems like this algorithmic precarity is not evenly distributed. Anecdotally it seems like women and people of color are impacted more by shadow-banning and other algorithmic practices. 
You have studied how female influencers—you don’t like the word influencers, correct?—get punished more for being visible. Can you talk about their predicament?
Duffy: I prefer the word content creators. There is sort of a gendered valence to the term influencers, and influencers themselves don’t like that term because to call yourself an influencer suggests you’re influencing purchasing behavior. So I tend to say “content creators.”
And I think it very much ties into the visibility imperative that drives the social media age: the need to always be posting, to always be visible, to be putting our thoughts out there—and on many different platforms: Facebook, Twitter, LinkedIn, Instagram, TikTok…. It goes on.
And, you know, with self-branding, I do it. I counsel my grad students to do it. But with this advice, we also need to acknowledge the fact that there’s a lot of labor required to make things visible. Promotional activities require time and energy.
But the cultural directive to be visible, to put oneself out there, is certainly not experienced evenly. Much of my work is focused on gender and femininity. For women content creators, putting themselves out there means walking a line between visibility and vulnerability. By “putting themselves out there,” they open themselves up to hate, criticism, and harassment in ways that disproportionately affect women, people of color, and the LGBTQ community.
In a recent paper, “Gendered Visibility on Social Media: Navigating Instagram’s Authenticity Bind,” my collaborator, Emily Hund, and I focused on the authenticity policing that dominates social media. There’s a very gendered way in which women content creators are [judged] for being “too real” or “too fake.” In either case, there’s so much negative backlash…. It puts women in this bind where they have to be both relatable but aspirational. It’s another double standard.
For women and other marginalized communities, it’s another layer of vulnerability added to work that’s already so precarious.
As always, thanks for reading.
Julia Angwin
The Markup
From The Markup
Shadow Bans, Dopamine Hits, and Viral Videos, All in the Life of TikTok Creators
Credit Card Ads Were Targeted by Age, Violating Facebook’s Anti-Discrimination Policy
Google Promised Its Contact Tracing App Was Completely Private—But It Wasn’t
P.S. To receive the latest from our Citizen Browser project, sign up here. And to keep up on all the news from The Markup, sign up here, and we’ll email you every time we publish a story about the ways powerful actors are using technology to change society, usually two to three times a week.
This email doesn’t track you when you open it or click on any links. To learn more, read our Privacy Policy.
To unsubscribe, click here.
If you were forwarded this newsletter and you like it, you can subscribe here.
The Markup, P.O. Box 1103, New York, NY 10159