Yesterday, as I tried to sort through Twitter’s decision to ban political ads, I got a tantalizing tip from a new source. Cognizant, the professional services company I have spent much of this year investigating over the dire conditions in its workplaces, was exiting the content moderation business.
I chased it down, and to my surprise, the tip turned out to be true. The company announced it in an earnings call on Wednesday, without mentioning the names of Facebook, Google, or any of its other clients. Later that day, Facebook provided me with a statement from Arun Chandra, the company’s vice president of scaled operations.
“We respect Cognizant’s decision to exit some of its content review services for social media platforms,” Chandra said. “Their content reviewers have been invaluable in keeping our platforms safe — and we’ll work with our partners during this transition to ensure there’s no impact on our ability to review content and keep people safe.”
How did we get here?
As I wrote in The Verge:
In February, The Verge published an investigation into working conditions at the company’s site in Phoenix. Moderators at the site described being diagnosed with post-traumatic stress disorder after being subjected to a daily onslaught of graphic and disturbing images. Others said they had come to embrace fringe viewpoints after seeing videos about conspiracy theories on a regular basis. Multiple employees reported fearing for their safety after being threatened by coworkers.
A follow-up report in June focused on a site in Tampa, FL, where moderators broke their non-disclosure agreements to describe a pattern of mistreatment by managers. They described working in offices that were often filthy, and where cases of sexual harassment had resulted in multiple complaints being filed with the Equal Employment Opportunity Commission.
Cognizant intends to finish out its contracts, which will begin to wrap up on March 1st and then wind down throughout the remainder of 2020. Both of the sites I visited are closing as a result of Cognizant’s announcement yesterday, which affects more than 6,000 employees around the world.
Cognizant’s official reason for getting out of the business is that “this subset of work is not in line with the company’s strategic vision,” which could mean anything. Bloomberg, citing various analysts, said that over time the company has gotten worse at sales — particularly in attracting digital businesses like tech platforms. (It still made $499 million in profits last quarter, on the backs of thousands of employees making $15 an hour.) The Business Standard reported that Cognizant earned between $240 million and $270 million annually from content moderation.
A memo from CEO Brian Humphries to all employees, which someone sent to me, told them that while thousands of jobs would be eliminated, Cognizant would make a donation intended to spur the development of machine-learning systems that could take the place of human moderators:
While we intend to exit this work, we recognize that cleansing the web of objectionable content is a worthy cause and one in which companies have a role to play. For this reason, we have decided to allocate $5 million to fund research aimed at increasing the level and sophistication of algorithms and automation, thereby reducing users’ exposure to objectionable content.
It was not clear where Cognizant planned to make that donation.
Facebook said it would make up for the loss by increasing the number of moderators it has working at a site in Texas, which is operated by Genpact.
Twitter wouldn’t tell me how heavily it relied on Cognizant, but a spokeswoman said: “The team is on it and working through the changes to make sure we’re supporting the people doing this work while also prioritizing keeping people on Twitter safe.”
Google, which uses Cognizant for moderation services in Poland and India, among other places, did not respond to my request for a comment.
Moderators I heard from over the past day reacted in various ways: with anger at losing their jobs, with shock at the suddenness of the announcement, and with relief that the sites where they were traumatized are going away.
“It is a mental and spiritual relief,” one former moderator at the Tampa site told me. “I still have nightmares about the content, but that will eventually go away.”
One source who worked as a manager told me that they believed Cognizant had acted to reduce its legal liability at a time when vendors are beginning to face lawsuits from former moderators who now have PTSD, along with a spate of sexual harassment complaints.
On one level, Cognizant’s exit from the moderation business probably won’t change much at the big tech platforms. There are plenty of other vendors to choose from — and as long as companies are offering $200 million contracts, as Facebook had for Cognizant, there always will be.
Still, the move speaks to the severe difficulty of this work, and the serious toll it takes, day to day, on thousands of people. That toll was so severe that a huge consulting company decided to quit the business rather than work to develop a fix. Maybe it felt like it couldn’t, given the constraints that the platforms put it under. (Facebook dictated nearly everything about the Cognizant contract, down to the office decor.)
A year ago, Vice called content moderation at Facebook “the impossible job.” This week, Cognizant announced that it had, indeed, found the work impossible. It’s not just Facebook that gets in over its head sometimes. It turns out, its vendors do too.