Don’t be moderate with moderation
“Collectively, the employees described a workplace that is perpetually teetering on the brink of chaos. It is an environment where workers cope by telling dark jokes about committing suicide, then smoking weed during breaks to numb their emotions. It’s a place where employees can be fired for making just a few errors a week — and where those who remain live in fear of the former colleagues who return seeking vengeance.
“It’s a place where, in stark contrast to the perks lavished on Facebook employees, team leaders micromanage content moderators’ every bathroom and prayer break; where employees, desperate for a dopamine rush amid the misery, have been found having sex inside stairwells and a room reserved for lactating mothers; where people develop severe anxiety while still in training, and continue to struggle with trauma symptoms long after they leave; and where the counseling that Cognizant offers them ends the moment they quit — or are simply let go.”
Seriously, read the whole thing – it paints a depressing image of what happens to those charged with protecting us from the worst aspects of humanity. Many find it challenging to cope, and to cap it all, they’re only paid $15 per hour – just $4 above the minimum wage in Arizona.
It’s interesting that this has only become such a big talking point in tech circles now that American workers are affected. Similar reports in the past about content moderation hubs in other parts of the world were quickly forgotten, passing without even a bland blog post from Facebook.
Beyond that, it raises the question of what could be done better. In the absence of A.I. good enough to do the job, humans have to moderate this stuff, and that means humans have to see it – and be affected by it.
“Give them more support and pay them better” is the logical answer. Some will argue that the economics of doing so are challenging. Facebook needs a lot of content moderators – and it’s likely to need even more in the future. And while moderators serve a useful function, they don’t generate revenue.
That’s a cold argument of limited merit, though. Facebook needs these moderators, and it has a duty of care to them. The Verge’s article and Facebook’s own post make clear that the company believes it looks after them well.
But as the piece’s author, Casey Newton, says:
"The more moderators I spoke with, the more I came to doubt the use of the call center model for content moderation… (this kind of outsourcing) entrusts essential questions of speech and safety to people who are paid as if they were handling customer service calls for Best Buy.”
No matter how many moderators are needed, Facebook should pay them like professionals and give them the support and working environment to do the job without resorting to drugs just to get through the day. If it’s a valuable role, it needs to be paid like one.
Sure, one day these jobs will be replaced by A.I., but isn’t that the case for many of us?