Every day of late has been tough for Facebook, but this Monday was particularly rough.
First, an exclusive Guardian article quoted third-party moderators as having witnessed “an increase in hate speech and racism, not only in our queues but also amongst ourselves” as a result of the platform’s decision not to act on Donald Trump’s violent speech. Then a new report out of New York University criticised the platform’s decision to underfund and outsource its content moderation and suggested it double its moderation workforce to 30,000, whatever the cost. Ouch.
Having attracted widespread media coverage throughout the week, the report (‘Who Moderates the Social Media Giants?’, authored by NYU’s Paul M. Barrett) is likely to put pressure on all social networks to reconsider their subcontracting of content moderation to the likes of CPL, Genpact, Accenture and others. It also touches on many of the issues discussed in this newsletter over the last 18 months.
I recommend reading the 32-page report in full but, for those who don’t have time, I’ve tried to summarise the main points below (with links to past EiMs where relevant):
- Barrett’s central thrust, and one that I agree with, is that outsourcing leads to worker exploitation and, where content moderation is concerned, ‘jeopardises optimal performance of a critical function’ (EiM #41). He argues convincingly, in the case of Facebook, that its $70.7 billion in 2019 revenue is “robust enough to allow the company to achieve better, more humane content moderation”.
- He focuses on Facebook’s moderation efforts as it is ‘the largest competitor in its segment of the industry’ and ‘a trend-setter in content moderation’. However, like other platforms, Facebook refused to provide access to any of its moderation sites for the report.
- The solution to a rise in hate speech, violence and fake accounts is threefold, according to Barrett (who is also the assistant managing editor at Bloomberg):
- Bring content review closer to the core of their corporate activities
- Increase the number of human moderators (even while continuing to refine AI screening software)
- Elevate moderators’ status to match the significance of their work
- Barrett also includes eight practical recommendations for Facebook to follow, including hiring a moderator-in-chief to report to Mark Zuckerberg and expanding the number of countries that have local moderators.
- Over its 32 pages, the author also touches on the Oversight Board (EiM #63), Arun Chandra’s role (EiM #27), the failings of supply-chain moderation (EiM #57), the prevalence model discussed in Facebook’s February white paper (EiM #52) and why the kind of content regulation legislation seen in Germany (EiM #13) and France (EiM #29) is not the route to go down in the US.
Have you read the report? I’m interested to hear from you about whether ending the outsourcing of content review is a) the right way to go and b) realistic. Reply and let’s continue the conversation.