Going deeper underground
The New York Times reports on how cleanups by services like Facebook and YouTube are pushing hateful figures like InfoWars’ Alex Jones away from the public eye. People shunned from the mainstream are congregating in private Facebook Groups, WhatsApp groups, and other not-quite-so-visible communities.
From the NYT:
“They’ve essentially empowered very large groups that can operate secretly without much governance and oversight,” said Jennifer Grygiel, an assistant professor at Syracuse University’s S. I. Newhouse School of Public Communications. “There may be harms and abuses that are taking place, and they can’t see.”
This should be a concern. We only need to look to India to see how the under-the-radar nature of WhatsApp can allow damaging false information to spread uncontrollably. Facebook can delete an inappropriate post, but if a message is manually forwarded between individuals and groups on a messaging platform, it’s much harder to stop.
Facebook thinks it can stop the problems occurring inside private groups:
“A Facebook spokeswoman said the company used automated tools, including machine learning algorithms, to detect potentially harmful content inside private groups and flag it for human reviewers, who make the final decisions about whether or not to take it down. The company is developing additional ways, she said, to determine if an entire group violates the company’s policies and should be taken down, rather than just its individual posts or members.”
In response to the problem of viral misinformation in India, WhatsApp has introduced some limits on how people in the country can use the forwarding feature. But technical solutions aren’t everything, and WhatsApp has had to run newspaper ads to help educate users.
to help educate users. Children are even being taught about the problem in school
have been linked to misinformation being spread via WhatsAp in India. Given how we’ve seen more easily trackable lies on social aiding genocide in Myanmar
, and leading to incidents like the Pizzagate ‘self-investigation,’
it’s clear that removing hateful content from the surface isn’t going to scrub it from the internet entirely.
As this stuff goes further underground, it becomes less of a moderation and community management problem, and more a reflection of deeper ills in society. There may not be much that technology companies can do about that.