Why content moderators fall for conspiracy theories

March 5 · Issue #297
The Interface
In discussing my story last week on the secret lives of Facebook’s content moderators in America, interviewers often ask me about how the job can make workers more susceptible to conspiracy theories. In my interviews with workers at a content moderation site in Phoenix, I heard over and over again how the work environment was full of people who had come to believe the fringe views that they were reviewing. As one of them put it to me, regarding the aftermath of the Parkland shootings:
“People really started to believe these posts they were supposed to be moderating,” she says. “They were saying, ‘Oh gosh, they weren’t really there. Look at this CNN video of David Hogg — he’s too old to be in school.’ People started Googling things instead of doing their jobs and looking into conspiracy theories about them. We were like, ‘Guys, no, this is the crazy stuff we’re supposed to be moderating. What are you doing?’”
Now journalists are beginning to investigate the mechanisms by which this change in views can happen. Today in The Verge, my colleague Mary Beth Griggs talks to Mike Wood, a psychologist at the University of Winchester who studies the spread of conspiracy theories. Wood says that existing research has not assessed the effect of repeated exposure to conspiracy views on people’s beliefs. But research does show that people become more susceptible to fringe views when they are experiencing stress, he says:
Conspiracy theories do associate with stress. Basically, there’s been some research that’s showed that when people undergo a stressful life event — something like death of a family member, divorce, major disruption to their lives — conspiracy theories are more likely in that circumstance. So there is some indication that psychological stress can put people in this place where they’re looking around for new answers or they’re possibly trying to come to grips with the world in a new way.
We’ve got other research showing that when someone doesn’t feel in control of their life or in control of what’s happening to them, conspiracy theories seem more plausible, and that might have been what’s happening with these people. I’m not sure what their subjective psychological experience was at the time, but there is some data that suggests that can happen.
As I document in my piece, work as a content moderator is highly stressful. Workers’ time is managed down to the second, Facebook’s instructions about how to moderate individual posts can vary on an hourly basis, and making just a few mistakes can be enough to put a worker’s job at risk. Given that level of duress, it’s fair to wonder whether the stress itself makes workers more likely to start believing in conspiracy theories.
Meanwhile, in OneZero, Medium’s new tech publication, Erin Schumaker talks to experts about the power of repeated exposure to warp the human mind.
“The more often you see it, the more familiar something is, and the more familiar something is, the more believable it is,” says Jeff Hancock, communication professor and founding director of the Stanford Social Media Lab.
Conspiracy content is engineered to be persuasive. People accept these theories because they help make sense of a world that feels random; even if they seem far-fetched to the rest of us, they can offer some sense of comfort or security. And seeing those theories repeatedly pop up in your Facebook news feed “starts to undermine the sense that they are fringe,” says James Grimmelmann, a professor at Cornell Law School who studies internet law and social networks.
What to do? The obvious first step is research. Facebook told me it plans to conduct a survey of contractors’ “resiliency” in coming months that will allow the company to better understand its workers’ mental health. A question or two about conspiracy content could help Facebook begin to understand how widespread the issue is.
Second, Facebook could develop training materials that prepare workers for the possibility that they will find themselves influenced by conspiracies. The risk should be disclosed to prospective workers as a possible effect of doing the job, and counselors should be encouraged to discuss the issue with workers during their regular interactions.
Finally, Facebook could create a knowledge base of known conspiracy theories for moderators to review as they go about their work.
One reason the company began hiring Americans is for what it calls “cultural context” — the idea that Americans will already be familiar with public figures, local slang, and other regional idiosyncrasies. But in practice, many workers lack that cultural context. One moderator told me she was embarrassed to mistakenly remove a video from the conservative provocateurs Diamond and Silk. She didn’t recognize them as public figures, and saw them only as two women who appeared to be bullying someone. (That someone turned out to be Ted Cruz, a US senator, the moderator told me.)
While it would be impossible to lay out all relevant cultural context for moderators, giving workers some sort of guide to popular conspiracy theories seems like a positive next step. A worker who has been prepared for the possibility that she may find herself exposed to and persuaded by anti-Semitic conspiracies would likely be better equipped to handle them.
And who knows? Resources developed to support moderators may prove useful elsewhere. After all, before any piece of conspiracy content reached a reviewer’s desk, it was lurking somewhere on Facebook, nudging someone further toward the fringe.

The Trauma Floor
How Roblox is moving ahead with its digital civility initiative
Democracy
Arlington says Amazon must meet specific office-space targets to get $23 million in grants
Fighting fake news: Decoding ‘fact-free’ world of WhatsApp
Google to ban political ads ahead of federal election, citing new transparency rules
Amazon’s Hard Bargain Extends Far Beyond New York
Big Tech, Once a CPAC Sponsor, Is Now Its Boogeyman
Elsewhere
Holdout Jeff Bezos Confronted by Amazon Moms Demanding Day Care
On Amazon, a Qanon conspiracy book climbs the charts — with an algorithmic push
Revealed: AmazonSmile helps fund anti-vaccine groups
A comedian’s fight with Barstool Sports shows how Twitter’s copyright system can hurt creators
Oculus Quest Enterprise Edition Coming Later This Year
Launches
Introducing We Think Digital: New Digital Literacy Resources to Reach 1 Million People in Asia Pacific by 2020
Takes
Deepfake propaganda is not a real problem
Why Napalm Is a Cautionary Tale for Tech Giants Pursuing Military Contracts
And finally ...
JetBlue asks Instagrammers to delete their pics to win a free year of flights
Talk to me
Send me tips, comments, questions, and strategies for ignoring conspiracy theories: casey@theverge.com.