Since the COVID-19 pandemic began, the big social platforms have generally been quicker than usual to intervene in the spread of misinformation. We’ve seen Facebook, Google, and Twitter add various labels, warnings, and links to high-quality news sources and public health organizations. And for the most part, the dumbest theories about the novel coronavirus have not reached huge scale — unless the theory was suggested by the president of the United States, in which case, well.
But some cracks are beginning to show. In February, a set of bizarre and almost incomprehensible theories began to spread on YouTube and Facebook alleging that 5G cellular networks had played a role in spreading the virus. And last week, we saw the emergence of the first true hit conspiracy video of the COVID-19 era. It’s called “Plandemic,” and like many conspiracy videos it asserts that a shadowy cabal of elites is using a global crisis as a cover to profiteer and entrench their power. Here’s Davey Alba in the New York Times:
In the 26-minute video, the woman asserted how Dr. Fauci, the director of the National Institute of Allergy and Infectious Diseases and a leading voice on the coronavirus, had buried her research about how vaccines can damage people’s immune systems. It is those weakened immune systems, she declared, that have made people susceptible to illnesses like Covid-19.
The video, a scene from a longer dubious documentary called “Plandemic,” was quickly seized upon by anti-vaccinators, the conspiracy group QAnon and activists from the Reopen America movement, generating more than eight million views. And it has turned the woman — Dr. Judy Mikovits, 62, a discredited scientist — into a new star of virus disinformation.
Still, the video seems well on its way to becoming something akin to this generation’s Loose Change. That video, which wrongly depicted 9/11 as an elaborate false flag operation, generated millions of views after being distributed for free on YouTube and local Fox TV affiliates — and went on to become one of the foundational texts of the 9/11 truther movement.
I accept that on a free and open internet, some people are going to post extremely dumb and harmful things. And “Plandemic” is undoubtedly harmful: among other things, it falsely tells people that wearing a mask will “activate” the virus. But we’ve seen in the past that extremely dumb and harmful things often benefit from algorithmic promotion. They appear high up in search results, on trending pages, and in recommendation widgets. Platforms are used to recruit followers for terrible causes without even being aware that they’re doing so.
After years of pressure, though, platforms have gotten better at detecting bad posts and videos as they begin bubbling up. They’re now able to catch more bad stuff before it hits the trending page. YouTube has a whole team that monitors this stuff in real time. And so “Plandemic” left me scratching my head. How did this thing go viral?
The ground was seeded by a book that Mikovits, the star of “Plandemic,” published last month. Plague of Corruption “frames Dr. Mikovits as a truth-teller fighting deception in science,” Alba writes, and it won approving coverage from far-right outlets including the Epoch Times, Gateway Pundit, and Next News Network.
But it was the “Plandemic” clip that turned Mikovits into a star (she’s gained more than 130,000 Twitter followers in a month). And the two have benefited each other: searches for Mikovits drove views of “Plandemic,” and views of “Plandemic” drove searches for Mikovits.
Erin Gallagher, a social media researcher who specializes in data visualizations, offers some clues. Gallagher used CrowdTangle, a Facebook-owned tool for analyzing public posts, to investigate when “Plandemic” began to surge on the network. She found that posts referencing it appeared most often in Facebook groups devoted to QAnon, anti-vaccine misinformation, and conspiracy theories in general.
“The video spread from YouTube to Facebook thanks to highly active QAnon and conspiracy-related Facebook groups with tens of thousands of members which caused a massive cascade,” Gallagher writes. “Both platforms were instrumental in spreading viral medical misinformation.”
YouTube and Facebook both ultimately removed the video, but their responses differed in notable ways. I spoke to representatives at both companies today, and here’s what I learned.
At Facebook, “Plandemic” was demoted before it was removed. Demotion is a step that Facebook often takes with posts that seem bad for one reason or another but are not considered actively harmful. Maybe you posted an image in which someone is almost but not quite naked; maybe you suggested that people commit violence without coming right out and saying it. Since 2018, Facebook has intervened in an effort to prevent these types of posts from spreading, as part of an initiative to make it less appealing to post so-called “borderline content.”
I don’t know exactly what qualified “Plandemic” as borderline content initially, but a spokesman noted that the video’s length — 26 minutes — along with the large number of claims made within it, created a lot of work for fact-checking teams. (A lie can get halfway around the world before the truth can tie its shoes, etc.) Facebook eventually decided that “Plandemic” had to go over its false assertion that people can “reinfect themselves” by wearing masks, but given the truly unfortunate confusion over mask wearing — some of it generated by public health organizations — the company was cautious.
At YouTube, the company had flagged and removed several videos related to “Plandemic” before the 26-minute clip that became famous appeared. That clip was uploaded on May 4th and removed on May 6th. In the meantime, it generated 7.1 million views. According to the company, the vast majority of those views came from external sites — people linking to it directly, rather than seeing it somewhere on YouTube. Gallagher’s analysis suggests a significant number of those clicks came directly from Facebook. (YouTube wouldn’t comment on that.)
Facebook continues to see people upload other clips from “Plandemic,” and told me that it is sharing fact-checking information from its partners with people who share those clips. It’s temporarily reducing the distribution of these other clips — the ones that don’t include the mask bit — as fact checkers continue to evaluate other parts of “Plandemic.” People also continue to post modified versions of the original — recording it on their phones or adding commentary to it — and Facebook is hunting those down too.
There’s a view of all this that is heartening. Both companies saw a bad thing, put teams of fact checkers on it, and removed it from their networks with relative haste. (That’s more than Amazon can say: Plague of Corruption is a top-10 best seller there today.) Facebook and YouTube could have acted faster, or more completely, but it isn’t as if “Plandemic” caught them unawares. YouTube has more than 2 billion monthly users, and Facebook has 1.73 billion users per day across its suite of apps; at that scale, 8 million people seeing something in 48 hours just doesn’t look like all that much. (And if you’re thinking well huh, maybe the problem with these companies is their size, you may have been interested in Elizabeth Warren’s presidential campaign.)
But there’s a darker view to consider, too. When Facebook announced it would shift its attention to building services for smaller, more private groups, critics pointed out that this was likely going to make it harder to police misinformation. This is particularly true of WhatsApp chats, which are encrypted end-to-end. But it’s also true of private Facebook groups, where it seems likely that “Plandemic” was shared actively.
“Plandemic” likely won’t be the last piece of harmful misinformation about COVID-19 to become a blockbuster. And when the next one comes, I wouldn’t be surprised to see that the pathway to virality leads straight through Facebook groups.