A Markup investigation in January found that the company was still pushing partisan political groups to its users, with several of those groups promoting conspiracy theories and calls for violence against lawmakers. In an earnings call on Jan. 27, Zuckerberg assured investors that this time—really—Facebook would permanently stop recommending political groups.
“I was pleased when Facebook pledged to permanently stop recommending political groups to its users, but once again, Facebook appears to have failed to keep its word,” Markey told The Markup after learning of our latest findings. “It’s clear that we cannot trust these companies to honor their promises to users and self-regulate.”
Political group recommendations have slowed among our panelists since our January investigation, though they have not, as was promised, been eliminated. In January, our reporting found that 12 of the top 100 groups recommended to our panelists were political. In our most recent data, from Feb. 1 to June 1, only one of the top 100 groups recommended to panelists was political. We assessed whether groups in the top 100 were political by looking at the group name, “About” page, and rules (if posted), as well as whether posts in the discussion feed mentioned political figures, parties, or ideologies.
The Markup also found 15 political groups recommended by Facebook to our Citizen Browser panelists that had “Joe Biden Is Not My President” as the group name, or some variation of it.
Two of the groups, “Not my President” and “Biden Is Not My President,” had previously been flagged by Facebook for containing troubling content—but that didn’t stop Facebook from suggesting the groups to our panelists.
The groups contained posts and memes claiming that Biden didn’t legitimately win the election, a conspiracy theory tied to Trump’s discredited claims about fraudulent voters and mishandled vote counting. In total, the groups were recommended to 14 panelists between March and April, with some groups recommended to multiple panelists.
“If Joe Biden gets in office by this cheating voter fraud, good bye America, good bye country because the Democratic party will destroy our country for good,” one commenter in the “Not my President” group wrote in December.
The memes in the “Biden Is Not My President” group included an image of an empty coffin with a caption claiming that the occupant had come back to life to vote for Biden. A post in the “Not my President” group showed a screen capture of the protagonists from the movie Ghostbusters captioned to suggest they were there in case “all the dead people that voted for Biden become violent.”
The group’s “About” description includes the sentence, “Let’s see how many people we can get to really show them that President Trump won the election.” Facebook recommended the group to three Citizen Browser panelists. As of June 10, the group had 255 members.
In another “Joe Biden is Not My President” group, the admin posted a photo of a rifle last December, writing, “I won’t put up with people destorying [sic] my family’s or friend’s property. I have the right to defend myself and others.”
The group admins did not respond to requests for comment.
The memes can spread disinformation, said Nina Jankowicz, a Global Fellow in the Science and Technology Innovation Program at the Wilson Center for Public Policy and author of “How to Lose the Information War.”
“I’d hope people browsing their Facebook feed and seeing a dank meme recognize it’s not an authoritative source of information,” Jankowicz said. “But when you see meme after meme after meme saying dead people are voting for Biden, over time it’s that drip-drip-drip that changes your perception of reality.”
In 2016, Facebook’s researchers found that 64 percent of people who joined extremist groups were there because of the social network’s own recommendations, according to The Wall Street Journal. The Markup found several groups recommended by Facebook to Trump voters that organized travel logistics to Washington, D.C., for Jan. 6.
During the 2020 election, Open Source Election Technology Institute co-founder Gregory Miller said his organization spent considerable time discussing with election administrators how to get their message across that the vote was secure. But election officials haven’t been able to fight off the wave of misinformation flooding social media, including in Facebook groups, Miller said.
He said he’s received death threats from people for debunking election fraud claims and knows many election administrators who have had their lives threatened.
“We know that election administrators have been flummoxed by the impact of social media, just for trying to do their jobs,” Miller said. “In our professional opinion, Facebook in its current form and conduct represents a clear and present danger to the safety of election administrators and the integrity of election administration itself.”
A survey from the Brennan Center for Justice found that 78 percent of election officials said that misinformation on social media made their jobs more difficult, while 54 percent of respondents believed it made their jobs more dangerous.
“In a lot of cases, groups that were tangentially political led people to groups that were much more violent over time,” Jankowicz said. “Facebook unfortunately either does not have the capacity in terms of subject matter experts who can be on this all the time to update their classifiers, or perhaps—I’ve heard them say this over and over that they’re not going to be 100 percent successful all the time.”