There are many criticisms of Facebook’s size, power, and business model, but two stand out for the intensity with which they are usually discussed. One is that Facebook is a dystopian panopticon that monitors our every move and uses that information to predict and manipulate our behavior. The other is that Facebook has become such a pillar of modern life that every product decision it makes could reshape the body politic forever.
Today, in an impressive flurry of news-making, Facebook took steps to address both concerns.
First, the company said it was finally releasing its long-delayed “Clear History” tool in three countries. (The United States is not one of them.) I wrote about it at The Verge:

It was nearly a year and a half ago that Facebook CEO Mark Zuckerberg, standing onstage at the company’s annual developer conference, announced that the company would begin letting users sever the connection between their web browsing history and their Facebook accounts. After months of delays, Facebook’s Clear History is now rolling out in Ireland, South Korea, and Spain, with other countries to follow “in coming months,” the company said. The new tool, which Facebook conceived in the wake of the Cambridge Analytica scandal, is designed to give users more control over their data privacy at the expense of advertisers’ targeting capabilities.
When it arrives in your country, the Clear History tool will be part of a new section of the service called “Off-Facebook activity.” When you open it, you’ll see the apps and websites that are tracking your activity and sending reports back to Facebook for ad targeting purposes. Tapping the “Clear History” button will dissociate that information from your Facebook account.
You can also choose to block companies from reporting their tracking data about you back to Facebook in the future. You’ll have the choice of disconnecting all off-Facebook browsing data, or data for specific apps and websites. Facebook says the product is rolling out slowly “to help ensure it’s working reliably for everyone.”
Some writers, such as Tony Romm here, pointed out that Facebook is not actually deleting your data — which would seem to blunt the impact of a button called “Clear History.” In fact, given that the data link you’re shutting off is primarily relevant to ads you might see later, it feels more like a “Muddle Future” button. Facebook, for its part, has tucked the entire enterprise into a section of the app opaquely titled “Off-Facebook Activity,” which could more or less mean anything.
I find it hard to get too worked up about any of this, because regardless of whether Facebook is able to take into account your web browsing habits, it’s still going to be sending you plenty of highly targeted ads based on your age, gender, and all the other demographic data that you forked over when you made your profile. Or you could simply turn off ad targeting on Facebook altogether, which is more powerful in this regard than any Clear History tool was ever going to be. (Here’s an account from a person who did this.)
Second, Facebook released the results of its anti-conservative bias audit, in which the company asked former Sen. Jon Kyl and the law firm Covington & Burling to interview 133 conservative lawmakers and interest groups about whether they think Facebook is biased against conservatives.
On one hand, there’s no evidence of systematic bias against conservatives
or any other mainstream political group on Facebook or other platforms. On the other hand, there are endless anecdotes about the lawmaker whose ad purchase was not approved, or who did not appear in search results, or whatever. Stack enough anecdotes on top of one another and you’ve got something that looks a lot like data — certainly enough to convene a bad-faith Congressional hearing about platform bias, which Republicans have done repeatedly now.
As a result of Facebook’s new, more stringent ad policies, interviewees said the ad-approval process has slowed significantly. Some fear that the new process may be designed to disadvantage conservative ads in the wake of the Trump campaign’s successful use of social media in 2016.
So, some anonymous conservatives believe that Facebook is involved in a conspiracy to prevent conservatives from advertising. That might come as a surprise to, say, President Trump, who is outspending all Democrats on Facebook ads. But the Kyl report has no room for empirical thought. What’s important here is that 133 unnamed people have feelings, and that they spent the better part of two years talking about them in interviews that we can’t read. (Here’s a link to the published report, which clocks in at a very thin eight pages. And here’s a helpful rebuttal from Media Matters, which illustrates how partisan conservative pages continue to thrive on Facebook.)
Despite the fact that we have no idea who Kyl talked to, or what they said beyond his meager bullet points, the report still had at least some effect on Facebook policymaking. As Sara Fischer reports in Axios
, Facebook ads can now show medical tubes connected to the human body, which apparently make for more viscerally compelling anti-abortion ads:
The medical tube policy makes it easier for pro-life ads focused on survival stories of infants born before full term to be accepted by Facebook’s ad policy. Facebook notes that the policy could also benefit other groups who wish to display medical tubes in ads for cancer research, humanitarian relief and elderly care.
And how are conservatives using the information from today’s audit? If you guessed “as a cudgel to continue beating Facebook with,” you win today’s grand prize. Here’s Brent Bozell
: “The Facebook Kyl cover-up is astonishing. 133 groups presented Kyl with evidence of FB’s agenda against conservatives and he dishonestly did FB’s bidding instead.”
Sen. Josh Hawley, for his part, demanded a more sweeping investigation. “Facebook should conduct an actual audit by giving a trusted third party access to its algorithm, its key documents, and its content moderation protocols,” Hawley said in a statement. “Then Facebook should release the results to the public.”
I asked Hawley’s people if the senator was aware that Facebook’s content moderation protocols have been public for years, but I never heard back.
Anyway, Facebook wrapped up the day by announcing — in a fantastically bizarre feat of timing — that it would begin hiring human beings to curate your news stories, just as Apple does for Apple News. (Apply for the job here! Let me know if you get it!) This is the right thing to do — our leaky information sphere needs experienced editors with news judgment more than ever — but it’s also guaranteed to court controversy. One person’s curation is, after all, another person’s “bias.”
The return of human editors to Facebook, on the very day that it published its investigation into alleged bias against conservatives, is a real time-is-a-flat-circle moment. After all, it was trumped-up outrage over supposed bias in its last group of human editors
that helped to set us down this benighted path to begin with. I want to end on something I wrote last February on this subject:
I’m struck by how, in retrospect, the story that helped to trigger our current anxieties had the problem exactly wrong. The story offered a dire warning that Facebook exerted too much editorial control, in the one narrow section of the site where it actually employed human editors, when in fact the problem underlying our global misinformation crisis
is that it exerted too little. Gizmodo’s story further declared that Facebook had become hostile to conservative viewpoints when in fact conservative viewpoints — and conservative hoaxes — were thriving across the platform.
Last month, NewsWhip published a list of the most-engaged publishers on Facebook. The no. 1 company posted more than 49,000 times in December alone, earning 21 million likes, comments, and shares. That publisher was Fox News. And the idea that Facebook suppresses the sharing of conservative news now seems very quaint indeed.