The oversight on Facebook's Oversight Board

January 29 · Issue #448
The Interface
One of the year’s biggest stories at the intersection of technology and power is the Facebook Oversight Board. For the first time, a tech giant is seeking to give some of its power back to the people — in this case, in the form of an independent 40-member board that will serve as a kind of Supreme Court for content moderation. This week, Facebook announced that the board would likely begin hearing cases this summer — it also named the person who will lead the board’s staff, and released suggested bylaws.
So, what are people saying about the news?
A complaint from the target of a bogus political ad is bound to come before the board eventually — and the board will almost certainly take the case. Or Facebook itself might send the issue to the board. After all, the issue satisfies nearly every factor Facebook itself has listed for identifying important cases. (A subset of the board’s members will sit on a selection committee.) According to an explanation of the board’s charter written by Facebook, these factors include severity (“the content reaches or affects someone else’s voice, privacy or dignity”); public discourse (“the content spurs significant public debate and/or important political and social discourse”); and difficulty (“there is disagreement about Facebook’s decision on the content and/or the underlying policy or policies”). It’s almost as if the whole project were created specifically to rule on Zuckerberg’s stance on political ads.
Nick Clegg, the former UK deputy prime minister who is now Facebook’s head of global policy and communications, confirms this. “I think this political ads issue of how do you treat political speech is exactly the kind of thing where it would be very interesting to see how the board thinks,” he says.
I’m less certain the board will have a say here. It will have the authority to remove (or leave standing) individual pieces of content, as well as issue policy advisory opinions. Key word: advisory. And while an opinion by the board that Facebook should fact-check political ads would have some weight — and could provide political cover for Facebook to reverse course, should it decide it wants to — ultimately the decision will likely still remain with Zuckerberg.
At Lawfare, Evelyn Douek zooms in on one of the more peculiar features of the board, at least at launch: it will only review cases in which an individual believes their content was removed in error. If a post was allowed to stay up in error — a piece of viral misinformation about a health crisis, for example — the board will initially have no jurisdiction. (Facebook says that it will get such jurisdiction in the future but won’t specify a time frame.) Douek writes:
Limiting the board’s jurisdiction to take-down decisions stacks the deck somewhat. It is like introducing video appeals to tennis to make calls more accurate but allowing players a review only when balls are called “out” and not when a ball is called “in,” no matter how erroneous the call seems. For those in favor of longer rallies — which probably includes the broadcasters and advertisers — this is a win, because only those rallies cut short can be appealed. For those in favor of more accurate calls generally, not so much. Indeed, on a press call, Facebook made this bias toward leaving things up explicit: The limited ambit of operations to start is “due to the way our existing content moderation system works, and in line with Facebook’s commitment to free expression” (emphasis added). Maybe so, but it is a disappointing limitation and represents an uncharacteristically incremental approach from a company famous for “moving fast.” It is important to hold Facebook to its commitment that this will be changed in the near future.
It all feels rather like … an oversight.
If Facebook expands the board’s jurisdiction to include decisions to leave content up within the first few months of the board’s operation, I don’t think this omission is that big a deal. Much longer than that, though, and I’d say Facebook has a problem.
Then there’s the issue of how long Facebook might take to review a case. The wheels of justice aren’t known to spin particularly fast in any legitimate court, but it’s worth noting that this board has not been designed with rapid interventions in mind. Here’s Kurt Wagner at Bloomberg:
Facebook’s proposed Oversight Board, a group of people from outside the company who will determine whether controversial user posts violate the social network’s rules, could take months to make these decisions — indicating the panel won’t play a role in quickly stopping the viral spread of misinformation or abuse on the service.
Instead, the board will take on cases that “guide Facebook’s future decisions and policies,” the company wrote Tuesday in a blog post. “We expect the board to come to a case decision, and for Facebook to have acted on that decision, in approximately 90 days.” The company also said it could expedite some decisions in “exceptional circumstances,” and that those would be completed within 30 days, but could be done faster.
Now, if you are a new mom and a photo of you breastfeeding your child has been removed in error, and you appeal and the board decides to hear your case as part of a landmark decision about global nipple viewability standards, you can probably stand to wait three months for your answer. But if you are a business whose ads have been removed because they promote products that contain CBD oil, even though CBD oil is now widely legal, that three-month delay could mean the difference between life and death for your company.
Still, it’s worth noting that even this three-month process is far superior to the current system of justice, which involves filling out a form, sending it in, and praying. (The new system will also involve filling out a form, sending it in, and praying, but there is now a chance that an independent board will ask you to make your case more formally, consult with experts, and render a binding opinion in your favor.)
I’m glad that one of the world’s largest quasi-states has evolved to include a judicial system. It’s worth noting, though, that this system has been set up explicitly to redress the complaints of individual users. It won’t be asked to “fix Facebook” broadly — to make judgments in service of the health of the overall user base, or the world they inhabit. That remains at the sole discretion of the executive — Facebook’s CEO. And at a company where the CEO has majority control over voting shares, there is effectively no legislative branch.
Facebook is taking the boldest approach we’ve seen yet to establishing an independent mechanism of accountability for itself. But as the board prepares to name its members, it’s worth keeping our expectations in line with what they’ll actually be able to do.
Coming tomorrow: Facebook earnings.

The Ratio
Today in news that could affect public perception of the big tech platforms.
🔼 Trending up: Pinterest is banning misinformation about voting and the census ahead of the 2020 election. It’s the latest example of the company taking a harder stance against potentially dangerous online content than its peers.
🔽 Trending down: Amazon’s Ring doorbell doesn’t just let people surveil their neighbors — the company also uses the app to monitor its own customers. A new investigation found the app is packed with third-party trackers that send customer data to analytics and marketing companies.
Governing
President Trump’s re-election campaign is focusing most of its efforts on digital ads designed to capture data about potential voters. The Guardian did a deep dive on how that strategy works — and why it scares Democrats. Here’s Julia Carrie Wong:
An ad that ran after the impeachment inquiry began used images of Joseph Stalin, Fidel Castro and members of the fringe group the Revolutionary Communist party burning an American flag to suggest that Democratic candidates were “destroying American values”. “Only one man can stop this chaos,” the ad announces, before Trump appears under blue skies.
But it’s not just doom, gloom and fearmongering. Many of Trump’s ads are the kind of cheerful, patriotic marketing that one would expect from a discount furniture showroom’s Fourth of July sale. These ads may not be getting under the skin of Trump’s Democratic rivals, but they appear to be helping to drive the re-election campaign’s substantial fundraising.
Facebook ads are designed to induce online actions, and almost all of the Trump campaign’s ads are clearly intended to produce one of four: donating money, attending a rally, buying campaign merchandise, or providing the campaign with a user’s email address or mobile phone number.
Republicans have been warming up to Facebook as attacks from Democrats become more frequent. The trend could be a result of the social media giant hiring more Republican-leaning lobbyists and doubling down on free speech messaging. (It also just hired a former Fox News executive to head up its news video efforts.) (Christopher Stern / The Information)
Facebook continued to run an advertisement for a conservative media company disputing the link between climate change and Australia’s bushfires for days after the false claims were debunked by the platform’s own fact-checking partner. (Cameron Wilson / BuzzFeed)
Elizabeth Warren unveiled a new plan to fight disinformation. Her goals include implementing harsher laws against spreading misinformation with the purpose of suppressing voter turnout. Would that be constitutional? (Russell Brandom / The Verge)
Clearview AI, the facial recognition company that claims to have a database of over 3 billion photos, faces calls for bans and mounting legal threats following investigations from the New York Times and BuzzFeed. (Ryan Mac, Caroline Haskins and Logan McDonald / BuzzFeed)
Anti-vaxxers are finding new audiences on Instagram by using popular tags like #maga2020 and Joe Biden’s favorite slogan, “No malarkey!” The strategy has allowed them to evade detection from Facebook, which has been trying to block anti-vaxx hashtags since last year. (David Uberti / Vice)
Industry
Facebook’s Vice President of Engineering Jay Parikh announced he is leaving the company. Parikh was considered to be instrumental in creating the data center infrastructure on which Facebook built its many apps and services. Salvador Rodriguez at CNBC explains what’s next:
David Mortenson will replace Parikh in leading Facebook’s infrastructure organization, a company spokesman told CNBC. Mortenson has been at Facebook since 2011. Parikh’s other responsibilities will likely be divided among several engineering leaders, he added. […]
“A lot of what we’ve achieved over the past eleven years just wouldn’t have been possible without you,” said Facebook CEO Mark Zuckerberg in a comment on Parikh’s post. “I don’t think we even had a data center when you joined, and now we share our designs so the rest of the world can catch up!”
The new Off-Facebook Activity tool shows users just how much information the social media giant has on them. But the volume of data makes it almost impossible to derive meaning from. (Kaitlyn Tiffany / The Atlantic)
US colleges are trying to install location tracking apps on students’ phones. The technology is supposedly going to be used to track attendance. (Sean Hollister / The Verge)
The CEO of Match Group, Mandy Ginsberg, announced she is stepping down. Match is the parent company of dating services including Hinge and Tinder. (Dan Primack / Axios)
The BBC is still struggling to connect with younger audiences. A podcast aimed at under-25s has fallen flat, and the network continues to ignore newer formats like TikTok. Never ignore TikTok! (Sarah Manavis / The Guardian)
And finally ...
As we talked about yesterday, the coronavirus epidemic has led to the spread of misinformation around the web. I don’t know why any of you smart people would turn to the pro-Trump conspiracy army of Qanon for health tips here, but just in case — absolutely do not. Here’s Will Sommer at the Daily Beast:
The substance — dubbed “Miracle Mineral Solution” or “MMS” — has long been promoted by fringe groups as a combination miracle cure and vaccine for everything from autism to cancer and HIV/AIDS.
The Food and Drug Administration has repeatedly warned consumers not to drink MMS, last year calling it effectively a “dangerous bleach” that could cause “severe vomiting” and “acute liver failure.” But those warnings haven’t stopped QAnon devotees—who believe in a world where Donald Trump is at war with a shadowy deep-state “cabal”—from promoting a lethal substance as a salve for a health crisis that speaks to the darkest recesses of fringe thought.
So there you have it: don’t drink bleach. Tell all your friends!
Talk to us
Send us tips, comments, questions, and your nominations for Facebook’s legislative branch: casey@theverge.com and zoe@theverge.com.