Why the News Feed should explain itself

January 26 · Issue #70
The Interface
One reason Facebook struggles to earn our trust is that, at the individual level, no one at the company can tell us why we’re seeing what we’re seeing in the News Feed. It can talk about the content of the feed in general terms — mostly posts from friends and family, ranked by how close Facebook believes you are to them — but were an engineer to browse your feed alongside you, they couldn’t explain why the posts appeared in the exact order they did.
A few years ago I was interviewing Chris Cox, who leads product across the company, and asked something about my feed I had always wanted to know. Sometimes I would open Facebook after being away for an hour or so and the News Feed would show me one or two posts I had already seen. Was that an effort to get me to add a comment? Did Facebook think I’d be more likely to share something after I saw it a second time? No, Cox said. That was just a bug.
The conversation stuck with me for two reasons. One, we talk about Facebook primarily in the context of its power, and the bug was a good reminder that the News Feed is just a flawed piece of software like any other. Two, it was one of the few times I could remember hearing something definitive about the content of my own News Feed.
I thought of that conversation again this week while reading the venture capitalist Fred Wilson’s post about “explainability.” Wilson starts seeing a bunch of items about Kendrick Lamar in the feed of content that appears underneath the Google search bar, and wonders why. 
That leads him to an AI startup named Bonsai, which attempts to build systems that can ultimately explain their decisions to users. Bonsai writes:
Explainability is about trust. It’s important to know why our self-driving car decided to slam on the brakes, or maybe in the future why the IRS auto-audit bots decide it’s your turn. Good or bad decision, it’s important to have visibility into how they were made, so that we can bring the human expectation more in line with how the algorithm actually behaves.
Wilson thinks about how this might ultimately manifest itself in a consumer product:
What I want on my phone, on my computer, in Alexa, and everywhere that machine learning touches me, is a “why” button I can push (or speak) to know why I got that recommendation. I want to know what source data was used to make the recommendation, and I’d also like to know what algorithms were used to produce confidence in it.
It’s time to start a conversation about explainability at Facebook. Why did that highly partisan article appear in your News Feed? Why do you see every post about breakfast from a random acquaintance but not the new baby of your college roommate? Why are you seeing this ad in your feed just minutes after you talked about it with a friend in real life?
Answering the “why” question would be an enormous technical challenge for Facebook. But solving it could go a long way in establishing trust with users. As the company continues to beat the drum about its work in artificial intelligence, explainability should be an important part of the conversation. 
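To make the idea concrete, here is a minimal sketch of what Wilson’s “why” button could return for a single feed item — not Facebook’s actual system; the signals, weights, and scoring formula below are invented for illustration. If ranking were a simple weighted score, the explanation would just be the per-signal contributions to that score, sorted by size.

```python
# Hypothetical sketch: a feed ranker that can explain its own ordering.
# The signals, weights, and scoring formula are invented for illustration;
# a real ranking system would be far more complex.

WEIGHTS = {
    "closeness_to_author": 3.0,   # how often you interact with this person
    "recency_hours": -0.5,        # older posts score lower
    "comment_count": 0.8,
    "is_video": 0.3,
}

def score(post: dict) -> float:
    """Weighted sum of the post's signals."""
    return sum(WEIGHTS[k] * post.get(k, 0.0) for k in WEIGHTS)

def explain(post: dict, top_n: int = 3) -> list[str]:
    """Return the signals that contributed most to this post's score."""
    contributions = {k: WEIGHTS[k] * post.get(k, 0.0) for k in WEIGHTS}
    ranked = sorted(contributions.items(), key=lambda kv: abs(kv[1]), reverse=True)
    return [f"{name}: {value:+.2f}" for name, value in ranked[:top_n]]

if __name__ == "__main__":
    post = {"closeness_to_author": 0.9, "recency_hours": 6, "comment_count": 12, "is_video": 1}
    print("score:", round(score(post), 2))
    print("why:", explain(post))
```

The hard part, of course, is that the real News Feed is ranked by learned models whose internal signals are nowhere near this legible — which is exactly the gap Bonsai and Wilson are pointing at.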

Democracy
Robert Mueller's Russia Investigation Includes at Least One Facebook Employee Interview
This Is Who's Hacking Right Wing Twitter
Google is testing Bulletin, an app that would let anyone publish a news story.
Is There Something Wrong With Democracy?
Elsewhere
The Dirty War Over Diversity Inside Google
These publications have the most to lose from Facebook’s new algorithm changes
Launches
Facebook launches new program to lure Twitch-style game streamers
Introducing The Facebook Journalism Project Scholarship
Takes
Tech chief compares Facebook to cigarettes, urges government regulation
Jonah Peretti: Everything is fine
Panicked about Kids’ Addiction to Tech?
Facebook’s trust survey isn’t too short — but it is written badly
And finally ...
The best memes are nonsense and I love ‘karma is a bitch’
Talk to me
Questions? Comments? Explanations of your behavior? casey@theverge.com 