
An ex-YouTube engineer has a warning for democracy

February 2 · Issue #75
The Interface
This summer, I wrote a cheerful piece called How YouTube perfected the feed. In the dark months following the 2016 election, I found myself spending ever more time on the site, guided by its uncannily good recommendations. YouTube has always been a place for me to explore my more visual niche interests: cooking, video games, pro wrestling, and live music, among others. I reached out to YouTube with some questions, and spent a day on campus talking with the people responsible for making its recommendations so precisely tuned to my interests.
What didn’t occur to me then, and should have, is how the same machine-learning technology that enables those pitch-perfect recommendations could be gamed to push a more toxic agenda. In a must-read report in the Guardian today, Paul Lewis introduces us to Guillaume Chaslot, a former YouTube engineer who worked on those recommendations — and now says they promote sensationalist, false, and conservative-leaning videos at the expense of more moderate ones.
During the three years he worked at Google, he was placed for several months with a team of YouTube engineers working on the recommendation system. The experience led him to conclude that the priorities YouTube gives its algorithms are dangerously skewed.
“YouTube is something that looks like reality, but it is distorted to make you spend more time online,” he tells me when we meet in Berkeley, California. “The recommendation algorithm is not optimising for what is truthful, or balanced, or healthy for democracy.”
Chaslot explains that the algorithm never stays the same. It is constantly changing the weight it gives to different signals: the viewing patterns of a user, for example, or the length of time a video is watched before someone clicks away.
Chaslot built a website, AlgoTransparency, that tracks top YouTube recommendations across a variety of controversial categories. Here’s how it works:
It finds videos through a word search, selecting a “seed” video to begin with, and recording several layers of videos that YouTube recommends in the “up next” column. It does so with no viewing history, ensuring the videos being detected are YouTube’s generic recommendations, rather than videos personalised to a user. And it repeats the process thousands of times, accumulating layers of data about YouTube recommendations to build up a picture of the algorithm’s preferences.
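To make that process concrete, here is a minimal sketch of the kind of crawl the Guardian describes, written in Python. It is not Chaslot's actual code, and the get_up_next helper is a hypothetical placeholder for however the logged-out "up next" suggestions are fetched; the point is the shape of the method: start from a seed video, follow the recommendations layer by layer, and tally what comes back.

from collections import deque

def get_up_next(video_id, count=5):
    # Hypothetical placeholder: return the top `count` "up next" video IDs
    # for a video, fetched with no viewing history attached, so the results
    # are YouTube's generic recommendations rather than personalized ones.
    raise NotImplementedError

def crawl_recommendations(seed_video_id, depth=3, per_video=5):
    # Breadth-first walk of the "up next" chain, recording which videos
    # YouTube recommends at each layer away from the seed.
    recommendations = {}  # video_id -> list of recommended video IDs
    queue = deque([(seed_video_id, 0)])
    seen = {seed_video_id}
    while queue:
        video_id, layer = queue.popleft()
        if layer >= depth:
            continue
        up_next = get_up_next(video_id, count=per_video)
        recommendations[video_id] = up_next
        for rec in up_next:
            if rec not in seen:
                seen.add(rec)
                queue.append((rec, layer + 1))
    return recommendations

Repeated from thousands of seed videos, the interesting output is simply the frequency counts: which videos keep getting recommended no matter where you start.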
The results will be familiar to anyone who has noted the regular appearance of conspiracy theories in Facebook’s recommended news articles:
When his program found a seed video by searching the query “who is Michelle Obama?” and then followed the chain of “up next” suggestions, for example, most of the recommended videos said she “is a man”. More than 80% of the YouTube-recommended videos about the pope detected by his program described the Catholic leader as “evil”, “satanic”, or “the anti-Christ”. There were literally millions of videos uploaded to YouTube to satiate the algorithm’s appetite for content claiming the earth is flat. “On YouTube, fiction is outperforming reality,” Chaslot says.
He believes one of the most shocking examples was detected by his program in the run-up to the 2016 presidential election. As he observed in a short, largely unnoticed blogpost published after Donald Trump was elected, the impact of YouTube’s recommendation algorithm was not neutral during the presidential race: it was pushing videos that were, in the main, helpful to Trump and damaging to Hillary Clinton. “It was strange,” he explains to me. “Wherever you started, whether it was from a Trump search or a Clinton search, the recommendation algorithm was much more likely to push you in a pro-Trump direction.”
YouTube disputes Chaslot’s methodology and notes that it has added new signals to its ranking algorithms since Chaslot left the company. Among them is “satisfaction,” a measure of whether viewers enjoyed the video they just watched. But to the extent that the algorithm leans on satisfaction, it’s one more place where a platform that tells us what we want to hear can push us in ever more polarized directions.
Still, this is another case where explainability would help build trust in the platform. Why does YouTube show you the recommendations it does? As we learn more about the consequences of its algorithm, the question feels increasingly urgent.  

Democracy
YouTube Takes Aim at Conspiracies, Propaganda
YouTube working on stricter policies to punish creators who do ‘significant harm’ to the community
Big tech’s bid to control FOIA
One in Four Americans Believe Facebook is Having a Negative Impact on Society
Facebook strikes back against the group sabotaging Black Panther’s Rotten Tomatoes rating
Elsewhere
HQ Trivia is raising $15 million at a valuation of more than $100 million from Founders Fund
More than half of Facebook Instant Articles partners may have abandoned it
Understanding Ugandan Knuckles in a post-Pepe the Frog world
The New Dating Requirement: Consuming All of Your Partner’s #Content
WhatsApp is now Facebook’s second-biggest property, followed by Messenger and Instagram
Logan Paul tells Good Morning America his controversial video was intended to show the ‘harsh realities’ of suicide
Meet the fiercely loyal middle-aged Logan Paul fans
Launches
Snapchat is now selling merchandise in its app
Spotify Now Displays Songwriter Credits
Takes
Tobacco, Tech and Ulterior Motives
And finally ...
A Brief History of ‘Jeopardy!’ Contestants Knowing Nothing About Sports
Talk to me
Questions? Comments? YouTube recommendations? casey@theverge.com 