YouTube gives out a free pass to harass

June 5 · Issue #338
The Interface
Last summer, during tech platforms’ long period of indecision over what to do about Alex Jones, I wondered why their community standards so often seemed to favor the bully over his victims. Algorithms built by YouTube, Facebook, and Twitter all worked to recruit new followers for Jones, but when some of those followers then forced parents of the victims of the Sandy Hook Elementary shooting to go into hiding, the platforms offered no support.
That’s why I was heartened on Tuesday to learn that YouTube is changing its policies to make victims of Sandy Hook, 9/11, and other historical tragedies a protected class. That means YouTubers will no longer be able to upload videos denying that these historical events happened. That reduces the likelihood that followers of conspiracy peddlers like Jones will be recruited into his hoax and begin harassing victims.
The move is part of a broader expansion of the service’s hate speech policies, which will also prohibit videos promoting Nazism and other supremacist ideologies for the first time. It will likely result in the removal of thousands of channels — some of which are trying to document white supremacy for journalistic or academic purposes — and I wrote about it in some detail today for The Verge.
But if you’ve been following platform news today, YouTube’s expanded hate speech policy isn’t likely the news you’ve been reading about. Rather, you’ve likely been reading about Steven Crowder and Carlos Maza — and YouTube’s clumsy efforts to address a controversy that calls into question its commitment to enforcing its policies against harassment.
(If you’re familiar with the rough timeline here, feel free to skip ahead. But I do want to walk through the twists and turns here, because collectively they illustrate the difficulty even the most venerable social networks have in creating and enforcing their community standards.)
At issue is a series of response videos that Crowder, a conservative pundit, has made to Maza’s popular YouTube series “Strikethrough.” Maza’s videos often analyze cases of journalistic malpractice among conservative cable news hosts, and for the past two years Crowder has responded with a series of racist and homophobic slurs. (Until today he also sold “Socialism is for F*gs” T-shirts on a page linked to his YouTube channel, in which the vowel is replaced with an image of a fig that isn’t fooling anyone.)
Maza created a supercut of the harassment he has received from Crowder, drawing mainstream media attention. On Tuesday night, after investigating Maza’s claims, YouTube posted a curt four-part reply on Twitter saying that “while we found language that was clearly hurtful, the videos as posted don’t violate our policies.”
Outrage followed, in part because YouTube did not present a public rationale for its decision. It did, however, provide one on background to journalists, which Gizmodo helpfully published. The gist: while Crowder did use offensive language, the overall thrust of his videos was to respond to Maza’s media criticism, and so YouTube considered it fair game. To YouTube, this appeared to be a case of creators badmouthing one another, which happens all the time. That Maza has been doxxed and threatened appears not to have figured into the discussion.
On Wednesday, everything got a little weirder. In a set of follow-up tweets, YouTube told Maza that it had demonetized Crowder’s channel, pending the removal of a link to his anti-gay T-shirt. This led to more outrage — the homophobia is fine, but the merch sales are a step too far? — until YouTube posted another follow-up, saying Crowder would have to address other unspecified “issues” with his account to get it back in good standing.
All of which offers little consolation to Maza. But if you squint — and are willing to give a monolithic corporation the benefit of the doubt, in a spirit of extreme generosity — you can intuit the existence of a group of frustrated YouTube employees gradually pushing their employer to do the slightly less bad thing. In fact, as Caroline O'Donovan reported in BuzzFeed, more than 100 Googlers have signed a petition demanding that the company remove Pride branding from its social media accounts. (June, after all, is Pride month.)
A lot of chatter on Twitter (and in Vox Media Slack rooms) today has argued that this is an easy call. A schoolyard bully has been calling a guy names for years, on a platform that purports to ban “content that makes hurtful and negative personal comments/videos about another person.” If repeatedly calling someone a “lispy queer” is not a hurtful and negative personal comment, what is?
At the same time, it’s clear that in this case at least, YouTube does not believe it can enforce that particular rule against this particular creator. So what else might the company do?
That brings me back to Alex Jones.
One, Jones, like Crowder, was typically careful not to make explicit calls for his followers to harass people. But they did so anyway, and the result — that parents of murdered children can no longer regularly visit their child’s grave — shocked our conscience.
Two, the effect of Jones’ harassment grew as his channel grew — something YouTube helped him do by recruiting new subscribers for him through recommendations. As Jones’ base grew into the millions, his videos carried with them an ever-growing risk of inspiring real-world violence.
Three, YouTube (and other platforms) made inconsistent, sometimes baffling statements about Jones before banning him. It left open the question of precisely when YouTube would enforce its standards against harassment, in ways that continue to haunt the service today.
So, here’s what I’d like to see YouTube do.
One, the definition of incitement should change to hold creators accountable when they dog whistle at their followers in a way that inspires harassment. Crowder has gone into overdrive making videos about his impending ban from YouTube at Maza’s hands, warning his millions of subscribers that corporate media is about to crush them and stifle all free speech. It’s no wonder that harassment of Maza has only increased since then, proliferating across social networks.
But imagine if YouTube held creators accountable for what their followers did, whatever borderline language they used in an effort to get them to do it. Telling a creator that you’re suspending them because their followers harassed and doxxed someone they made a video about might encourage better creator behavior all around. Two things I like about this approach: one, it emphasizes getting the outcomes right, rather than the rules. And two, outcomes are generally easier to evaluate than intent — you either were doxxed, or you weren’t.
Two, YouTube should explicitly apply different standards to creators with millions of subscribers. A random vlogger with 100 followers railing against a corporate YouTube channel is barely of passing interest to me. But a creator with millions of followers has great power — and Spider-Man taught us long ago what ought to come with that. YouTube already holds its biggest creators to a different standard with its partner program, which describes the (opaque, byzantine, ever-changing) rules by which they are allowed to earn revenue. I see no reason why stars with enormous followings shouldn’t be held to a higher standard than the creative proletariat.
Finally, YouTube and its top executives need to start having all of these arguments with us in public. At the moment, it is a company terrified of its own user base — the company’s blog post today announcing a hate speech ban was unsigned, and reporters were asked to withhold the names of the executives they interviewed about it.
I fully support this, from a security standpoint. But it has proven awful from a communications standpoint. The company’s terse tweeted responses to impassioned pleas from its creators have left everyone mystified about what, if anything, the company stands for, aside from minimizing bad press. At least the background statement it provided to Gizmodo about Crowder offered something resembling a rationale. Here’s hoping the company starts putting those background statements on the record — because until YouTube starts standing behind its decisions with arguments rather than platitudes, we are doomed to spend eternity talking past each other.
I’m under no illusion that these policy changes would eliminate harassment on YouTube. But I do think they would go a long way toward rooting the company’s policies in durable principles. And principles are something that YouTube lately has had far too much trouble demonstrating.

Pushback
I made a dumb mistake in the email version of yesterday’s newsletter, in which I said that “Sign in with Apple” is mandatory for all developers. I should have added that it’s mandatory for all developers who enable third-party logins. So if you offer your own login tool but not a similar one from Google, Facebook, Twitter, or Snap, you’re in the clear.
But if you do enable third-party logins, not only do you have to include Apple’s, but you have to put the Apple button on top. Hilarious detail from Stephen Nellis that broke after I put the newsletter to bed Tuesday:
The move to give Apple prime placement is significant because users often select the default or top option on apps. And Apple will require apps to offer its button if they want to offer options to login with Facebook or Google. […]
Apple’s suggestion to developers to place its login button above rival buttons is part of its “Human Interface Guidelines,” which are not formal requirements to pass App Store review. But many developers believe that following them is the surest way to gain approval.
Democracy
Why Washington seems ready to regulate Amazon, Google, and other Big Tech companies
Amazon may soon face an antitrust probe. Here are 3 questions the FTC is asking about it.
Antitrust Reviews May Force Big Tech to Rethink ($)
AI experts draft algorithmic bill of rights to protect us from Big Tech
Cops Across The US Have Been Exposed Posting Racist And Violent Things On Facebook. Here's The Proof.
Elsewhere
Tesla Is Blocking Its Employees From Accessing an Anonymous Social Network for Workplace Complaints
Google and Facebook sucked profits from newspapers. Publishers are finally resisting.
Why Strangers Are AirDropping You Memes and Photos
Launches
Skype screen sharing comes to Android and iOS | VentureBeat
Supporting Subscriptions-Based News Publishers
Takes
Three threads worth reading today.
Julian Sanchez
A few thoughts (THREAD INCOMING) on the Maza/Crowder/YouTube debate: I think this ends up being tricky for companies in large part because they aspire to uniform rules, but the harms of abusive speech in a social network ecology vary wildly depending on the speaker’s audience. https://t.co/QXV74eMaBa
10:17 AM - 5 Jun 2019
Alex Stamos
The core of YouTube's problem is that they are acting in a quasi-governmental manner, regulating the speech of hundreds of millions without the trappings of transparency and precedent we've reasonably come to expect with such power. https://t.co/U1L69lBUh6
2:37 PM - 5 Jun 2019
sarah jeong
i honestly do not think anyone really knows how youtube should be run, but also that Actual Youtube doesn't crack into the top 10 who come close to knowing what the right thing to do is https://t.co/QIUIdliSLK
1:06 PM - 5 Jun 2019
And finally ...
Facebook's new NYC building is apparently called Penn15
Talk to me
Send me tips, comments, questions, and a cleverly worded dog whistle that will offend me without endangering monetization on your YouTube channel: casey@theverge.com.