Last summer, during tech platforms’ long period of indecision over what to do about Alex Jones, I wondered why their community standards
so often seemed to favor the bully over his victims. Algorithms built by YouTube, Facebook, and Twitter all worked to recruit new followers for Jones, but when some of those followers then forced parents of the victims of the Sandy Hook Elementary shooting to
go into hiding, the platforms offered no support.
That’s why I was heartened on Tuesday to learn that YouTube is changing its policies to make victims of Sandy Hook, 9/11, and other historical tragedies
a protected class. That means YouTubers will no longer be able to upload videos denying that these historical events happened. That reduces the likelihood that followers of conspiracy peddlers like Jones will be drawn into their hoaxes and begin harassing victims.
The move is part of a broader expansion of the service’s hate speech policies, which will also prohibit videos promoting Nazism and other supremacist ideologies for the first time. It will likely result in the removal of thousands of channels — some of which are trying to document white supremacy for journalistic
or academic purposes — and I
wrote about it in some detail today for The Verge.
But if you’ve been following platform news today, YouTube’s expanded hate speech policy likely isn’t the news you’ve been reading about. Rather, you’ve likely been reading about
Steven Crowder and Carlos Maza — and YouTube’s clumsy efforts to address a controversy that calls into question its commitment to enforcing its policies against harassment.
(If you’re familiar with the rough timeline here, feel free to skip ahead. But I do want to walk through the twists and turns here, because collectively they illustrate the difficulty even the most venerable social networks have in creating and enforcing their community standards.)
At issue is a series of response videos that Crowder, a conservative pundit, has made to Maza’s popular YouTube series “Strikethrough.” Maza’s videos often analyze cases of journalistic malpractice among conservative cable news hosts, and for the past two years Crowder has responded with a series of racist and homophobic slurs. (Until today he also sold “
Socialism is for F*gs” T-shirts on a page linked to his YouTube channel, in which the vowel is replaced with an image of a fig that isn’t fooling anyone.)
Outrage followed YouTube’s announcement that Crowder’s videos did not violate its policies, in part because the company did not present a public rationale for the decision. It did, however, provide one on background to journalists, which
Gizmodo helpfully published. The gist: while Crowder did use offensive language, the overall thrust of his videos was to respond to Maza’s media criticism, and so YouTube considered it fair game. To YouTube, this appeared to be a case of creators badmouthing one another, which happens all the time. That Maza has been doxxed and threatened appears not to have figured into the discussion.
All of which offers little consolation to Maza. But if you squint — and are willing to give a monolithic corporation the benefit of the doubt, in a spirit of extreme generosity — you can intuit the existence of a group of frustrated YouTube employees gradually pushing their employer to do the slightly less bad thing. In fact, as Caroline O'Donovan reported in
BuzzFeed, more than 100 Googlers have signed a petition
demanding that the company remove Pride branding from its social media accounts. (June, after all, is Pride month.)
At the same time, it’s clear that in this case at least, YouTube does not believe it can enforce that particular rule against this particular creator. So what else might the company do?
That brings me back to Alex Jones.
One, Jones, like Crowder, was typically careful not to make explicit calls for his followers to harass people. But they did so anyway, and the result — that parents of murdered children can no longer regularly visit their children’s graves — shocked our conscience.
Two, the effect of Jones’ harassment grew as his channel grew — something YouTube helped him do by recruiting new subscribers for him through recommendations. As Jones’ base grew into the millions, his videos carried with them an ever-growing risk of inspiring real-world violence.
Three, YouTube (and other platforms) made
inconsistent, sometimes baffling statements about Jones before
banning him. This left open the question of precisely when YouTube would enforce its standards against harassment, in ways that continue to haunt the service today.
So, here’s what I’d like to see YouTube do.
One, the definition of incitement should change to hold creators accountable when they dog whistle at their followers in a way that inspires harassment. Crowder has gone into overdrive making videos about his impending ban from YouTube at Maza’s hands, warning his millions of subscribers that corporate media is about to crush them and stifle all free speech. It’s no wonder that harassment of Maza has only increased since then, proliferating across social networks.
But imagine if YouTube held creators accountable for what their followers did, regardless of the borderline language they used to spur their followers on. Telling a creator that you’re suspending them because their followers harassed and doxxed someone they made a video about might encourage better creator behavior all around. Two things I like about this approach: one, it emphasizes getting the outcomes right, rather than the rules. And two, outcomes are generally easier to evaluate than intent — you either were doxxed, or you weren’t.
Two, YouTube should explicitly apply different standards to creators with millions of subscribers. A random vlogger with 100 followers railing against a corporate YouTube channel is barely of passing interest to me. But a creator with millions of followers has great power — and Spider-Man taught us long ago what ought to come with that. YouTube already holds its biggest creators to a different standard with its
partner program, which describes the (opaque, byzantine, ever-changing) rules by which they are allowed to earn revenue. I see no reason why stars with enormous followings shouldn’t be held to a higher standard than the creative proletariat.
Finally, YouTube and its top executives need to start having all of these arguments with us in public. At the moment, it is a company terrified of its own user base — the company’s blog post today announcing a hate speech ban was unsigned, and reporters were asked to withhold the names of the executives they interviewed about it.
I fully support this, from a security standpoint. But it has proven awful from a communications standpoint. The company’s terse tweeted responses to impassioned pleas from its creators have left everyone mystified about what, if anything, the company stands for, aside from minimizing bad press. At least the background statement it provided to Gizmodo about Crowder offered something resembling a rationale. Here’s hoping the company starts putting those background statements on the record — because until YouTube starts standing behind its decisions with arguments rather than platitudes, we are doomed to spend eternity talking past each other.
I’m under no illusion that these policy changes would eliminate YouTube’s harassment problem. But I do think they would go a long way toward rooting its policies in durable principles. And principles are something that YouTube lately has had far too much trouble demonstrating.