Last week, YouTube CEO Susan Wojcicki wrote a blog post in which she talked about the platform’s commitment to leaving up controversial videos even when they are offensive. This week, the company posted a new message about the videos it has decided to take down, and, YouTube says, it’s taking down many more videos than it ever has before.
Those numbers are approximately five times as many as in the previous quarter, according to a new blog post from YouTube about the company’s attempts to tackle a growing number of hateful and dangerous videos on the platform. The company also doubled its removal of comments, taking down more than 500 million that were found to be hateful. Some of these channels, videos, and comments are old and were terminated following the policy change, according to the blog post, which could account for the spike in removal numbers.
As usual when tech companies announce platform-scale statistics, the numbers involved can boggle the mind.
When you hear that YouTube removed 100,000 videos, does that sound like a lot, or a little?
Do the 17,000 removed channels represent the core of YouTube’s problems with extremism and hate speech, or are they the tip of the iceberg?
Does YouTube removing 500 million comments indicate a spectacular advance in moderation technology, or had it previously been ignoring a lot of low-hanging fruit?
These questions are all but impossible to answer, because as outsiders we have very little knowledge of what’s on the platform. We find out about enforcement actions only after the fact. In the meantime, we search, we see what’s trending, and we speculate — some of us in good faith, and others not.
But after a summer of cascading PR crises, YouTube is keen to convey the sense of a steady hand at the wheel. Today’s blog post is the first of a planned four-part series on “responsibility,” which the company has divided into four (other) R’s. (Coming up after “remove” are “raise,” “reward,” and “reduce.”)
I can’t help but feel like there’s a fifth R missing from this list: “review,” maybe, or “recourse.” A common theme in YouTube’s bad summer has been that human beings caught up in the platform’s machinery often have no good mechanism for appeal.
If you’re a creator, you can file an appeal, which, for all but the largest accounts, means filling out an online form and praying. (Some LGBT creators are suing YouTube over demonetization issues, but the case seems unlikely to advance very far.)
At this stage in its evolution, YouTube in some ways resembles a nation-state. But it lacks one of a state’s most essential features: a legitimate justice system. There is almost no way in which, on a decision-by-decision basis, the hard-working folks at YouTube are truly accountable. And it is for that reason that, no matter how many times YouTube tells us it has removed a large number of videos from the site, its enforcement decisions will struggle to seem legitimate.
Facebook’s answer to this issue is its still-forthcoming independent oversight board, which will attempt to devolve some of its power to a group of outsiders. In recent discussions with Googlers, I’ve learned that the company is watching its rival’s effort closely but has stopped well short of spinning up a similar program.
There are lots of reasons for Google to let Facebook take the lead here, of course. But it strikes me that before you decide to give away some of your power to outsiders, you first have to come to terms with how powerful you are. And I wonder, at this late date in 2019, whether YouTube truly has.