
Everything in Moderation
🆕 Chinese rules, applied globally
Hello everyone. It’s Friday (great) and I’m arriving late in your inboxes (not so great).
Each week, I try to flag unmissable stuff up top (often podcasts, occasionally videos) and this week I want to share this reading list compiled by the Social Media Collective (a network of moderation rockstar academics). It’s extensive, almost overwhelming, but hugely useful. Bookmark it now.
Thanks for reading,

The ugly side of TikTok
Establishing a sensible, coherent approach to content moderation on the web was always going to be tough. But it’s an altogether different proposition when you factor in TikTok.
This hit home after reading recent scoops from the Guardian’s Alex Hern. If you missed them, the Chinese-owned social network routinely downranks and deletes videos featuring political content, such as Tiananmen Square, and pro-LGBT content, even in countries where being gay is not illegal (e.g. Turkey). It’s essentially applying Chinese standards globally.
TikTok insisted in a statement that it retired these guidelines in May (perhaps on the advice of its newly in-post Director of Global Public Policy), but it is anyone’s guess what rules are being applied to content on the platform in any given country.
The company, headquartered in Beijing, also does not have a good record of being particularly transparent or of responding well to criticism (see the case of YouTuber PaymoneyWubby). That’s unlikely to change anytime soon.
And so, just as one social media giant starts to open itself up to scrutiny, another one emerges with politicised policies and opaque processes but with no news media to hold it to account or force it to change.
When it comes to TikTok, it’s hard to be hopeful.
It’s all semantics
Benedict Evans
The problem with content moderation is that it’s easy to say ‘take bad stuff down’ but we don’t really have a consensus on what ‘take it down’ means, nor on what ‘bad stuff’ means, nor on who should decide that.
In a Twitter thread, Andreessen Horowitz’s Benedict Evans suggests good moderation relies on good definitions. And definitions are hard.
Not forgetting...
I came close to writing about this story this week but sometimes there’s only so much Facebook one can stand. Even so, Big Blue’s former head of content standards railing against the recent politicians/newsworthiness ruling is worth a read.
The Guy Who Wrote Facebook's Content Rules Says Its Politician Hate Speech Exemption Is 'Cowardice' | WIRED
An in-depth read via Vice on how Facebook doubled down on intercepting terrorist content on its platform after the Christchurch attack.
Facebook Went to War Against White Supremacist Terror After Christchurch. Will It Work? - VICE
Further to last week’s EiM, a female Twitch user lost her stall at TwitchCon because some dudes maliciously flagged it for sexually explicit activity.
On Twitch, women who stream say their biggest obstacle is harassment
Folks, you really know what’s what: Twitch’s CEO believes content moderation is ‘the issue of our time’. (Has anyone got his email addy? I think he’d like EiM)
Twitch's Emmett Shear on Streaming Talent Wars, Moderation Plans | Hollywood Reporter
This is kinda fun. Cyber lawyer Daphne Keller had her husband flag a picture that she posted on Facebook of a naked Rodin statue. You can guess what happened.
That time my husband reported me to the Facebook police: a case study / Boing Boing
Reddit’s COO Jen Wong talks to Yahoo about the company’s ‘layered approach’ to moderation.
How Reddit avoids content moderation woes of Facebook, Twitter and YouTube
If you like what you’re reading, why not get me a Ko-Fi :) Thanks for reading! - @benwhitelaw
Ben from Everything in Moderation

A weekly newsletter about the policies, products, platforms and people shaping content moderation, now and in the future.

If you were forwarded this newsletter and you like it, you can subscribe here.