i. Community guidelines, content policies, moderation rules: the names that digital services use for their standards are many and varied. And so is the way that these policies are applied around the world.
We see this with Facebook in Ethiopia, where Vice News reported this week that a failure to stop and remove genocidal attacks on the platform has led directly to political violence and killings. Facebook’s community standards, the piece reveals, are not available in Amharic or Oromo (Ethiopia’s two main languages), and only 100 outsourced staff are assigned to the whole continent.
A similarly bleak report came out of Sudan recently, where one activist has struggled repeatedly to get Instagram to remove pictures of women taken without their permission.
This two-tier enforcement system is creating conditions similar to those we saw in Myanmar, where hate spread unchecked and violence erupted against Rohingya Muslims. All sadly familiar.
ii. Not all policies are intended for the public en masse; some are about marshalling discussion among staff. So it was interesting to see that both Alphabet-owned Google and Facebook issued clarifications this week to the moderation guidelines for their internal communication tools.
The increase in fraught internal discussion has been put down to the rise of remote working and the more active airing of emotionally charged topics. I wonder how many other companies are seeing, or would admit to, a similar trend.