I recently shared two screenshots (above) that show my own current moderation spectrum as well as the top “power mod” accounts who moderate the most users on Reddit. I’ve left a lot of subs lately for all the reasons mentioned above and in this (Protocol) article reflecting on why I did it. You can see I’m currently moderating only 5-10 active subs, compared to the hundreds I previously moderated. The site is not secure enough to do volunteer free labour at the expense of one’s own peace of mind and physical security. It just doesn’t add up anymore. As a moderator on Reddit, what you do goes unseen and under-appreciated by 110% of the audience. You have to clean up spam, NSFW content, and borderline criminal material spanning a varied spectrum of horrors, so that no one else is exposed to it: not our own audience, not Reddit’s general one, and not the advertisers coming to Reddit to buy native ads or take part in the conversations.
Moderating a community should mean engaging your audience and creating events and experiences around their interests. A moderator should be a manager rather than a janitor. But, as it stands, we mostly have to clean up messes, and we rarely get to engage our audience directly or positively. Mod-hate on Reddit is the most ardent it has been in years, due to misinformation and astroturfing running rampant on the platform. Moderation on Reddit needs to be reassessed, to say the least.
Q You were recently referenced in an article about Reddit’s PowerMods - do you think power is concentrated in too few people’s hands?
Moderation on Reddit can easily be abused, as it has been in the past; for example, the top mods of r/natureisfuckinglit tried to manipulate the Reddit algorithm to their benefit and get paid for certain promotions on their now-massive subreddit. They were replaced by new mods and their accounts were suspended, but we don’t know exactly what went down, since Reddit never addresses these actions beyond a static “thank you for reporting” message. One fix could be limiting the number of subs/users a single mod can moderate, since the general audience fears a monopoly of communities by a few users who could potentially be biased in their moderation against a socio-political opposition. Honestly speaking, it’s not up to us mods to fix Reddit; we’re supposed to be volunteers who are here in good faith because we love the platform and enjoy allocating personal time to these communities because we care. As it stands today, not a single co-mod can say it isn’t affecting their peace of mind, and while they might all have varied opinions on what needs to be done to “fix” Reddit, that tells you the current system could crumble at any time. Solutions need to be provided by Reddit’s administration, and we will align with those accordingly or step down if we disagree with what they are doing (or failing to do).
Q Is it realistic in the long-term that moderating communities — on Reddit, Discord, Slack, wherever — is done by volunteers in their spare time?
In essence, volunteer moderation is free labour, and I have a hard time accepting that the current volunteer system on Reddit is anything short of broken. There are massive pros to volunteer moderation: for instance, it cannot (in theory) be as biased as a paid third party or internal team can be when it comes to enforcing your ToS (terms of service) and sub-specific rules. Take Facebook, for example: a lot of its bias is documented, and all of it comes from paid moderation that can be instructed to enforce or ignore certain cases depending on whether they align with the company’s supposed vision. Reddit is a curation platform, and it’s good to have volunteer moderation on one of the most engaged platforms worldwide, but the way it’s currently set up leaves both the mods and the community exposed to a lot of cracks. All it takes is the administration revisiting it and updating it according to today’s site-wide struggles.