Content Moderation Overview

Published: March 5, 2022

Updated: March 5, 2022

By Rick Lamb in Community

Content Moderation 101: How to Keep Your Community Healthy.

What is content moderation?

Content moderation is the filtering of online content to prevent the appearance of unwanted material on a platform.

Some form of monitoring and analysis is applied to users’ posts, comments, video uploads, and other user-generated content. A determination then has to be made as to whether each item follows the platform’s content moderation guidelines.

An item might fail because it is spam, NSFW content, or an outright violation of a country’s laws, such as a copyright infringement.

The meaning of content moderation is extremely broad and must be seen in context. Moderating a car enthusiast forum is far more straightforward than moderating the hugely popular accounts of famous people on Twitter.

What does a content moderator do?

A content moderator is a person responsible for applying a platform’s rules: removing posts, suspending users, and ultimately banning them.

In addition, a content moderator’s job description might also include commenting on posts to ask for additional information or changes, rather than simply removing them. For instance, a moderator might ask a user for the steps to reproduce an issue they are reporting.

Content moderation jobs are becoming increasingly common, often with the benefit of flexible remote work for the moderator. But, in case it is not obvious, the job does mean that a human content moderator will see all of the unmoderated content. Mental health is a serious consideration for anyone contemplating this kind of work.


Why is content moderation important?

Content moderation is important to ensure the experience of all users is a positive one. Nobody likes to wade through spam to find the material they actually want.

The importance of content moderation is often realized too late. A forum may be overrun with self-promotion before a drop in genuine users’ happiness shows up in analytics.

The primary disadvantages of content moderation are the time and expense required to do it.

In a nutshell, content moderation is important to:

  1. Enforce Guidelines
  2. Prevent Spam
  3. Protect Brand Reputation


The 4 basic types of content moderation

Pre-Approval

Particularly for anonymous or new members with no posting history, it is good to have someone approve each post manually before it is shown to the rest of the community. The problem is that if approval takes too long, it becomes very hard to have conversational content with a lot of back and forth between users.
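As a rough illustration, here is a minimal sketch of how such a gate might work. The `User` and `Post` classes, the trust threshold, and the approval queue are all hypothetical placeholders, not part of any particular platform.

```python
from dataclasses import dataclass

@dataclass
class User:
    name: str
    approved_post_count: int = 0   # posts already accepted by moderators
    is_anonymous: bool = False

@dataclass
class Post:
    author: User
    body: str
    visible: bool = False          # hidden until approved (or author is trusted)

approval_queue: list[Post] = []    # moderators work through this list

def submit_post(post: Post, trust_threshold: int = 3) -> None:
    """Publish immediately for established members, queue everyone else."""
    if not post.author.is_anonymous and post.author.approved_post_count >= trust_threshold:
        post.visible = True
    else:
        approval_queue.append(post)  # waits for a human moderator
```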

Review after posting

Alternatively, each post may be made immediately visible and placed in a moderation queue, so a moderator is notified of new posts and reviews them afterward, hopefully taking problem posts down fairly quickly. This works well for removing copyright violations and some manual spam, but less well for offensive content or automated spam.

Reactive Moderation

Another option is to rely on users flagging problematic content themselves, which then raises it with an official content moderator. This can be seen as a form of distributed moderation, spreading the workload away from the volunteer moderators and onto the community itself. A common tactic nowadays is to hide a post automatically if it gets several downvotes.
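A minimal sketch of that flag-and-hide rule is shown below. The threshold of three flags and the field names are illustrative assumptions only; real sites tune these values and usually weight flags by user reputation.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    body: str
    visible: bool = True                       # reactive moderation: posts start visible
    flaggers: set[str] = field(default_factory=set)

FLAG_THRESHOLD = 3                             # distinct users who must flag a post

moderator_queue: list[Post] = []               # flagged posts awaiting human review

def flag_post(post: Post, user_id: str) -> None:
    """Record a user flag and hide the post once enough distinct users agree."""
    post.flaggers.add(user_id)                 # a set avoids double-counting one user
    if post.visible and len(post.flaggers) >= FLAG_THRESHOLD:
        post.visible = False                   # hide until a moderator reviews it
        moderator_queue.append(post)           # raise it with an official moderator
```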

Automated Moderation

For particularly active sites it may eventually become necessary to employ tools that automatically scan content as it is added. Tools commonly scan for NSFW words or images; more advanced tools may also be able to identify spammy and duplicate content. Unfortunately, no tool is perfect and some bad content will get through, but automation can again at least reduce the workload on the human moderators.
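To give a flavour of the simplest end of this spectrum, here is a rough sketch of a scanner that checks a word blocklist and fingerprints recent posts to catch exact duplicates. The blocklist terms and the in-memory hash store are placeholders; commercial tools use far more sophisticated classifiers.

```python
import hashlib
import re

BLOCKLIST = {"badword1", "badword2"}   # placeholder NSFW/spam terms
recent_hashes: set[str] = set()        # fingerprints of recently posted content

def scan(text: str) -> str:
    """Return 'block', 'duplicate', or 'allow' for a new piece of content."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    if words & BLOCKLIST:
        return "block"                 # contains a blocklisted word
    digest = hashlib.sha256(text.strip().lower().encode()).hexdigest()
    if digest in recent_hashes:
        return "duplicate"             # exact copy of something seen recently
    recent_hashes.add(digest)
    return "allow"
```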

Types of content that need moderating

Content that needs moderating is not limited to words and pictures. Pretty much anything a user has input over will need a content moderator involved somewhere. Even when a machine generates content at random, users have been known to flag and upvote results that resemble inappropriate items. Another classic example is the Google AI program that, having learned from online content itself, has been known to generate overtly racist passages of text.

Here’s a list of all the different types of content you might find on a site that will need some form of quality control.

  1. Text
  2. Images
  3. Video
  4. Audio
  5. 3D Models
  6. Auto-generated content
  7. Links out to other sites

Content Moderation Examples

Examples of the types of things that moderators need to deal with vary widely. Racist hate speech is a simple example that is easy to identify as a problem. A different example might be pictures of food on a diet forum. Also, most sites have to deal with the advertising of various adult services and similar spam.

Content Moderation Services

Online moderation services have become such a common need that many companies have sprung up selling content moderation solutions of various kinds, whether in the form of human moderators, automated tools, or some hybrid of both. The gold standard is to employ your own full-time staff who focus solely on your site and know your rules and expectations inside out. But that comes at a substantial price…

What is the average content moderation salary?

For a full-time US-based content moderator, the average salary is $38,542 per year, according to Glassdoor.

But this figure varies widely based on the company and exact location.

For content moderation with a human face, these companies provide solutions:

  1. Pure Moderation
  2. CrowdSource
  3. Sutherland Global

Content Moderation Tools

Alternatively, here are some companies that focus more on AI-powered tools for content moderation.

  1. TwoHat
  2. Chekkee
  3. Hive Moderation