
When to Moderate Content and When Not to

4 min read · Michael Carroll · Dec 9, 2022

The massive amount of content published online every day opens the door to inappropriate material, misinformation, and trolls in your app. To keep online communities safe, companies need a strategy for identifying harmful content and a process for handling it.

Below, we cover the types of content moderation and the best way to handle it so that you deliver a great user experience.

What is content moderation and why is it important?

Content moderation is the process of reviewing and filtering content such as text, images, audio, and video to create a safe and inclusive experience. It ensures that the content published on your platform is appropriate for your audience and aligned with your overall business goals.

Types of content moderation 

Before incorporating content moderation into your platform, you'll first need to understand the different types of moderation you can choose from.

1. Automated Moderation: Automated moderation uses artificial intelligence (AI) algorithms to review content. This type of moderation helps detect offensive content quickly based on a platform's specific rules and criteria (see the sketch after this list).

2. Pre-Moderation: Pre-moderation is when content is reviewed and assessed before it goes live to ensure that it meets the predetermined community guidelines. This helps ensure harmful content is blocked before reaching your users.

3. Post-Moderation: Post-moderation occurs after a piece of content goes live. Users can post freely, but once published, content is reviewed by human moderators, a moderation team, or a moderation system, and can be flagged and removed if deemed inappropriate.

4. Reactive Moderation: Reactive moderation is a method of moderation that relies on end users to report content that is offensive or goes against community guidelines. 

5. Distributed Moderation: Distributed moderation refers to when the decision to remove online content is made by the community members. 
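To make automated and pre-moderation concrete, here is a minimal TypeScript sketch of a rule-based pre-publish filter. The `blockedPatterns` list and the `ModerationResult` shape are hypothetical examples, not part of any specific product; a production system would typically pair rules like these with an AI classifier.

```typescript
// Minimal rule-based pre-moderation filter (illustrative only).

type ModerationResult =
  | { allowed: true; text: string }
  | { allowed: false; reason: string };

// Hypothetical rule set: patterns a platform has decided to block.
const blockedPatterns: RegExp[] = [
  /\bbadword\b/i,
  /https?:\/\/[^\s]*spam-domain\.example/i,
];

function moderateBeforePublish(text: string): ModerationResult {
  // Check each rule before the content ever reaches other users.
  for (const pattern of blockedPatterns) {
    if (pattern.test(text)) {
      return { allowed: false, reason: `matched ${pattern}` };
    }
  }
  return { allowed: true, text };
}

// Usage: only publish content that passes the filter.
const result = moderateBeforePublish("Hello community!");
if (result.allowed) {
  console.log("publish:", result.text);
} else {
  console.log("blocked:", result.reason);
}
```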

Content moderation on social media platforms 

With the rise of social media companies like Facebook, Twitter, TikTok, and YouTube over the last decade, disinformation has become more prevalent. In response, social media platforms have created community guidelines to address content such as misinformation and hate speech; when these rules are violated, they take down posts and often ban users. However, given the sheer volume of user-generated content created every day, it can be difficult to stay on top of hate speech and inappropriate images or videos.

User-generated content (UGC) is any form of content, such as text, images, audio, or live-streamed video, posted to an online or social media platform. A content moderation solution is important for preventing harmful UGC from damaging a company's brand reputation.

What to consider before choosing a content moderation solution

Between sifting out offensive content and misinformation while also protecting free speech, it can be difficult for online platforms to enforce a content moderation process. Do you remove user-generated content (UGC) only when it violates the laws of the country or region where your organization is headquartered? Or do you create your own rules and guidelines for which types of content are appropriate for your platform?

To decide on a moderation method for your platform, consider the following best practices.

Best practices for content moderation

Determine your target audience 

Decide what content should be allowed on your platform based on what aligns with your brand. Whether you are a gaming company whose users span all ages or a dating app looking to maintain a safe environment, identifying your target audience, their demographics, and their specific needs will help you form a strategy for which content is appropriate.

Create guidelines that cover all languages and types of content

Impose and document a clear set of rules and expectations for what is allowed on your platform. Including positive and negative examples in these guidelines is another way to clarify which types of user behavior are not acceptable. When creating these guidelines, be sure to consider the communication laws of your organization's region and of the regions where most of your user base is located.
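As an illustration of what documented guidelines might look like once encoded for a moderation system, the sketch below maps languages to rule sets covering multiple content types. Every field name, rule, and language code here is a hypothetical example, not a standard schema.

```typescript
// Hypothetical guideline configuration covering multiple languages
// and content types; real systems will differ.

interface GuidelineRule {
  description: string; // human-readable rule, ideally with examples
  appliesTo: ("text" | "image" | "audio" | "video")[];
  action: "block" | "flag" | "escalate";
}

const guidelines: Record<string, GuidelineRule[]> = {
  en: [
    { description: "No hate speech", appliesTo: ["text", "audio", "video"], action: "block" },
    { description: "No graphic violence", appliesTo: ["image", "video"], action: "escalate" },
  ],
  es: [
    { description: "Prohibido el discurso de odio", appliesTo: ["text", "audio", "video"], action: "block" },
  ],
};
```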

Establish protocols 

After setting these clear guidelines, establish protocols for how your moderation team or moderation system should take action when rules are broken (a sketch of one possible decision flow follows this list). This can involve:

  • Escalating content to human moderators for review

  • Content removal

  • Suspending or blocking a user 
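Here is a minimal sketch of such a protocol as code, assuming a hypothetical `Violation` record and a three-level severity scale; real escalation policies are usually richer than this.

```typescript
// Hypothetical escalation protocol: map rule violations to actions.

type ModerationAction = "escalate" | "remove" | "suspend" | "ban";

interface Violation {
  userId: string;
  contentId: string;
  severity: 1 | 2 | 3; // 1 = minor, 3 = severe
  repeatOffender: boolean;
}

function decideAction(v: Violation): ModerationAction {
  if (v.severity === 3) return v.repeatOffender ? "ban" : "suspend";
  if (v.severity === 2) return "remove";
  return "escalate"; // minor or ambiguous: send to human review
}

// Example: a repeat, mid-severity violation leads to content removal.
console.log(decideAction({
  userId: "u42", contentId: "c7", severity: 2, repeatOffender: true,
})); // -> "remove"
```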

Use a customizable solution 

Ensure that your app has the proper moderation capabilities in place so that you are not restricting things that may not need moderation, such as sarcasm or memes. Leveraging a customizable moderation solution allows you to build without sacrificing developer time and resources that could be focused elsewhere. A reliable solution equips you with the functionality to enforce community guidelines, filter profanity, ban users, and more in real time.

How to moderate content with PubNub

Content moderation presents its fair share of challenges, but with the help of PubNub you don't have to go it alone. Our customizable Moderation Dashboard works in real time to ensure effective filtering and scales to keep up with volume, no matter how many users your platform has.

The Moderation Dashboard is loaded with features, such as profanity filtering and channel moderation, enabling developers to moderate user activity, enforce community guidelines, and protect brand reputation.

Using the Moderation Dashboard, you can perform common moderation tasks such as:

  • Automatic Message Moderation - For moderating text and images (sketched after this list).

  • Administrator Message Moderation - For after-the-fact editing and removal of messages.

  • Administrator User Management - Ban users when needed.
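As one illustration, automatic text moderation is commonly implemented as a serverless function that runs before a message is published. The sketch below is written in the style of a PubNub Function (Before Publish handler); the `bannedWords` list and the `text` field of the message are assumptions, and the Moderation Dashboard configures this kind of filtering for you, so treat this as a conceptual sketch rather than the product's exact implementation.

```typescript
// Sketch of a Before Publish function that masks profanity before a
// message reaches subscribers. The word list is illustrative only.
export default (request: any) => {
  const bannedWords = ["badword1", "badword2"]; // assumption: your own list

  let text = request.message && request.message.text;
  if (typeof text === "string") {
    for (const word of bannedWords) {
      // Replace each banned word with asterisks of the same length.
      text = text.replace(new RegExp(word, "gi"), "*".repeat(word.length));
    }
    request.message.text = text; // forward the sanitized message
  }
  return request.ok(); // allow the (possibly edited) message through
};
```

Masking rather than blocking keeps conversations flowing while still enforcing guidelines; a stricter policy could reject the message outright instead of letting it through.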

Ready to add content moderation capabilities to your app? Explore our Moderation Dashboard quick start guide and sign up for a free account today to get started!