Content Moderation and Operations

Content moderation is the process of reviewing online user-generated content (UGC) for compliance with a digital platform's policies about what is and is not allowed to be shared. Moderating content and enforcing policy may be done manually by people, through automation, or through a combination of both, depending on the scale and maturity of the abuse and of a platform's operations. This chapter focuses on different approaches to setting up content moderation teams, how to ensure content moderators are successful, user appeals and channels for appeals, and metrics relevant to content moderation.

In this chapter, we provide an introduction to how content moderation operations work in trust and safety, including:

  • Considerations for managing trust and safety operations, with context on how content moderation works behind the scenes.
  • Common methods and practices in content moderation operations, including their advantages and disadvantages where applicable.