Case Study: Fandom achieves 50% cost savings in content moderation with Coactive AI

A Coactive AI Case Study


Fandom Improves Community Safety and Reduces Cost by 50% with Coactive AI

Fandom, the world's largest fan platform, faced a significant challenge in moderating the vast volume of images uploaded by its 350 million monthly users. While only 0.5% of its 2.2 million monthly uploads were malicious, the scale required a costly manual review process that took contractors 500 hours per week. This slow system left harmful content visible for up to 36 hours, risking community safety and user trust. To solve this, Fandom partnered with Coactive AI to implement an AI-powered content moderation solution.

Using Coactive AI's multimodal AI and APIs, Fandom automated the classification of images into categories such as gore or nudity. The solution, delivered in just six weeks, assigns each image a score that determines whether it should be automatically removed, sent for manual review, or allowed. This new process cut image review time to under half a second. The implementation resulted in a 50% cost reduction and a 74% decrease in manual review hours (from 500 to 130 per week), and the system now automatically handles 90% of all uploaded images, dramatically improving platform safety and team morale.
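The score-then-route flow described above can be sketched in a few lines. This is a minimal illustrative sketch, not Coactive AI's actual API: the function name, category keys, and both thresholds are assumptions introduced here for clarity.

```python
# Hypothetical sketch of score-based routing for uploaded images.
# The thresholds and category names are illustrative assumptions,
# not values from Fandom's or Coactive AI's deployment.

AUTO_REMOVE_THRESHOLD = 0.90    # assumed: high-confidence malicious
MANUAL_REVIEW_THRESHOLD = 0.50  # assumed: uncertain, send to a human

def route_image(scores: dict) -> str:
    """Route an upload by its highest per-category risk score
    (e.g. gore, nudity), mirroring the three outcomes in the text:
    automatic removal, manual review, or allow."""
    risk = max(scores.values(), default=0.0)
    if risk >= AUTO_REMOVE_THRESHOLD:
        return "remove"
    if risk >= MANUAL_REVIEW_THRESHOLD:
        return "manual_review"
    return "allow"

print(route_image({"gore": 0.97, "nudity": 0.10}))  # remove
print(route_image({"gore": 0.60}))                  # manual_review
print(route_image({"nudity": 0.05}))                # allow
```

In practice the two thresholds encode the cost trade-off the case study describes: raising the review threshold shrinks the manual queue, while raising the removal threshold reduces false takedowns.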



Tim Quievryn, Director of Trust, Safety, and Product Support, Fandom
