Coactive AI
1 Case Studies
A Coactive AI Case Study
Fandom, the world's largest fan platform, faced a significant challenge in moderating the vast volume of images uploaded by its 350 million monthly users. Although only 0.5% of its 2.2 million monthly uploads were malicious, the scale demanded a costly manual review process that consumed 500 contractor hours per week. This slow pipeline left harmful content visible for up to 36 hours, putting community safety and user trust at risk. To solve this, Fandom partnered with Coactive AI to implement an AI-powered content moderation solution.
Using Coactive AI's multimodal AI and APIs, Fandom automated the classification of images into categories such as gore or nudity. The solution, delivered in just six weeks, assigns each image a score that determines whether it is automatically removed, sent for manual review, or allowed. This new process slashed image review time to under half a second. The implementation cut costs by 50%, reduced manual review hours by 74% (from 500 to 130 per week), and now automatically handles 90% of all uploaded images, dramatically improving platform safety and team morale.
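To make the score-based routing concrete, here is a minimal sketch of how a three-way moderation decision might work. The threshold values, function names, and action labels are hypothetical illustrations, not Coactive AI's actual API or production settings.

```python
# Sketch of score-based moderation routing. Thresholds and names are
# hypothetical, not Coactive AI's actual API or production values.
from enum import Enum


class Action(Enum):
    REMOVE = "auto_remove"      # high confidence: remove without review
    REVIEW = "manual_review"    # ambiguous: escalate to a human reviewer
    ALLOW = "allow"             # low confidence of harm: publish


# Hypothetical cutoffs: scores at or above REMOVE_THRESHOLD are removed
# automatically; mid-range scores go to human reviewers; the rest pass.
REMOVE_THRESHOLD = 0.90
REVIEW_THRESHOLD = 0.40


def route_image(malicious_score: float) -> Action:
    """Map a model confidence score in [0.0, 1.0] to a moderation action."""
    if malicious_score >= REMOVE_THRESHOLD:
        return Action.REMOVE
    if malicious_score >= REVIEW_THRESHOLD:
        return Action.REVIEW
    return Action.ALLOW


if __name__ == "__main__":
    for score in (0.97, 0.55, 0.10):
        print(f"score={score:.2f} -> {route_image(score).value}")
```

In a setup like this, only the middle band of ambiguous scores reaches human reviewers, which is consistent with the reported outcome of 90% of uploads being handled automatically.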
Tim Quievryn
Director of Trust, Safety, and Product Support