Case Study: Notion achieves scalable AI and lower operations costs with Confluent

A Confluent Case Study

How Confluent Powers 100M+ Users Daily

Notion, the connected workspace for docs, notes, projects, and knowledge, faced rapid growth that pushed its legacy messaging and event-logging architecture beyond its scaling limits. As usage surged past 100 million daily users, the team needed a more cost-efficient, real-time data foundation to support analytics, AI features, and integrations without overloading a lean engineering team. Notion replaced that legacy approach with Confluent's data streaming platform.

With Confluent, Notion moved to a fully managed, event-driven architecture built on Apache Kafka, using pre-built connectors, Schema Registry, and stream processing to route data into downstream systems such as Amazon S3, Snowflake, and PostgreSQL. The result was faster innovation, stronger real-time AI capabilities for Notion AI, and a lower operational burden: Notion reported significant time savings and a threefold productivity gain with Confluent, while enabling scalable event processing for more than 100 million users.

Notion

Ekanth Sethuramalingam

Engineering Lead
