Case Study: MeetKai achieves edge-personalized, low-latency suggested searches with Fauna and Cloudflare Workers

A Fauna Case Study

Building the next generation search engine with Fauna and Cloudflare Workers

MeetKai, a company building a next-generation multilingual, multi-turn voice assistant and search engine, faced the challenge of delivering highly personalized top-10 search suggestions at low latency while scaling across GPU batch training clusters and serverless search clusters. To bridge this dynamic, auto-scaling compute and keep personalization data durable and queryable at the edge, MeetKai adopted Fauna as its core datastore for personalization factors and interaction records, paired with Cloudflare Workers at the edge.

MeetKai uses Fauna to hold precomputed personalization-factor documents and session interaction records, while Cloudflare Workers cache those factors and inject them at the edge. This lets MeetKai serve personalized suggestions without sending PII to its backend, log click interactions back into Fauna for batch model training, and quickly reload updated factors to the edge. By persisting personalization data across ephemeral clusters, Fauna reduced backend load and latency, enabled higher click-through rates from more relevant suggestions, and helped MeetKai accelerate time-to-market with predictable costs and performance.
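The flow above can be sketched in plain JavaScript. This is a minimal illustration under assumptions, not MeetKai's implementation: all names (`factorStore`, `edgeCache`, `personalizeSuggestions`, `logClick`, the factor schema) are hypothetical, and in-memory Maps stand in for Fauna and the Workers edge cache.

```javascript
// Hypothetical stand-ins: a Map for Fauna's personalization-factor
// documents and an array for its interaction records. In production these
// would be Fauna reads/writes issued from a Cloudflare Worker.
const factorStore = new Map([
  // Precomputed per-topic factors, keyed by an opaque user token
  // so no PII leaves the edge.
  ["user-123", { sports: 2.0, cooking: 0.5 }],
]);
const clickLog = []; // interaction records destined for batch model training

// Edge cache of factors; a Map with a TTL stands in for the Workers
// cache/KV layer so updated factors reload quickly.
const edgeCache = new Map();
const TTL_MS = 60_000;

function loadFactors(userToken) {
  const hit = edgeCache.get(userToken);
  if (hit && Date.now() - hit.at < TTL_MS) return hit.factors;
  const factors = factorStore.get(userToken) ?? {}; // would be a Fauna read
  edgeCache.set(userToken, { factors, at: Date.now() });
  return factors;
}

// Rerank candidate suggestions by the cached factors and keep the top 10.
function personalizeSuggestions(userToken, suggestions) {
  const factors = loadFactors(userToken);
  const weight = (s) => (factors[s.topic] ?? 1) * s.score;
  return [...suggestions].sort((a, b) => weight(b) - weight(a)).slice(0, 10);
}

// Record a click so the batch training clusters can consume it later.
function logClick(userToken, suggestionId) {
  clickLog.push({ userToken, suggestionId, at: Date.now() }); // would be a Fauna write
}
```

With the sample factors above, `personalizeSuggestions("user-123", [{ id: "s1", topic: "cooking", score: 0.9 }, { id: "s2", topic: "sports", score: 0.6 }])` ranks the sports suggestion first, since 2.0 × 0.6 outweighs 0.5 × 0.9.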

