9 Case Studies
A Fauna Case Study
MeetKai, a company building a next-generation multilingual, multi-turn voice assistant and search engine, needed to deliver highly personalized top-10 search suggestions at low latency while scaling across GPU batch-training clusters and serverless search clusters. To bridge this dynamic, auto-scaling compute with durable, queryable personalization data close to users, MeetKai adopted Fauna as its core datastore, paired with Cloudflare Workers at the edge, to reliably store personalization factors and interaction records.
Fauna holds precomputed personalization-factor documents and session interaction records, while Cloudflare Workers cache those factors at the edge and inject them into each query. This lets MeetKai serve personalized suggestions without sending PII to its backend, log click interactions back into Fauna for batch model training, and quickly push updated factors out to the edge. By persisting personalization data across ephemeral clusters, Fauna reduced backend load and latency, lifted click-through rates through more relevant suggestions, and helped MeetKai accelerate time-to-market with predictable cost and performance.
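The flow described above maps naturally onto a small Worker: read a precomputed factor document from Fauna, cache it at the edge, use it to rank suggestions locally, and write click events back for the next training run. The sketch below illustrates that pattern, assuming the faunadb JavaScript driver (FQL v4) and a module-style Cloudflare Worker; the collection, index, field, and helper names (Interactions, factors_by_user, rankSuggestions) are illustrative, not MeetKai's actual schema.

```ts
// A minimal sketch of the edge-personalization pattern, not MeetKai's
// implementation: collection, index, and helper names are assumptions.
import faunadb from 'faunadb';

const q = faunadb.query;

export interface Env {
  FAUNA_SECRET: string; // Fauna key bound as a Worker secret
}

interface Factors {
  userId: string;
  weights: Record<string, number>; // precomputed personalization factors
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const client = new faunadb.Client({ secret: env.FAUNA_SECRET });
    const url = new URL(request.url);
    const userId = url.searchParams.get('user') ?? 'anonymous';

    if (url.pathname === '/suggest') {
      // 1. Check the edge cache so repeat lookups never leave the POP.
      const cache = caches.default;
      const cacheKey = new Request(`https://factors.cache.internal/${userId}`);
      let factorsResponse = await cache.match(cacheKey);

      if (!factorsResponse) {
        // 2. On a miss, read the precomputed factor document from Fauna.
        const doc: any = await client.query(
          q.Get(q.Match(q.Index('factors_by_user'), userId))
        );
        factorsResponse = new Response(JSON.stringify(doc.data), {
          headers: {
            'Content-Type': 'application/json',
            // Short TTL so freshly trained factors reach the edge quickly.
            'Cache-Control': 'max-age=300',
          },
        });
        await cache.put(cacheKey, factorsResponse.clone());
      }

      const factors = (await factorsResponse.json()) as Factors;
      // 3. Inject the factors into ranking at the edge; no PII is sent to
      //    the backend search cluster.
      const ranked = rankSuggestions(url.searchParams.get('q') ?? '', factors);
      return new Response(JSON.stringify(ranked.slice(0, 10)), {
        headers: { 'Content-Type': 'application/json' },
      });
    }

    if (url.pathname === '/click' && request.method === 'POST') {
      // 4. Log the interaction back into Fauna for the next batch
      //    training run on the GPU cluster.
      const event = (await request.json()) as Record<string, unknown>;
      await client.query(
        q.Create(q.Collection('Interactions'), {
          data: { userId, ...event, ts: Date.now() },
        })
      );
      return new Response(null, { status: 204 });
    }

    return new Response('Not found', { status: 404 });
  },
};

// Placeholder ranking: blends the precomputed weights into a static
// candidate list purely for illustration.
function rankSuggestions(queryText: string, factors: Factors): string[] {
  const candidates = [`${queryText} news`, `${queryText} near me`, `${queryText} reviews`];
  return candidates.sort(
    (a, b) => (factors.weights[b] ?? 0) - (factors.weights[a] ?? 0)
  );
}
```

In this arrangement a short Cache-Control TTL is what lets freshly trained factors reach the edge quickly after a batch run rewrites the documents in Fauna, without forcing every suggestion request back to the database.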