Cloudflare users can leverage Mistral AI on the Workers AI platform

A Mistral AI Case Study


Cloudflare, a leading content delivery network provider, currently serves Mistral 7B through Workers AI, its platform for running AI models on Cloudflare’s global network. According to Cloudflare’s blog post, ‘For 7 billion parameter models, we can generate close to 4x as many tokens per second with Mistral as we can with Llama, thanks to Grouped-Query attention.’ Cloudflare further noted that Mistral 7B offers low latency and high throughput, and delivers impressive benchmark performance even when compared to larger 13B models.
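For readers who want to try this themselves, the sketch below shows roughly what calling Mistral 7B from a Worker looks like. It assumes a Workers AI binding named `AI` is configured in the project’s wrangler.toml and uses the `@cf/mistral/mistral-7b-instruct-v0.1` model slug from Cloudflare’s model catalog; verify both against the current Workers AI documentation before relying on them.

```ts
// Minimal Cloudflare Worker that calls Mistral 7B through Workers AI.
// Assumes an AI binding named "AI" is declared in wrangler.toml, e.g.:
//   [ai]
//   binding = "AI"

export interface Env {
  AI: Ai; // Workers AI binding (type provided by @cloudflare/workers-types)
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // Model slug assumed from Cloudflare's Workers AI catalog.
    const result = await env.AI.run("@cf/mistral/mistral-7b-instruct-v0.1", {
      messages: [
        { role: "system", content: "You are a concise assistant." },
        { role: "user", content: "Summarize Grouped-Query Attention in one sentence." },
      ],
    });
    // For chat-style models the generated text is returned in the response body.
    return Response.json(result);
  },
};
```

Once deployed with `wrangler deploy`, a request to the Worker’s URL returns the model’s JSON output; the same model can also be reached through the Workers AI REST endpoint without writing a Worker at all.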

