7 Case Studies
A Fireworks AI Case Study
Sourcegraph, a provider of enterprise-grade code search and AI tools for developers, needed a scalable, cost-effective platform for integrating multiple large language models (LLMs) into real-time code assistance. Its challenge was achieving the sub-second latency required for features like code completion while managing costs and complex context handling, which led it to partner with Fireworks AI.
Fireworks AI provided a flexible platform with high-performance inference optimizations, including Flash Attention-2, along with support for open-source models. The collaboration delivered a 30% reduction in latency, a 2.5x increase in code acceptance rates, and a 40% increase in supported context length, allowing Sourcegraph to offer a faster, higher-quality experience to enterprise developers.
Beyang Liu
Chief Technology Officer, Sourcegraph