Case Study: Fintool achieves 7.4× faster performance and 89% lower inference costs for AI equity research with GroqCloud

A Groq Case Study


Fintool is an AI equity-research copilot for institutional investors that uses LLMs to mine terabytes of SEC filings, earnings calls, and other financial documents to surface timely insights. Faced with the need for ultra-low-latency, high-accuracy inference to answer complex financial queries and verify results, Fintool partnered with Groq and adopted GroqCloud as its inference platform (running Llama 3.3 70B) to meet the demands of real-time analysis.

By moving key operations and its multi-agent verification system from OpenAI GPT-4o to Llama 3.3 70B on GroqCloud, Fintool achieved a 7.41x throughput increase (from 111 to 823 tokens/sec) and an 89% reduction in cost per token, allowing it to triple its token usage and feed more context into each prompt. Running on GroqCloud helped Fintool improve speed, accuracy, and efficiency, contributing to the highest score on FinanceBench and measurable business gains in latency, cost, and answer quality.
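To illustrate the kind of workload described above, the sketch below shows a single answer-then-verify step against Llama 3.3 70B on GroqCloud using the Groq Python SDK. It is a minimal sketch under stated assumptions: the "llama-3.3-70b-versatile" model ID, the prompts, and the two-step structure are illustrative, not Fintool's actual pipeline.

# Minimal sketch: one answer-then-verify step on GroqCloud.
# Assumptions: the Groq Python SDK (pip install groq), a GROQ_API_KEY
# environment variable, and the public "llama-3.3-70b-versatile" model ID.
# Prompts and structure are illustrative, not Fintool's production pipeline.
import os
from groq import Groq

client = Groq(api_key=os.environ["GROQ_API_KEY"])

def ask(system: str, user: str) -> str:
    """Send one chat completion request and return the model's reply text."""
    resp = client.chat.completions.create(
        model="llama-3.3-70b-versatile",
        messages=[
            {"role": "system", "content": system},
            {"role": "user", "content": user},
        ],
        temperature=0.1,
    )
    return resp.choices[0].message.content

filing_excerpt = "..."  # e.g. a retrieved passage from a 10-K filing
question = "What was the year-over-year change in operating margin?"

# Step 1: draft an answer grounded only in the retrieved filing text.
answer = ask(
    "You are an equity-research assistant. Answer only from the provided filing text.",
    f"Filing excerpt:\n{filing_excerpt}\n\nQuestion: {question}",
)

# Step 2: a second pass checks the draft against the same source text.
verdict = ask(
    "You are a verifier. Reply PASS if the answer is fully supported by the "
    "excerpt, otherwise reply FAIL and name the unsupported claim.",
    f"Filing excerpt:\n{filing_excerpt}\n\nQuestion: {question}\n\nAnswer: {answer}",
)

print(answer)
print(verdict)

In a real deployment the verify step would typically run over many generated claims in parallel; the low per-token cost and high throughput cited above are what make that second pass economical.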



Nicolas Bustamante, Chief Technology Officer, Fintool

