Case Study: Levity achieves automated, scalable MLOps and Kubernetes model serving with Valohai

A Valohai Case Study

Preview of the Levity Case Study

Custom models for automating image and document processing

Levity helps businesses automate image and document processing with custom classifiers, but ran into costly complexity when trying to build and maintain their own MLOps stack (Kubeflow on GKE) to manage model training, serving, and Kubernetes deployments on Google Cloud. To avoid hiring a full-time MLOps engineer and reinventing infrastructure, Levity adopted the Valohai platform to manage their machine learning infrastructure and model serving through Kubernetes.

Valohai onboarded Levity’s existing GCP compute and storage, provided hands‑on setup and support, and exposed APIs that let Levity trigger training, evaluate results, and manage endpoint deployments without touching Kubernetes manifests. As a result, Valohai eliminated the need for Levity to build and maintain a custom MLOps solution, accelerated their model deployment workflow, and freed budget and engineering capacity to focus on product development and customer integrations.
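The API-driven workflow described above, triggering a training run programmatically instead of editing Kubernetes manifests, can be sketched roughly as follows. The endpoint path, token placeholder, payload fields, and project/step names here are illustrative assumptions for a REST-style MLOps API, not Valohai's documented interface.

```python
import json

# Illustrative base URL; the real API root and routes may differ.
API_ROOT = "https://app.valohai.com/api/v0"  # assumption


def build_execution_request(project_id: str, step: str, params: dict) -> dict:
    """Assemble the pieces of an HTTP request that would launch a
    training execution (hypothetical payload shape)."""
    return {
        "url": f"{API_ROOT}/executions/",
        "headers": {
            "Authorization": "Token <YOUR_API_TOKEN>",  # placeholder credential
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "project": project_id,       # hypothetical project identifier
            "step": step,                # training step name, e.g. from a pipeline config
            "parameters": params,        # hyperparameters for this run
        }),
    }


# Example: queue a classifier training run with chosen hyperparameters.
request = build_execution_request(
    "levity-classifiers", "train-model",
    {"learning_rate": 0.001, "epochs": 20},
)
# The resulting dict could then be sent with any HTTP client,
# e.g. requests.post(request["url"], headers=request["headers"], data=request["body"])
```

The point of such an interface is that the platform, not the customer, translates the request into container scheduling and endpoint updates on the underlying Kubernetes cluster.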



Levity

Thilo Huellmann

Chief Technology Officer & Co-Founder

