Case Study: Asurion achieves large-scale ETL efficiency with Databricks

A Databricks Case Study

Large Scale ETL and Lakehouse Implementation at Asurion

Asurion, a large provider of insurance, repair, replacement, and support services for consumer technology, needed to unify and scale its enterprise data platform across thousands of tables, views, reports, dashboards, and diverse ingestion sources. Its legacy ETL and warehouse stack was costly and rigid: the split between batch and speed layers duplicated work, reprocessing was difficult, and the platform struggled to handle growing data volumes and lower-latency requirements. The team therefore turned to Databricks and a lakehouse architecture.

Using the Databricks Data Intelligence Platform with Apache Spark Structured Streaming, Delta Lake, Auto Loader, and Databricks SQL, Asurion built a more flexible ETL framework, improved scheduling efficiency, and supported always-on SQL-based data marts. The result: up to 1,000 slowly changing tables running on a single 25-node cluster and an expansion to 600 production data marts, with reduced platform complexity and improved scalability, cost efficiency, and near-real-time data processing.
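To make the ingestion pattern concrete, the kind of pipeline described above can be sketched with Auto Loader feeding a Delta table through Structured Streaming. This is a minimal illustration, not Asurion's actual code: it assumes a Databricks runtime, and the paths and table name (`/mnt/landing/claims`, `bronze.claims`, etc.) are hypothetical placeholders.

```python
# Hedged sketch: incremental file ingestion with Auto Loader into Delta Lake.
# Assumes a Databricks (or equivalently configured Spark) environment.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Auto Loader ("cloudFiles" source) incrementally discovers new files
# as they land, tracking progress in the schema/checkpoint locations.
raw = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/mnt/schemas/claims")  # hypothetical path
    .load("/mnt/landing/claims")  # hypothetical landing zone
)

# Write the stream to a Delta table; the checkpoint gives exactly-once
# semantics, and availableNow processes pending data then stops, which
# suits scheduled, batch-style incremental runs.
(
    raw.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/claims")
    .trigger(availableNow=True)
    .toTable("bronze.claims")
)
```

Running many such lightweight incremental streams on one shared cluster is one plausible way a single 25-node cluster could serve on the order of 1,000 slowly changing tables.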


Asurion

Tomasz Magdanski

Senior Director of Engineering
