The client brought a billion rows of data… from a variety of sources. Never had this data been aggregated, let alone thoroughly analyzed. But AdLucent was up to the challenge: they pulled all billion rows into a cloud data warehouse, then used an array of tools to create a robust customer churn model. How did they do it?
Check out this episode of The Briefing Room to find out! You’ll hear Kelly Weeks, Analytics Director at AdLucent, explain the step-by-step process of loading and analyzing over a billion rows of customer data. She’ll be joined by Tyrone Schiff of Google Cloud, who will offer insights on cost optimization with AI and BigQuery. Attendees will learn:
• How to ingest a billion rows of data efficiently
• The steps for building modern data pipelines
• How to spin up large instances quickly, then spin them down
• How to recognize the point of diminishing returns