Inside Analysis

How to Build Reliable Modern Data Pipelines Using AI and DataOps

Organizations today are building strategic applications using a wealth of internal and external data. For example, they are building Customer 360 applications that combine customer data from multiple business channels, including stores, online, social media, and third-party demographic data. They are deploying e-commerce applications that offer personalized shopping experiences and dynamic recommendations using a deep history of customer transactions, interactions, and observations. They are performing proactive maintenance by predicting failures in manufacturing equipment. And the list goes on.

Unfortunately, data-driven applications fail for many reasons, and identifying the cause and finding a fix is challenging and time-consuming. This has gotten harder as the complexity, scale, and speed of modern data pipelines increase. Detecting a problem is difficult in a complex, distributed system with many integrated components, each of which generates its own log files containing thousands of uncorrelated messages. Hunting for the root cause of an application failure in these messy, raw, and distributed logs is hard for performance experts, and a nightmare for data operations teams tasked with managing modern data pipelines.
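To see why uncorrelated logs make root-cause hunting so painful, consider a minimal sketch of the kind of correlation a monitoring tool automates. The component names, request IDs, and log lines below are hypothetical; real logs are far messier and live in separate files on separate machines.

```python
from collections import defaultdict

# Hypothetical log events from three pipeline components. In practice each
# component writes its own log file, and these lines must first be parsed
# out of unstructured text.
raw_logs = [
    ("ingest",    "req-42", "INFO  fetched 10000 records"),
    ("transform", "req-42", "ERROR schema mismatch on field 'price'"),
    ("ingest",    "req-43", "INFO  fetched 9800 records"),
    ("load",      "req-42", "WARN  upstream task failed, skipping load"),
]

def correlate(logs):
    """Group log messages from all components by correlation ID so that
    every event for a failing request can be read in one place."""
    by_request = defaultdict(list)
    for component, request_id, message in logs:
        by_request[request_id].append((component, message))
    return dict(by_request)

grouped = correlate(raw_logs)
for component, message in grouped["req-42"]:
    print(component, "->", message)
```

Even this toy version assumes every component tags its messages with a shared request ID, which is rarely true of real systems; stitching events together without such an ID is exactly the work that performance monitoring software takes off the operations team.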

In this webcast, data analytics guru Wayne Eckerson will discuss the rise of modern data applications and the processes and tools (i.e., DataOps) required to manage them. He will present a DataOps framework and show the importance of using testing and monitoring to build, automate, and manage robust data pipelines. In addition, Eric Chu, VP of Data Insights at Unravel Data, will explain how to apply performance monitoring software and artificial intelligence to your data pipelines and supporting big data systems to keep your applications running reliably.

You Will Learn:
- The role of DataOps in supporting modern data applications
- A DataOps framework for building and managing data pipelines
- The role of testing and monitoring in DataOps
- Why AI is needed to manage and monitor complex data pipelines and environments
- How modern performance management software can reduce the risk of running modern data applications
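The testing and monitoring practices mentioned above often start with simple data quality checks between pipeline stages. Here is a minimal sketch of such a check, assuming a stage hands off a list of record dicts; the field names and thresholds are illustrative, not taken from any specific DataOps tool.

```python
def check_stage_output(records, required_fields, min_rows=1):
    """Return a list of problems found in a pipeline stage's output.
    An empty list means the check passed and the next stage may run."""
    problems = []
    if len(records) < min_rows:
        problems.append(f"expected at least {min_rows} rows, got {len(records)}")
    for i, rec in enumerate(records):
        for field in required_fields:
            if rec.get(field) is None:
                problems.append(f"row {i}: missing required field '{field}'")
    return problems

# Illustrative handoff from a hypothetical Customer 360 ingest stage.
records = [
    {"customer_id": 1, "channel": "store"},
    {"customer_id": 2, "channel": None},
]
issues = check_stage_output(records, required_fields=["customer_id", "channel"])
```

In a DataOps pipeline, checks like this run automatically after every stage; a non-empty result blocks the downstream stages and raises an alert instead of letting bad data propagate silently.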
