I was honored to co-present “Intelligent Ingest for a Modern Data Architecture” on The Briefing Room last week with Matthew Halliday, EVP of Product for Incorta, and host Eric Kavanagh.

Companies are keenly focused on differentiating their offerings through analytics. At the same time, the potential sources of useful data are growing in both number and complexity. One viable answer to these converging forces is intelligent ingest, a new style of data processing that vastly expedites data onboarding.

Ingest has always been the most time-consuming, error-prone, and unpredictable stage of a data pipeline. Options have included ETL, ELT, streaming, and the category of focus for today – direct data mapping. Matthew demonstrated how Incorta addresses the challenge with intelligent ingest, helping customers go from raw data to advanced analytics in hours.

The Incorta Data Platform has been around for many years. It resembles data virtualization backed by a compressed, in-memory 3NF database on Parquet. It short-circuits traditional data flows, and its engine performs direct data mapping.

However, today we were discussing the relatively new Incorta Intelligent Ingest product. Incorta has done the hard work around numerous ingests (Oracle EBS, with its 5,000+ tables, was used as an example) and makes all of that available in metadata-based Blueprints. Incorta maps directly to the data (no expensive joins), uses parallel ingest processes to load the largest of tables (think billions of rows) in minutes, and features fast incremental refresh and high-performance queries for insight.
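Purely to make the mechanics concrete, here is a minimal sketch of parallel, partitioned ingest into Parquet with a watermark-based incremental refresh. It is not Incorta's implementation and says nothing about how Blueprints or direct data mapping work internally; the connection string, table, partition key, and watermark column are all hypothetical.

```python
# Illustrative sketch only: generic parallel, partitioned ingest into Parquet
# with watermark-based incremental refresh. Not Incorta's implementation; the
# connection string, table, partition key, and watermark column are hypothetical.
from concurrent.futures import ThreadPoolExecutor

import pandas as pd
import pyarrow as pa
import pyarrow.parquet as pq
import sqlalchemy

ENGINE = sqlalchemy.create_engine("postgresql://user:pass@host/erp")  # hypothetical source
TABLE = "gl_journal_lines"          # hypothetical large source table
PARTITION_COL = "ledger_id"         # hypothetical partitioning key
WATERMARK_COL = "last_update_date"  # drives the incremental refresh

def ingest_partition(partition_value, since=None):
    """Pull one partition from the source and append it to the Parquet dataset."""
    sql = f"SELECT * FROM {TABLE} WHERE {PARTITION_COL} = :p"
    params = {"p": partition_value}
    if since is not None:  # incremental refresh: only rows changed after the watermark
        sql += f" AND {WATERMARK_COL} > :since"
        params["since"] = since
    df = pd.read_sql(sqlalchemy.text(sql), ENGINE, params=params)
    if not df.empty:
        pq.write_to_dataset(
            pa.Table.from_pandas(df),
            root_path="lake/gl_journal_lines",
            partition_cols=[PARTITION_COL],
        )

def ingest(partition_values, since=None):
    """Load many partitions in parallel; pass `since` for an incremental refresh."""
    with ThreadPoolExecutor(max_workers=8) as pool:
        list(pool.map(lambda p: ingest_partition(p, since), partition_values))
```

The point of the sketch is only that partitioning the work and loading only rows newer than a watermark is what turns a multi-billion-row table from an overnight job into a minutes-long refresh.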

Matthew said you can deploy your models in a single day without even worrying about all the steps it would otherwise take to build them. Requiring as little as a single line of code, Intelligent Ingest is also ideal for businesses that want a low-code/no-code or SDE (software-defined everything) environment.

The data is stored in Parquet. This is advantageous because Parquet's columnar layout has proven to be a strong fit for most analytical query workloads.
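As a rough illustration of why the columnar layout pays off (with a hypothetical file path and column names), the snippet below uses pyarrow to read just the two columns an aggregate query needs rather than scanning every column of a wide fact table.

```python
# Hypothetical example: Parquet's columnar layout lets a query read only the
# columns it touches. The file path and column names are made up for illustration.
import pyarrow.parquet as pq

revenue_by_region = (
    pq.read_table("lake/sales_facts.parquet", columns=["region", "revenue"])
      .to_pandas()
      .groupby("region")["revenue"]
      .sum()
)
print(revenue_by_region)
```

A row-oriented format would have to scan every column of every row to answer the same question, which is the core of the columnar advantage for analytics.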

I followed with my presentation on information architecture, focusing on the architecture components – specifically data warehouses and data lakes – where the ingest lands its data. I talked about the new full data stacks, which now include data engineering, data analytics, data science, data catalog, workload management, data movement, overarching deployment tools, and overarching security. I discussed the relative costs of the stack components and how the balance of analytics is shifting over time between the data warehouse and the data lake.

With intelligent ingest feeding a solid architecture, an enterprise puts its best foot forward toward business agility and efficient support for modern use cases.

About William McKnight

William McKnight functions as Strategist, Lead Enterprise Information Architect, and Program Manager for complex, high-volume, full life-cycle implementations worldwide, utilizing the disciplines of data warehousing, master data management, business intelligence, data quality and operational business intelligence. Many of his clients have gone public with their success stories. William is a Southwest Entrepreneur of the Year Finalist, a frequent best practices judge, has authored hundreds of articles and white papers and has given many international keynotes and public seminars. His team’s implementations from both IT and consultant positions have won Best Practices awards. William is a former IT VP of a Fortune company, a former DB2 engineer at IBM, and holds an MBA. William is the author of the book 90 Days to Success in Consulting. Follow him on Twitter at @williammcknight or via email at [email protected].