Inside Analysis

Archived webcasts

Research Webinar with The Bloor Group and Anaconda. As enterprises across all industries embrace the potential of data science and machine learning, they will inevitably face the challenges of operationalizing these processes in their technical environments. They will need to be strategic in choosing investments that align with their objectives. The question of building a bespoke solution versus purchasing a vendor’s platform that accommodates the needs of many is a decision that will be debated in earnest. If you’ve decided your business is ready for a Data Science and Machine Learning platform, join us for this interactive webinar. We’ll examine: -How to determine the baseline requirements for a Data Science platform -What to...    See details...

Adrift at Sea? Not with DataOps!

On: December 03, 2019
The Briefing Room with The Bloor Group and StreamSets. In an ocean of data, navigating your enterprise to success can pose serious challenges. As data sources proliferate and competition heats up around the globe, savvy organizations are turning to DataOps to get a handle on their information assets. By leveraging a control plane for data, these companies are able to harness information at-scale, providing a significant strategic advantage. Register for this episode of The Briefing Room to hear Bloor Group CEO Eric Kavanagh explain why a formal DataOps program is fast becoming a must-have for any data-driven company. He'll be joined by StreamSets VP of Product, Kirit Basu, who will share his company's vision for enabling the next genera...    See details...
Webinar with The Bloor Group and Microsoft. When you know better, you do better. That age-old mantra comes squarely into view today as organizations can now tap into tremendous new sources of information, never before readily available for analysis. Paramount among these? Workplace analytics! Combined with remarkable collaborative capabilities, this new form of behavioral analytics promises to revolutionize the visibility, granularity and scale of insights available to the modern business. Register for the first part of this new Webinar series to hear Bloor Group CEO Eric Kavanagh explain why the nexus of workplace analytics and collaboration creates a truly game-changing opportunity for decision-makers. He'll be joined by Danielle Gros...    See details...
The Briefing Room with The Bloor Group and Pepperdata. Everyone loves options! And these days, savvy analysts can benefit from a whole new world of computational power. While the rumors of on-prem's demise have been exaggerated, there are now multiple cloud platforms where analytics can be run. But what of the cost? The TCO? A survey of 350 decision-makers recently noted that Big Data in the Cloud turned out to be downright expensive. Register for this episode of The Briefing Room to hear Bloor Group CEO Eric Kavanagh and Senior Analyst Evren Sel Cakir explain why a new day is dawning for the power of analysis. They'll be joined by Kirk Lewis of Pepperdata, who will demonstrate innovative software for understanding and effectively mana...    See details...
Briefing Room with Eric Kavanagh and Sauce Labs. Whether you’re in retail, technology, finance, or anything in between, chances are yours is a digital business, the success of which depends on continuously delivering fast, visually appealing and functionally flawless web and mobile applications to your customers. Aiming to accelerate release cycles, improve the quality of their applications, and drive a better overall experience for their users, organizations large and small are investing heavily in DevOps. And yet, many of those same organizations are still struggling to achieve quality at speed. The missing ingredient for many? Open source. In this in-depth webcast, Joanna Schloss and Thomas Boyles of Sauce Labs will join us...    See details...
Many great ideas arrive before their time. But wait long enough, and the tumblers will align. That's the case with data virtualization, the practice of provisioning data in unconventional ways. Now nearing its 20th year, this innovative practice is roaring back thanks to a clever assembly of open-source technologies and proprietary techniques. The result? Seamless access to data across silos without ETL. Register for this episode of The Briefing Room to hear Bloor Group Senior Analyst Evren Sel Cakir explain how open-source software has fueled this next generation of data virtualization. He'll be joined by Ian Tinney of Gemini Data, who will explain how his technology was designed to provision both structured and unstructured data virtua...    See details...
Hosted by Eric Kavanagh (The Bloor Group). Sponsored by Syncsort. Today’s data-hungry analytics applications falter when they can’t get the right data at the right speed. Data pipelines are a critical element of success with analytics, yet building and operating data pipelines can be difficult. Today’s data management systems are complex, with data deployed across multiple cloud platforms and in hybrid architectures that combine multi-cloud with on-premises databases. Data pipeline challenges are compounded by a shortage of data engineers and increasing needs for real-time data. Analytics success and sustainability depend to a large degree on our ability to build data pipelines that are agile, automated, and accurate. ...    See details...
Hosted by Eric Kavanagh (The Bloor Group). Presentation by Wayne Eckerson (The Eckerson Group). Sponsored by Equalum. One of the biggest challenges facing organizations looking to modernize their data infrastructures and pipelines is how to ingest data from dozens, if not hundreds, of new and legacy systems and applications. Most want to adopt a streaming-first architecture, but many legacy systems and target applications still work in batch mode. Also, with large volumes of fast-moving data, it’s mandatory to update targets with changes, not refresh the entire database. But not all ingestion methods support change data capture. Finally, reconciling source data and preparing it for consumption requires a significant amount of transfo...    See details...
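The contrast drawn above, between refreshing an entire target and applying only the changes carried by change data capture, can be illustrated with a minimal sketch in plain Python. The table layout, event shape, and keys below are assumptions made for the example; this is not Equalum's or any vendor's implementation.

    # Minimal change-data-capture "apply" sketch (illustrative only).
    # Each CDC event carries an operation and the affected row; the target
    # is updated in place instead of being fully reloaded.

    target = {}  # hypothetical target table keyed by primary key

    cdc_events = [
        {"op": "insert", "key": 1, "row": {"name": "Ada", "balance": 100}},
        {"op": "update", "key": 1, "row": {"name": "Ada", "balance": 250}},
        {"op": "insert", "key": 2, "row": {"name": "Bob", "balance": 75}},
        {"op": "delete", "key": 2, "row": None},
    ]

    def apply_change(table, event):
        """Apply one insert/update/delete event to the target table."""
        if event["op"] in ("insert", "update"):
            table[event["key"]] = event["row"]
        elif event["op"] == "delete":
            table.pop(event["key"], None)

    for event in cdc_events:
        apply_change(target, event)

    print(target)  # {1: {'name': 'Ada', 'balance': 250}}

Only two rows ever touch the target here, regardless of how large the source history grows, which is the efficiency argument for CDC over full refreshes.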
Hosted by Eric Kavanagh (The Bloor Group). Presentation by Dave Wells (The Eckerson Group). Sponsored by Infoworks. Today’s analytics use cases are highly data dependent. Analytic models, and especially AI/ML models, are data-hungry. They need a reliable supply of trustworthy data to be useful and sustainable. Today’s data management systems are complex and challenging, with data deployed across multiple cloud platforms in combination with on-premises databases. Multiple platforms and data silos are barriers to fast and reliable data supply for analytics. The dilemma for modern analytics is this paradox: We need more data and more kinds of data for advanced analytics. But more data and more kinds of data create barriers to operat...    See details...
Software may be eating the world, but only if it has access to the right data! Today's innovators appreciate the critical importance of fueling their applications with the right information at the right time. Environments are no longer purely on-prem; they combine on-prem and cloud, with major shifts not only to cloud-based databases but also to a myriad of SaaS applications that store important enterprise data. Connecting to so many systems in so many places calls for a whole new approach to managing data access. Register for this episode of The Briefing Room to hear veteran Analyst William McKnight explain why data access must go beyond the traditional point-to-point architectures of yesteryear. He'll be joined by Nadia Dobriansk...    See details...
The long-fought battle of business and IT is finally giving way to a new era of cooperation, largely thanks to the maturation of self-service analytics. By providing analysts with the tools they need to prepare data efficiently, technical teams can focus on other challenges. The biggest winners are domain experts, who now spend the majority of their time tackling the issues that yield results for the business. Register for this episode of The Briefing Room to hear industry analysts Eric Kavanagh and Donald Farmer explain why the self-service vision is finally coming to fruition. They will be joined by Matt Derda of Trifacta, who will explain how data preparation techniques are meeting the dual objectives of educating analysts about thei...    See details...
DataOps promises to accelerate the process of creating and modifying data pipelines while simultaneously improving quality. DataOps will turn what has historically been a hand-crafted discipline into a lights-out, automated data environment that speeds delivery, improves customer satisfaction, and generates business value. But DataOps is a big tent that borrows principles from DevOps, Lean, Agile, and Total Quality Management methodologies. And while DataOps is focused on repeatable operational processes, there is corresponding automation technology helping to accelerate many of these processes. The challenge is absorbing these new processes and new technology while the underlying execution (Hadoop, Spark, Databricks, etc.) and...    See details...
How do Fortune 100 companies optimize their big data applications? They gather metrics from more than 300 data sources every five seconds, then use machine learning to scan incredibly complex systems, across on-prem and cloud environments. The insights gleaned allow them to identify mission-critical issues in near-real-time, and then resolve them, sometimes automatically, to keep the big data trains running on time. Register for this episode of The Briefing Room to hear Bloor Group CEO Eric Kavanagh explain why success with big data requires deep visibility into the wide array of scale-out technologies. He'll be joined by Kirk Lewis of Pepperdata, who will discuss how his company's platform collects tremendous amounts of data to acceler...    See details...
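The pattern described above, polling many sources on a short interval and flagging problems from the resulting metrics, can be sketched in a few lines of plain Python. The source names, the memory threshold, and the random values are illustrative assumptions for the example, not Pepperdata's implementation.

    import random
    import time

    # Hypothetical metric sources; a real collector would query hundreds of
    # hosts, containers, and frameworks instead of generating random values.
    SOURCES = ["node-1", "node-2", "node-3"]
    MEMORY_THRESHOLD = 0.9  # alert when memory utilization exceeds 90%

    def collect_metrics():
        """Poll every source once and return {source: memory_utilization}."""
        return {source: random.random() for source in SOURCES}

    def check(metrics):
        """Return the sources whose utilization crosses the threshold."""
        return [s for s, value in metrics.items() if value > MEMORY_THRESHOLD]

    for _ in range(3):          # a real collector would loop indefinitely
        for source in check(collect_metrics()):
            print(f"ALERT: {source} memory utilization above threshold")
        time.sleep(5)           # the five-second cadence mentioned above

Near-real-time visibility in this sense is simply a tight collect-evaluate-act loop; the engineering challenge is doing it across hundreds of sources without adding overhead to the systems being measured.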
The rise of big data, data lakes and the cloud, coupled with increasingly stringent enterprise requirements, is reinventing the role of data warehousing in modern analytics ecosystems. The emerging generation of data warehouses is more flexible, agile and cloud-based than its predecessors, with a strong need for automation and real-time data integration. Join this live webinar to learn: -Typical requirements for data integration -Common use cases and architectural patterns -Guidelines and best practices to address data requirements -Guidelines and best practices to apply architectural patterns    See details...
Over the past few years, the revolution of technology for storing, processing and analyzing data has been staggering. Businesses now have the ability to work with data at a scale and speed that many of us would have never thought possible. So why are so many organizations still struggling to drive meaningful ROI from their analytics and AI investments? The answer starts with the quality of your data. Poor management of the flow of data – raw, diverse, and frequently unstructured – can quickly turn analytics and AI initiatives into failures. As the saying goes: garbage in, garbage out. Without good data preparation technologies and practices optimized for data quality, AI and analytics investments fail to meet the necessary prerequis...    See details...
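As a small illustration of the "garbage in, garbage out" point, a minimal data preparation step might validate records before they ever reach an analytics or AI pipeline. The field names and rules below are assumptions made for the sake of the example, not a specific product's behavior.

    # Minimal data-quality gate (illustrative): reject records with missing
    # required fields or out-of-range values before they feed analytics.

    REQUIRED_FIELDS = ("customer_id", "amount")

    def is_valid(record):
        if any(record.get(field) is None for field in REQUIRED_FIELDS):
            return False
        return record["amount"] >= 0   # simple range rule

    raw_records = [
        {"customer_id": "c-1", "amount": 19.99},
        {"customer_id": None,  "amount": 5.00},   # missing key -> rejected
        {"customer_id": "c-2", "amount": -3.50},  # bad value   -> rejected
    ]

    clean = [r for r in raw_records if is_valid(r)]
    rejected = [r for r in raw_records if not is_valid(r)]
    print(f"{len(clean)} clean, {len(rejected)} rejected")

Real data preparation tooling adds profiling, deduplication, and standardization on top of this, but the principle is the same: quality is enforced in the flow, not patched after the model has already consumed bad data.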
DataOps is an emerging set of practices, processes, and technologies for building and automating data pipelines to meet business needs quickly. As these pipelines become more complex and development teams grow in size, organizations need better collaboration and development processes to govern the flow of data and code from one step of the data lifecycle to the next – from data ingestion and transformation to analysis and reporting. DataOps is not something that can be implemented all at once or in a short period of time. DataOps is a journey that requires a cultural shift. DataOps teams continuously search for new ways to cut waste, streamline steps, automate processes, increase output, and get it right the first time. The goal is to...    See details...
In the past, an entire organization could be supported with a single data pipeline: the data warehouse. Today, organizations are building multiple pipelines to support dozens of use cases, from reporting and self-service analytics to data science, machine learning, and real-time, customer-facing applications. Managing multiple data pipelines is a complex process that is further complicated by more data sources and targets than ever before. Adding to the management challenge, the range of data technologies available to collect, transform, manage, and deliver data has exploded and continues to evolve. This webcast examines the characteristics of modern data pipelines and provides a framework for building and managing them. It explores th...    See details...
Remember the early days of cellular telephones, when a call routinely started with a few minutes of “Can you hear me now?” Each of those minutes cost the caller without contributing to the desired communication. Today we have a similar situation with analytics when meetings routinely begin with “Where did you get this data?” The time taken when several people—generally high-level and high-salary people—spend several minutes questioning data trustworthiness has a very real cost and contributes little to knowledge, understanding, insight, or decision making. Trusted data is a critical element of analytic culture and data-driven business. Companies today consider data a key asset that influences every aspect of business from st...    See details...
In the realm of complex analysis, rarely does one source of data provide everything the analyst needs. Data Warehouses were designed to pull data from multiple sources, to enable that kind of cross-system discovery. But that traditional model typically required stripping the data of significant context, essentially watering down the end result, and at times obfuscating the most meaningful facets. Thanks to several advances in real-time data exploration, companies can now access raw data where it lives, and begin the analysis process often within seconds of connecting to a source. And new innovations allow for multi-source analytics, where disparate systems can be accessed simultaneously, allowing real-time discovery across multiple sourc...    See details...
Synthesis Webcast with Eric Kavanagh and Eckerson Group. Artificial intelligence (AI) is making inroads into every imaginable product and service, including data analytics, which spawned it. Vendors are now racing to implement AI capabilities, including natural language queries, automated insights, natural language generation, intelligent data preparation, and auto-visualizations. The question now is how AI will reshape business intelligence (BI). Will AI eliminate the need for traditional reports and dashboards or will it enhance those delivery methods, making them more, rather than less, valuable? Will AI generate the right answers in response to spoken or written queries and automatically unearth related drivers and dimensions? Or will i...    See details...
Modernizing data management is on everyone’s mind today. Making the shift from data management practices of the BI era to modern data management is essential but it is also challenging. Whether you’re updating the back end by migrating your data warehouses to the cloud or advancing the front end with a shift from legacy BI tools to self-service analysis and visualization, it is critical to know the data that you have and to understand data lineage. Data inventory, data glossary, and data lineage are all metadata dependent. But legacy BI metadata is typically proprietary, non-integrated, and collected inconsistently by a variety of disparate tools. The metadata muddle is a serious inhibitor to modernization efforts. Metadata consolidatio...    See details...
Organizations today are building strategic applications using a wealth of internal and external data. For example, they are building Customer 360 applications that combine customer data from multiple business channels, including stores, online, social media, and third party demographic data. They are deploying e-commerce applications that offer personalized shopping experiences and dynamic recommendations using a deep history of customer transactions, interactions, and observations. They are performing proactive maintenance by predicting failures in manufacturing equipment. And the list goes on. Unfortunately, data-driven applications fail for many reasons and identifying the cause and finding a fix is challenging and time-consuming. Thi...    See details...
Seeing is believing, and that's especially true in today's world of far-flung enterprise data. Organizations today need to innovate, adapt and scale quickly, which often leads them to modernize their data infrastructure through cloud migration and to leverage progressive cloud-based tools. But with critical information assets spread across an ever-expanding array of on-premises and cloud environments, how can you effectively manage this dynamic new reality? Check out this episode of The Briefing Room to hear Bloor Group CEO Eric Kavanagh explain the significance of data catalogs for the future sanity of information professionals everywhere. He'll be joined by Peter Princen of Collibra, who will discuss the rise of 'data citizens' ...    See details...
The race is on! Companies the world over are fast realizing the remarkable power of predictive analytics. But getting effective models designed and deployed typically takes a lot of time, effort and cost. That's now changing, largely due to the combination of automation and machine learning. How can your company get the upper hand? Check out this episode of The Briefing Room to hear veteran Analyst Dr. Peter Went explain why we're entering a whole new phase of enterprise intelligence. He'll be joined by Greg Michaelson of DataRobot, who will demonstrate how the design and deployment of predictive models can be automated these days, enabling companies to significantly expand the business benefit of their data. He’ll showcase an array of...    See details...
The power of analytics cannot be denied, but the immediacy of effective action tells the tale of success. Solutions that enable data-driven decisions tend to produce positive results in proportion to how quickly and efficiently professionals can change a business process upon finding relevant insights. The longer that latency, the lower the value. That's why operational intelligence can provide such bang for the buck: the benefits can accrue right away. Register for this special episode of The Briefing Room to hear Bloor Group CEO Eric Kavanagh explain why analytics is no longer just for the elite. He'll demonstrate why "BI for the Masses" has finally arrived, and what that means for your organization. He'll be joined by Shane Swiderek...    See details...
Every so often, a sea change occurs in the world of technology. The dynamics fundamentally change, and innovators carve out new ways of doing business. The Cloud certainly represents such a transformative power, but so does the rise of containers. In a few short years, containers have evolved into a formidable toolset for enterprise IT, especially in the domain of hybrid cloud. Register for this episode of How Data Works to hear Bloor Group Data Scientist, Dr. Geoffrey Malafsky, explain how containers will fuel the great migration from on-prem to hybrid and multi-cloud scenarios. He'll be joined by Shane Kumpf and Billie Rinaldi of Hortonworks, who will demonstrate how containers can play an integral role in the next generation of busine...    See details...
If it's worth doing, it's worth doing right! That maxim applies to all manner of business solutions, but it's especially applicable to data governance. Traditionally, governing data was mostly done at the source (database) or application layer (edge), but both options were inefficient for different reasons. The hard truth is that actual data governance requires a comprehensive solution that reaches across and through all significant information systems. Register for this episode of The Briefing Room to hear Bloor Group GRC Advisor, Dr. Peter Went, explain why "edge-to-edge" data governance is a recipe for success. He'll be briefed by Danny Sandwell of erwin, who will explain how his company's recent acquisitions have enabled enterprise...    See details...
Everyone wants more data these days, and who can blame them? The insight economy is now in full swing. But despite widespread innovation in analytics and big data processing, the nuts and bolts of accessing data haven't changed in 20 years... until now. Register for this special episode of The Briefing Room to learn about a new way of connecting information systems and sources that is faster, more reliable, and less burdensome for your internal teams. Veteran Analyst Mike Ferguson will explain why a new approach to data connectivity is required for success in the Information Economy. He'll be joined by Magnitude Software General Manager Tony Fisher, alongside Senior Product Manager, Craig Chaplin, who will showcase Magnitude Gateway, a new platform ...    See details...
The tipping point has already passed, and the major analyst firms agree: The future is hybrid cloud, and it's already here. The question now is how to manage this multifaceted world of enterprise computing. Central to the strategy of most forward-looking companies is a data fabric, which provides the backbone for a new generation of business applications that are fast, scalable, and secure. What does all that look like? Register for this episode of 'How Data Works' to hear veteran Data Scientist Dr. Geoffrey Malafsky explain why a data fabric is central to this next era of computing. He'll be joined by Scott Clinton of Hortonworks, who will share his company's vision, which will support all kinds of applications, including classic stateful, an...    See details...
Many companies today want to embed reports and analytics into applications. Embedded BI delivers critical data to business users in the context of their core applications. It also improves the value of customer-facing applications, addressing one of the key priorities in the market today: customer experience. Register for this special Eckerson Group Webinar to learn about the growing trend of embedding analytics into operational systems. Analyst Wayne Eckerson will offer insights from his firm's years of research into this space. He'll be joined by Gene Arnold of TIBCO Jaspersoft, who will demonstrate how his company enables agility by embedding valuable analytics into operational systems.    See details...
Most BI products now incorporate machine learning to automate manual tasks, such as blending data, crafting queries, generating dashboards, performing root cause analysis, conducting simulations, and finding related reports. Most also run imported predictive models and perform regressions out of the box. The question is, does all this artificial intelligence make BI users more or less productive? Will it increase or decrease their analytical IQ as they become dependent on the machine for answers? Will users trust machine output over their own analysis or that of a trusted colleague? This webcast will explore these and other issues emanating from the new era of Smart Analytics. What You Will Learn: -Will AI enhance BI or create new p...    See details...
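To ground the claim that BI tools now "perform regressions out of the box", here is what such a building block amounts to underneath, shown as a minimal sketch in plain Python/NumPy rather than any vendor's feature. The ad-spend and revenue numbers are made up for the example.

    import numpy as np

    # Minimal ordinary least squares fit (illustrative, not a BI product API).
    # x: monthly ad spend, y: revenue -- invented numbers for the example.
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.1, 4.3, 6.2, 8.1, 9.9])

    # Solve y ~= slope * x + intercept via least squares.
    A = np.vstack([x, np.ones_like(x)]).T
    slope, intercept = np.linalg.lstsq(A, y, rcond=None)[0]

    print(f"slope={slope:.2f}, intercept={intercept:.2f}")
    # A BI tool would surface this same kind of fit as a trend line or forecast.

The productivity question raised above is not whether the machine can compute such a fit, but whether users understand what it does and does not tell them.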

Is AI the New BI?

On: November 14, 2018
Thanks to artificial intelligence (AI), prediction and prescription are fast eclipsing description (i.e. business intelligence) as the most sought-after data deliverable. The question now is whether AI will replace BI or complement it. Is there a separate and distinct role for BI apart from AI? Or will the two disciplines merge into an AI-infused analytic application? This webcast will track the emergence of AI and debate its role within the spectrum of analytic capabilities. It will discuss how BI and analytics vendors are ushering in the AI era by incorporating rules and machine learning into BI tools. The goal of AI-infused BI is to automate routine tasks, make self-service easier, and improve data literacy. However, once AI starts co...    See details...
Jim Barksdale, former CEO at Netscape, once said, “If we have data, let’s look at data. If all we have is opinions, let’s go with mine.” The message is clear, but having data is only the beginning. Your analysts need to find the right data, understand the data, and deliver timely insights at reasonable cost. That’s where data catalogs make a real difference. The value and benefits of a data catalog are often described as the ability for analysts to find the data that they need quickly and efficiently. Data cataloging accelerates analysis by minimizing the time and effort that analysts spend finding and preparing data. Anecdotally, it is said that 80% of self-service analysis without a data catalog is spent getting the data ready ...    See details...
Metadata remains the cornerstone of information management. It's not just the glue that holds systems together; it enables practically every transaction in today's data-driven enterprise. From reporting and analysis to governance and security, metadata holds the key to accuracy and context. But manual management of this critical resource can cause serious issues. That's why automated metadata management is now taking center stage. Register for this episode of The Briefing Room to hear veteran Data Scientist, Dr. Geoffrey Malafsky explain why metadata is more critical than ever in the age of big data. He'll be joined by Amnon Drori of Octopai, who will demonstrate how his company's software is revolutionizing the once-unwieldy process of ...    See details...
Recent developments in data management, big data, data lakes, NoSQL, Hadoop, and the cloud enable businesses to gain more insights faster. This new age of data warehousing places a strong emphasis on self-service and global access to data, raising new challenges. How can we optimize workloads originating from self-service applications? How can we maintain data warehouse health as it is leveraged by more diverse analytics? How can we ensure scalability without having to increase IT resources? To enable self-service and modernize with confidence, modern tools will be needed to increase visibility into workloads, automate health checks and workload tuning, and introduce self-service troubleshooting to the user community. You will learn: -The b...    See details...
The emergence of big data analytics and the adoption of data lake architecture cause many to question the future of data warehousing. Yet recent surveys show that more than 60% of companies are still operating between 2 and 5 data warehouses. Many people have talked about eliminating the data warehouse altogether. But the reality is that the data warehouse offers value that the data lake doesn’t address, and vice versa. The real challenge is how to combine the data warehouse with a data lake architecture, modern data pipelines, and analytics use cases. Join us to learn how automation and agile data engineering step up to the challenges of data warehouse modernization. You Will Learn: -The key challenges of legacy data warehousing ...    See details...
The modern world of data won't fit into a lake, let alone a warehouse. Valuable data sources have proliferated to the point where the concept of a centralized repository for all enterprise data no longer makes much sense. The key today is visibility into -- and across -- vast data sets, combined with the ability to visualize and query disparate data at the speed of business. Register for this episode of The Briefing Room to hear Bloor Group CEO Eric Kavanagh explain how the world of business intelligence has changed, and why a new approach is necessary. He'll be joined by Ruhollah Farchtchi and Mary Flynn of Zoomdata, who will demonstrate how their company's platform was designed from the ground up to handle the speed, variety and comple...    See details...
Rome wasn’t built in a day, but its architects understood the need for long-term strategic planning. The same holds true for modern data architectures, which must durably span on-prem data centers and the ever-expanding array of cloud providers. That's no small challenge in a world burgeoning with big data, vexed by global hackers, pressed by expanding regulations, and driven by the demands of both public and private companies in highly competitive markets. Register for this inaugural episode of How Data Works to learn from Hortonworks VP of Marketing and Sales Scott Clinton, who will explain why smart information architecture must underpin the creation and management of an enterprise data fabric. He'll be joined by Bloor Group CEO an...    See details...
Modern data pipelines are more complex than traditional data warehousing pipelines. They must support a multiplicity of data sources, data types, and use cases; they must be designed and deployed in an agile manner; they must support batch, mini-batch, and streaming data flows; and they must meet service level agreements for data quality, availability, and consistency. Many organizations are turning to data catalogs to support modern data pipelines. Data catalogs can automate the tagging and organization of ingested content and derived data sets, harmonize data flowing through multiple new and existing pipelines, support data curation processes, and enable business users to explore content for analysis. This Webinar will explore the rol...    See details...
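A hedged sketch of the automated tagging idea mentioned above: when a data set is ingested, a catalog can derive tags from its schema so analysts can later find it by search. The column names, tag rules, and catalog structure below are illustrative assumptions, not any particular catalog product's behavior.

    # Illustrative auto-tagging of an ingested data set based on its columns.

    TAG_RULES = {
        "email": "pii",
        "ssn": "pii",
        "revenue": "finance",
        "order_id": "sales",
    }

    catalog = {}  # hypothetical catalog: data set name -> set of tags

    def register(dataset_name, columns):
        """Record a data set and attach tags inferred from column names."""
        tags = {tag for col in columns
                for keyword, tag in TAG_RULES.items() if keyword in col.lower()}
        catalog[dataset_name] = tags or {"untagged"}

    register("orders_2019", ["order_id", "customer_email", "revenue"])
    register("click_log", ["session", "url", "timestamp"])

    print(catalog)
    # {'orders_2019': {'sales', 'pii', 'finance'}, 'click_log': {'untagged'}}

In practice the rules are richer (profiling values, lineage, usage statistics), but the payoff is the same: curation happens as data flows through the pipeline rather than as a separate manual project.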
Timing really is everything, especially in a highly connected world. As the Internet of Things continues to reshape every industry, today's most forward-thinking organizations are discovering the remarkable power of edge computing. By capturing, analyzing and acting on data at the edge, these innovators are solving business challenges as they arise, and capitalizing on opportunities in near-real-time. The critical success factor is a robust, optimized information architecture. Register for this episode of How Data Works to hear Bloor Group CEO Eric Kavanagh explain how the combination of edge computing and real-time intelligence creates a whole new solution set for organizations to leverage. He'll be joined by Dinesh Chandrasekhar from H...    See details...
The results are in, and businesses around the world concur: data is the key to improving user experience, the #1 concern of senior managers globally. More to the point, data that leads to valuable customer insights is mission critical. However, most companies admit they lack the necessary tools, technologies and best practices to consistently deliver trusted data to stakeholders. What's a data-driven organization to do? Register for this episode of The Briefing Room to learn how today's innovative enterprises orchestrate increasingly powerful technologies and methods for ensuring the delivery of timely, trusted insights. The host will be joined by Erin Haselkorn of Experian, who will share results from a recent survey that canvassed a wide range...    See details...
Architecture matters. That's why today's innovators are taking a hard look at streaming data, an increasingly attractive option that can transform business in several ways: replacing aging data ingestion techniques like ETL; solving long-standing data quality challenges; improving business processes ranging from sales and marketing to logistics and procurement; or any number of activities related to accelerating data warehousing, business intelligence and analytics. Register for this DM Radio Deep Dive Webinar to learn how streaming data can rejuvenate or supplant traditional data management practices. Host Eric Kavanagh will explain how streaming-first architectures can relieve data engineers from time-consuming, error-prone processes, ...    See details...
The transformative power of machine learning cannot be denied, but achieving business value has proven elusive for many organizations. There are several key challenges in the process: a) deciding on the right process to augment; b) determining the optimal algorithms for the use case; c) gathering enough quality data to train the algorithms; and, last but not least, d) operationalizing the results of machine learning. Register for this episode of The Briefing Room to hear Bloor Group CEO Eric Kavanagh explain why machine learning projects must embrace the end-to-end process in order to be effective. He'll be joined by Harish Doddi of Datatron, who will share his company's recent experiences in deploying machine learning modules for sever...    See details...
Popularized by Facebook, Google and LinkedIn, graph technology now enjoys widespread recognition in the field of data analytics and discovery. Unlike relational or column-store databases, graph solutions are optimized for understanding relationships between entities, which greatly facilitates the analysis of vast data sets. In an increasingly connected world, this kind of technology holds tremendous promise for solving complex business challenges in real-time. Register for this episode of The Briefing Room to hear Bloor Group CEO Eric Kavanagh explain how a wave of innovation in graph technology is reshaping the world of analytics at-scale. He'll be joined by Gaurav Deshpande of TigerGraph, who will share details of his company's techni...    See details...
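The relationship-centric point above can be made concrete with a minimal sketch: in a graph model, "who is connected to whom within N hops" is a simple traversal rather than a chain of relational joins. The adjacency list and names below are made up for the example; this is plain Python, not TigerGraph's API.

    from collections import deque

    # Tiny social graph as an adjacency list (illustrative data).
    graph = {
        "alice": ["bob", "carol"],
        "bob": ["dave"],
        "carol": ["dave", "erin"],
        "dave": [],
        "erin": [],
    }

    def within_hops(start, max_hops):
        """Breadth-first traversal: entities reachable within max_hops."""
        seen, frontier = {start}, deque([(start, 0)])
        while frontier:
            node, depth = frontier.popleft()
            if depth == max_hops:
                continue
            for neighbor in graph.get(node, []):
                if neighbor not in seen:
                    seen.add(neighbor)
                    frontier.append((neighbor, depth + 1))
        return seen - {start}

    print(within_hops("alice", 2))  # {'bob', 'carol', 'dave', 'erin'}

Answering the same question in a relational store typically means one self-join per hop, which is why graph engines tend to win as the number of hops and the size of the data grow.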
Traditional data warehousing has long used batch jobs to move, load and transform data for decision making. But as data volumes rise and the velocity of business grows, more organizations are opting to move and process data in real-time or near real-time. Batch processing is giving way to mini-batches fueled by replication and change data capture as well as stream processing in which events are captured, processed, and analyzed as they happen. Many companies today have a mix of batch, mini-batch, and stream-based processing. The question is whether organizations should embrace streaming as the default mode of data acquisition. Several vendors are now pitching streaming-first architectures and extolling the benefits of processing data i...    See details...
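The batch-versus-streaming distinction described above can be sketched in a few lines: a batch job recomputes a total over the full history it has accumulated, while a streaming consumer updates the same figure as each event arrives. This is a plain-Python illustration under assumed event shapes, not any vendor's architecture.

    # Illustrative contrast: batch recomputation vs. streaming incremental update.

    events = [
        {"ts": 1, "amount": 10},
        {"ts": 2, "amount": 25},
        {"ts": 3, "amount": 5},
    ]

    # Batch: periodically scan everything accumulated so far.
    def batch_total(all_events):
        return sum(e["amount"] for e in all_events)

    # Streaming: maintain running state and update it per event as it happens.
    running_total = 0
    for event in events:          # stands in for consuming a live stream
        running_total += event["amount"]
        print(f"after ts={event['ts']}: running total = {running_total}")

    assert running_total == batch_total(events)  # same answer, different latency

Both approaches arrive at the same number; the difference is when the answer is available and how much work is redone, which is the crux of the streaming-first argument.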
The problem with traditional data pipelines based on extract, transform, and load (ETL) tools that populate data warehouses and data marts is that power users quickly bump up against their dimensional boundaries. To answer urgent questions, they are forced to download data from data warehouses into Excel or another desktop tool and combine it with data acquired elsewhere. The result is a suboptimal spreadmart. Today, organizations build modern data pipelines to support a variety of use cases. Besides data warehouses, modern data pipelines generate data marts, data science sandboxes, data extracts, data science applications, and various operational systems. These pipelines often support both analytical and operational applications, structure...    See details...
The data lake is a repository of corporate data. So how is that different from a data warehouse? Where does a data lake leave off and a data warehouse begin, and vice versa? Are these mutually exclusive environments or long-lost companions in a future data nirvana? Using our point-counterpoint approach, a team of seasoned data experts will debate the relationship between data lakes and data warehouses and come to a consensus opinion. This Socratic approach will help you understand: -How a data lake differs from a data warehouse -What workloads belong in a data lake versus a data warehouse -How the dividing line between the two environments is getting fuzzier -How to build a modern analytics ecosystem that leverages the best of data l...    See details...
As the information economy plows forward, successful organizations are harnessing many more sources of increasingly diverse data. The new center of gravity is cloud, with thousands of web-based applications providing highly focused business value, and generating useful data in the process. As such, the modern data warehouse has evolved dramatically: it's faster, more dynamic, and requires a much more agile solution for ingesting data at-scale. Register for this episode of Synthesis to hear Bloor Group CEO Eric Kavanagh examine the evolving role of data in a cloud-enabled world. He'll be joined by Dewayne Washington of Eckerson Group who will outline ways to accelerate the design and deployment of modern data solutions. And Taylor Brown ...    See details...
The information supply chain now extends around the globe, connecting people and organizations in truly remarkable ways. Today's most forward-looking companies are harnessing diverse data sets to create highly contextualized products and services, while optimizing workflows to expedite production and delivery. By blending a critical mass of real world data, these organizations are benefiting from a new kind of intelligence which can lead to tremendous business value. Register for this inaugural episode of Real World Data to hear Analyst Eric Kavanagh explain why this 'new intelligence' will fundamentally change business as usual. He'll be joined by Helen Arnold, President of the SAP Data Network, who will demonstrate how her company's vi...    See details...
The data warehouse promised to deliver a single version of truth. But skeptics abound, saying a single version of truth is a mirage and not necessary. Yet, how can an organization exist as a holistic entity if it doesn’t have a common set of metrics and data for capturing, analyzing, and interpreting business activity? Using our point-counterpoint approach, a team of seasoned data experts will debate the conceptual and technical feasibility of the single version of truth and come to a consensus opinion. This Socratic approach will help you understand: -Why a single version of truth (SVOT) is still required -When an organization can ignore the SVOT imperative -How to balance SVOT and MVOT (multiple versions of truth) within a singl...    See details...

Bloor Group research

Multimedia Research Program by The Bloor Group. What's bigger than big data? Artificial intelligence! That's because AI will soon transform practically every business process in the world, in part due to the abundance of big data, and specifically ...
Data may be the new oil, but every sword cuts both ways. Just as with any valuable asset, there are liabilities associated with data. The key to success in today's world is to ensure that you have sound policies in place for managing data, especi...
Database is the new black. Ever the backbone of information management architectures, database technology continually evolves to meet growing and changing business needs. New types of data and applications make the database more important than ever, ...
A sea change is underway. The old constraints are fading, while a bold new vision unfolds. The Data Lake represents more than another type of information reservoir. It embodies a fundamental change in the data management industry itself. Tools, techn...
This primary research project, conducted by David Loshin and Abie Reifer of DecisionWorx in conjunction with The Bloor Group, will examine the real-world use of technologies and solutions that comprise the Hadoop/Big Data Management enterprise enviro...

Infographics