Inside Analysis

Archived webcasts

DataOps promises to accelerate the process of creating and modifying data pipelines while simultaneously improving quality. DataOps will turn what has historically been a hand-crafted discipline into a lights-out, automated data environment that speeds delivery, improves customer satisfaction, and generates business value. But DataOps is a big tent that borrows principles from DevOps, Lean, Agile, and Total Quality Management methodologies. And while DataOps is focused on repeatable operational processes, corresponding automation technology is helping accelerate many of those processes. The challenge is absorbing these new processes and new technology while the underlying execution (Hadoop, Spark, Databricks, etc.) and...    See details...
How do Fortune 100 companies optimize their big data applications? They gather metrics from more than 300 data sources every five seconds, then use machine learning to scan incredibly complex systems, across on-prem and cloud environments. The insights gleaned allow them to identify mission-critical issues in near-real-time, and then resolve them, sometimes automatically, to keep the big data trains running on time. Register for this episode of The Briefing Room to hear Bloor Group CEO Eric Kavanagh explain why success with big data requires deep visibility into the wide array of scale-out technologies. He'll be joined by Kirk Lewis of Pepperdata, who will discuss how his company's platform collects tremendous amounts of data to acceler...    See details...
The rise of big data, data lakes, and the cloud, coupled with increasingly stringent enterprise requirements, is reinventing the role of data warehousing in modern analytics ecosystems. The emerging generation of data warehouses is more flexible, agile, and cloud-based than its predecessors, with a strong need for automation and real-time data integration. Join this live webinar to learn: -Typical requirements for data integration -Common use cases and architectural patterns -Guidelines and best practices to address data requirements -Guidelines and best practices to apply architectural patterns REGISTER...    See details...
Over the past few years, the revolution of technology for storing, processing and analyzing data has been staggering. Businesses now have the ability to work with data at a scale and speed that many of us would have never thought possible. So why are so many organizations still struggling to drive meaningful ROI from their analytics and AI investments? The answer starts with the quality of your data. Poor management of the flow of data – raw, diverse, and frequently unstructured – can quickly turn analytics and AI initiatives into failures. As the saying goes: garbage in, garbage out. Without good data preparation technologies and practices optimized for data quality, AI and analytics investments fail to meet the necessary prerequis...    See details...
DataOps is an emerging set of practices, processes, and technologies for building and automating data pipelines to meet business needs quickly. As these pipelines become more complex and development teams grow in size, organizations need better collaboration and development processes to govern the flow of data and code from one step of the data lifecycle to the next – from data ingestion and transformation to analysis and reporting. DataOps is not something that can be implemented all at once or in a short period of time. DataOps is a journey that requires a cultural shift. DataOps teams continuously search for new ways to cut waste, streamline steps, automate processes, increase output, and get it right the first time. The goal is to...    See details...
In the past, an entire organization could be supported with a single data pipeline: the data warehouse. Today, organizations are building multiple pipelines to support dozens of use cases, from reporting and self-service analytics to data science, machine learning, and real-time, customer-facing applications. Managing multiple data pipelines is a complex process that is further complicated with more data sources and targets than ever before. Adding to the management challenge, the range of data technologies available to collect, transform, manage, and deliver data has exploded and continues to evolve. This webcast examines the characteristics of modern data pipelines and provides a framework for building and managing them. It explores th...    See details...
Remember the early days of cellular telephones, when a call routinely started with a few minutes of “Can you hear me now?” Each of those minutes cost the caller without contributing to the desired communication. Today we have a similar situation with analytics when meetings routinely begin with “Where did you get this data?” The time taken when several people—generally high-level and high-salary people—spend several minutes questioning data trustworthiness has a very real cost and contributes little to knowledge, understanding, insight, or decision making. Trusted data is a critical element of analytic culture and data-driven business. Companies today consider data a key asset that influences every aspect of business from st...    See details...
In the realm of complex analysis, rarely does one source of data provide everything the analyst needs. Data Warehouses were designed to pull data from multiple sources, to enable that kind of cross-system discovery. But that traditional model typically required stripping the data of significant context, essentially watering down the end result, and at times obfuscating the most meaningful facets. Thanks to several advances in real-time data exploration, companies can now access raw data where it lives, and begin the analysis process often within seconds of connecting to a source. And new innovations allow for multi-source analytics, where disparate systems can be accessed simultaneously, allowing real-time discovery across multiple sourc...    See details...
Synthesis Webcast with Eric Kavanagh and Eckerson Group Artificial intelligence (AI) is making inroads into every imaginable product and service, including data analytics, which spawned it. Vendors are now racing to implement AI capabilities, including natural language queries, automated insights, natural language generation, intelligent data preparation, and auto-visualizations. The question now is how AI will reshape business intelligence (BI). Will AI eliminate the need for traditional reports and dashboards, or will it enhance those delivery methods, making them more, rather than less, valuable? Will AI generate the right answers in response to spoken or written queries and automatically unearth related drivers and dimensions? Or will i...    See details...
Modernizing data management is on everyone’s mind today. Making the shift from data management practices of the BI era to modern data management is essential but it is also challenging. Whether you’re updating the back end by migrating your data warehouses to the cloud or advancing the front end with a shift from legacy BI tools to self-service analysis and visualization, it is critical to know the data that you have and to understand data lineage. Data inventory, data glossary, and data lineage are all metadata dependent. But legacy BI metadata is typically proprietary, non-integrated, and collected inconsistently by a variety of disparate tools. The metadata muddle is a serious inhibitor to modernization efforts. Metadata consolidatio...    See details...
Organizations today are building strategic applications using a wealth of internal and external data. For example, they are building Customer 360 applications that combine customer data from multiple business channels, including stores, online, social media, and third party demographic data. They are deploying e-commerce applications that offer personalized shopping experiences and dynamic recommendations using a deep history of customer transactions, interactions, and observations. They are performing proactive maintenance by predicting failures in manufacturing equipment. And the list goes on. Unfortunately, data-driven applications fail for many reasons and identifying the cause and finding a fix is challenging and time-consuming. Thi...    See details...
Seeing is believing, and that's especially true in today's world of far-flung enterprise data. Organizations today need to innovate, adapt and scale quickly, which often leads them to modernize their data infrastructure through cloud migration and progressive cloud-based tools. But with critical information assets spread across an ever-expanding array of on-premises and cloud environments, how can you effectively manage this dynamic new reality? Check out this episode of The Briefing Room to hear Bloor Group CEO Eric Kavanagh explain the significance of data catalogs for the future sanity of information professionals everywhere. He'll be joined by Peter Princen of Collibra who will discuss the rise of 'data citizens' ...    See details...
The race is on! Companies the world over are fast realizing the remarkable power of predictive analytics. But getting effective models designed and deployed typically takes a lot of time, effort and cost. That's now changing, largely due to the combination of automation and machine learning. How can your company get the upper hand? Check out this episode of The Briefing Room to hear veteran Analyst Dr. Peter Went explain why we're entering a whole new phase of enterprise intelligence. He'll be joined by Greg Michaelson of DataRobot, who will demonstrate how the design and deployment of predictive models can be automated these days, enabling companies to significantly expand the business benefit of their data. He’ll showcase an array of...    See details...
The power of analytics cannot be denied, but the immediacy of effective action tells the tale of success. Solutions that enable data-driven decisions tend to produce positive results in proportion to how quickly and efficiently professionals can change a business process upon finding relevant insights. The longer the latency, the lower the value. That's why operational intelligence can provide such bang for the buck: the benefits can accrue right away. Register for this special episode of The Briefing Room to hear Bloor Group CEO Eric Kavanagh explain why analytics is no longer just for the elite. He'll demonstrate why "BI for the Masses" has finally arrived, and what that means for your organization. He'll be joined by Shane Swiderek...    See details...
Every so often, a sea change occurs in the world of technology. The dynamics fundamentally change, and innovators carve out new ways of doing business. The Cloud certainly represents such a transformative power, but so does the rise of containers. In a few short years, containers have evolved into a formidable toolset for enterprise IT, especially in the domain of hybrid cloud. Register for this episode of How Data Works to hear Bloor Group Data Scientist, Dr. Geoffrey Malafsky, explain how containers will fuel the great migration from on-prem to hybrid and multi-cloud scenarios. He'll be joined by Shane Kumpf and Billie Rinaldi of Hortonworks, who will demonstrate how containers can play an integral role in the next generation of busine...    See details...
If it's worth doing, it's worth doing right! That maxim applies to all manner of business solutions, but it's especially applicable for data governance. Traditionally, governing data was mostly done at the source (database) or application layer (edge), but both options were inefficient for different reasons. The hard truth is that actual data governance requires a comprehensive solution that reaches across, and through all significant information systems. Register for this episode of The Briefing Room to hear Bloor Group GRC Advisor, Dr. Peter Went, explain why "edge-to-edge" data governance is a recipe for success. He'll be briefed by Danny Sandwell of erwin, who will explain how his company's recent acquisitions have enabled enterprise...    See details...
Everyone wants more data these days, and who can blame them? The insight economy is now in full swing. But despite widespread innovation in analytics and big data processing, the nuts and bolts of accessing data haven't changed in 20 years... until now. Register for this special episode of The Briefing Room to learn about a new way of connecting information systems and sources faster, more reliably, and with less burden on your internal teams. Veteran Analyst Mike Ferguson will explain why a new approach to data connectivity is required for success in the Information Economy. He'll be joined by Magnitude Software General Manager Tony Fisher, alongside Senior Product Manager, Craig Chaplin, who will showcase Magnitude Gateway, a new platform ...    See details...
The tipping point has already passed, and the major analyst firms agree: The future is hybrid cloud, and it's already here. The question now is how to manage this multifaceted world of enterprise computing. Central to the strategy of most forward-looking companies is a data fabric, which provides the backbone for a new generation of business applications that are fast, scalable, and secure. What does all that look like? Register for this episode of 'How Data Works' to hear veteran Data Scientist Dr. Geoffrey Malafsky explain why a data fabric is central to this next era of computing. He'll be joined by Scott Clinton of Hortonworks who will share his company's vision, which will support all kinds of applications, including classic stateful, an...    See details...
Many companies today want to embed reports and analytics into applications. Embedded BI delivers critical data to business users in the context of their core applications. It also improves the value of customer-facing applications, addressing one of the key points in the market today: customer experience. Register for this special Eckerson Group Webinar to learn about the growing trend of embedding analytics into operational systems. Analyst Wayne Eckerson will offer insights from his firm's years of research into this space. He'll be joined by Gene Arnold of TIBCO Jaspersoft, who will demonstrate how his company enables agility by embedding valuable analytics into operational systems. REGISTER...    See details...
Most BI products now incorporate machine learning to automate manual tasks, such as blending data, crafting queries, generating dashboards, performing root cause analysis, conducting simulations, and finding related reports. Most also run imported predictive models and perform regressions out of the box. The question is, does all this artificial intelligence make BI users more or less productive? Will it increase or decrease their analytical IQ as they become dependent on the machine for answers? Will users trust machine output over their own analysis or that of a trusted colleague? This webcast will explore these and other issues emanating from the new era of Smart Analytics. What You Will Learn: -Will AI enhance BI or create new p...    See details...

Is AI the New BI?

On: November 14, 2018
Thanks to artificial intelligence (AI), prediction and prescription are fast eclipsing description (i.e. business intelligence) as the most sought-after data deliverable. The question now is whether AI will replace BI or complement it. Is there a separate and distinct role for BI apart from AI? Or will the two disciplines merge into an AI-infused analytic application? This webcast will track the emergence of AI and debate its role within the spectrum of analytic capabilities. It will discuss how BI and analytics vendors are ushering in the AI era by incorporating rules and machine learning into BI tools. The goal of AI-infused BI is to automate routine tasks, make self-service easier, and improve data literacy. However, once AI starts co...    See details...
Jim Barksdale, former CEO at Netscape, once said “If we have data let’s look at data. If all we have is opinions, let’s go with mine.” The message is clear, but having data is only the beginning. Your analysts need to find the right data, understand the data, and deliver timely insights at reasonable cost. That’s where data catalogs make a real difference. The value and benefits of a data catalog are often described as the ability for analysts to find the data that they need quickly and efficiently. Data cataloging accelerates analysis by minimizing the time and effort that analysts spend finding and preparing data. Anecdotally it is said that 80% of self-service analysis without a data catalog is spent getting the data ready ...    See details...
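The "find data fast" promise above can be illustrated with a minimal sketch: catalog entries carry searchable descriptions, and a keyword query returns matching data set names instead of analysts hunting through systems by hand. The entry names and descriptions here are invented for illustration, not drawn from any real catalog product.

```python
# Toy data catalog: data set name -> human-written description.
CATALOG = {
    "crm_contacts": "customer names and emails from the CRM",
    "web_sessions": "clickstream sessions from the web tier",
    "invoices_2018": "billed customer invoices with amounts",
}

def search(keyword):
    """Case-insensitive substring search over catalog descriptions."""
    kw = keyword.lower()
    return sorted(name for name, desc in CATALOG.items() if kw in desc.lower())

# An analyst looking for customer data gets candidates in one call.
print(search("customer"))  # ['crm_contacts', 'invoices_2018']
```

Real catalogs add tags, lineage, and relevance ranking on top of this, but the core time-saver is the same: one indexed lookup replaces a manual hunt across sources.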
Metadata remains the cornerstone of information management. It's not just the glue that holds systems together; it enables practically every transaction in today's data-driven enterprise. From reporting and analysis to governance and security, metadata holds the key to accuracy and context. But manual management of this critical resource can cause serious issues. That's why automated metadata management is now taking center stage. Register for this episode of The Briefing Room to hear veteran Data Scientist, Dr. Geoffrey Malafsky explain why metadata is more critical than ever in the age of big data. He'll be joined by Amnon Drori of Octopai, who will demonstrate how his company's software is revolutionizing the once-unwieldy process of ...    See details...
Recent developments in data management, including big data, data lakes, NoSQL, Hadoop, and the cloud, enable businesses to gain more insights faster. This new age of data warehousing has a high emphasis on self-service and global access to data, raising new challenges. How can we optimize workloads originating from self-service applications? How can we maintain data warehouse health as it is leveraged by more diverse analytics? How can we ensure scalability without having to increase IT resources? To enable self service and modernize with confidence, modern tools will be needed to increase visibility into workloads, automate health checks and workload tuning, and introduce self-service troubleshooting to the user community. You will learn: -The b...    See details...
The emergence of big data analytics and the adoption of data lake architecture cause many to question the future of data warehousing. Yet recent surveys show that more than 60% of companies are still operating between 2 and 5 data warehouses. Many people have talked about eliminating the data warehouse altogether. But the reality is that the data warehouse offers value that the data lake doesn’t address, and vice versa. The real challenge is how to combine the data warehouse with a data lake architecture, modern data pipelines, and analytics use cases. Join us to learn how automation and agile data engineering step up to the challenges of data warehouse modernization. You Will Learn: -The key challenges of legacy data warehousing ...    See details...
The modern world of data won't fit into a lake, let alone a warehouse. Valuable data sources have proliferated to the point where the concept of a centralized repository for all enterprise data no longer makes much sense. The key today is visibility into, and across, vast data sets, combined with the ability to visualize and query disparate data at the speed of business. Register for this episode of The Briefing Room to hear Bloor Group CEO Eric Kavanagh explain how the world of business intelligence has changed, and why a new approach is necessary. He'll be joined by Ruhollah Farchtchi and Mary Flynn of Zoomdata who will demonstrate how their company's platform was designed from the ground up to handle the speed, variety and comple...    See details...
Rome wasn’t built in a day, but its architects understood the need for long-term strategic planning. The same holds true for modern data architectures, which must durably span on-prem data centers and the ever-expanding array of cloud providers. That's no small challenge in a world burgeoning with big data, vexed by global hackers, pressed by expanding regulations, and driven by the demands of both public and private companies in highly competitive markets. Register for this inaugural episode of How Data Works to learn from Hortonworks VP of Marketing and Sales Scott Clinton, who will explain why smart information architecture must underpin the creation and management of an enterprise data fabric. He'll be joined by Bloor Group CEO an...    See details...
Modern data pipelines are more complex than traditional data warehousing pipelines. They must support a multiplicity of data sources, data types, and use cases; they must be designed and deployed in an agile manner; they must support batch, mini-batch, and streaming data flows; and they must meet service level agreements for data quality, availability, and consistency. Many organizations are turning to data catalogs to support modern data pipelines. Data catalogs can automate the tagging and organization of ingested content and derived data sets, harmonize data flowing through multiple new and existing pipelines, support data curation processes, and enable business users to explore content for analysis. This Webinar will explore the rol...    See details...
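The automated tagging described above can be sketched in a few lines: as data sets are ingested, their column names are matched against simple rules, and the resulting tags make them discoverable later. The rules, tags, and data set names below are invented for the example; production catalogs use far richer classifiers.

```python
# Illustrative tagging rules: a tag applies if any marker column is present.
TAG_RULES = {
    "pii": {"email", "phone", "ssn"},
    "financial": {"amount", "price", "revenue"},
}

def tag_dataset(columns):
    """Return the set of catalog tags whose rule matches any column name."""
    cols = {c.lower() for c in columns}
    return {tag for tag, markers in TAG_RULES.items() if cols & markers}

# Tag two ingested data sets as they land in the catalog.
catalog = {}
catalog["orders"] = tag_dataset(["order_id", "amount", "email"])
catalog["clicks"] = tag_dataset(["page", "timestamp"])

print(sorted(catalog["orders"]))  # ['financial', 'pii']
print(catalog["clicks"])          # set()
```

Even this crude rule-based pass shows why catalogs help governance: the "orders" set is flagged as containing both personal and financial data the moment it is ingested, before any human curator looks at it.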
Timing really is everything, especially in a highly connected world. As the Internet of Things continues to reshape every industry, today's most forward-thinking organizations are discovering the remarkable power of edge computing. By capturing, analyzing and acting on data at the edge, these innovators are solving business challenges as they arise, and capitalizing on opportunities in near-real-time. The critical success factor is a robust, optimized information architecture. Register for this episode of How Data Works to hear Bloor Group CEO Eric Kavanagh explain how the combination of edge computing and real-time intelligence creates a whole new solution set for organizations to leverage. He'll be joined by Dinesh Chandrasekhar from H...    See details...
The results are in, and businesses around the world concur: data is the key to improving user experience, the #1 concern of senior managers globally. More to the point, data that leads to valuable customer insights is mission critical. However, most companies admit they lack the necessary tools, technologies and best practices to consistently deliver trusted data to stakeholders. What's a data-driven organization to do? Register for this episode of The Briefing Room to learn how today's innovative enterprises orchestrate increasingly powerful technologies and methods for ensuring the delivery of timely, trusted insights. Our host will be joined by Erin Haselkorn of Experian, who will share results from a recent survey that canvassed a wide range...    See details...
Architecture matters. That's why today's innovators are taking a hard look at streaming data, an increasingly attractive option that can transform business in several ways: replacing aging data ingestion techniques like ETL; solving long-standing data quality challenges; improving business processes ranging from sales and marketing to logistics and procurement; or any number of activities related to accelerating data warehousing, business intelligence and analytics. Register for this DM Radio Deep Dive Webinar to learn how streaming data can rejuvenate or supplant traditional data management practices. Host Eric Kavanagh will explain how streaming-first architectures can relieve data engineers from time-consuming, error-prone processes, ...    See details...
The transformative power of machine learning cannot be denied, but achieving business value has proven elusive for many organizations. There are several key challenges in the process: a) deciding on the right process to augment; b) determining the optimal algorithms for the use case; c) gathering enough quality data to train the algorithms; and, last but not least, d) operationalizing the results of machine learning. Register for this episode of The Briefing Room to hear Bloor Group CEO Eric Kavanagh explain why machine learning projects must embrace the end-to-end process in order to be effective. He'll be joined by Harish Doddi of Datatron, who will share his company's recent experiences in deploying machine learning modules for sever...    See details...
Popularized by Facebook, Google and LinkedIn, graph technology now enjoys widespread recognition in the field of data analytics and discovery. Unlike relational or column-store databases, graph solutions are optimized for understanding relationships between entities, which greatly facilitates the analysis of vast data sets. In an increasingly connected world, this kind of technology holds tremendous promise for solving complex business challenges in real-time. Register for this episode of The Briefing Room to hear Bloor Group CEO Eric Kavanagh explain how a wave of innovation in graph technology is reshaping the world of analytics at-scale. He'll be joined by Gaurav Deshpande of TigerGraph, who will share details of his company's techni...    See details...
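The claim above, that graph solutions are optimized for relationships, comes down to data layout: with edges held as adjacency lists, a question like "who is within two hops of this person" is a direct traversal rather than a multi-way relational join. A minimal breadth-first sketch over invented data:

```python
from collections import deque

# Toy social graph as adjacency lists (all names invented).
EDGES = {
    "alice": ["bob", "carol"],
    "bob": ["dave"],
    "carol": ["dave", "erin"],
    "dave": [],
    "erin": [],
}

def within_hops(start, max_hops):
    """Breadth-first search: nodes reachable from start within max_hops."""
    seen = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if seen[node] == max_hops:
            continue  # do not expand beyond the hop limit
        for nbr in EDGES.get(node, []):
            if nbr not in seen:
                seen[nbr] = seen[node] + 1
                queue.append(nbr)
    seen.pop(start)
    return sorted(seen)

print(within_hops("alice", 2))  # ['bob', 'carol', 'dave', 'erin']
```

In a relational store the same two-hop question needs a self-join per hop, which is why multi-hop queries over large data sets tend to favor graph engines.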
Traditional data warehousing has long used batch jobs to move, load and transform data for decision making. But as data volumes rise and the velocity of business grows, more organizations are opting to move and process data in real-time or near real-time. Batch processing is giving way to mini-batches fueled by replication and change data capture as well as stream processing in which events are captured, processed, and analyzed as they happen. Many companies today have a mix of batch, mini-batch, and stream-based processing. The question is whether organizations should embrace streaming as the default mode of data acquisition. Several vendors are now pitching streaming-first architectures and extolling the benefits of processing data i...    See details...
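The mini-batch pattern mentioned above can be sketched briefly: instead of reloading a whole table, change data capture (CDC) ships only rows whose version advanced past the last sync's high-water mark. The table, fields, and versioning scheme below are invented for illustration; real CDC tools typically read the database transaction log instead.

```python
def capture_changes(table, last_version):
    """Return rows modified after last_version and the new high-water mark."""
    changed = [row for row in table if row["version"] > last_version]
    new_mark = max((row["version"] for row in changed), default=last_version)
    return changed, new_mark

# Source table with a monotonically increasing version per modification.
orders = [
    {"id": 1, "status": "shipped", "version": 3},
    {"id": 2, "status": "new", "version": 5},
    {"id": 3, "status": "packed", "version": 4},
]

# One mini-batch cycle: only rows changed since version 3 are shipped.
delta, mark = capture_changes(orders, last_version=3)
print([row["id"] for row in delta])  # [2, 3]
print(mark)                          # 5
```

Run at a tight interval, this approaches streaming's latency while keeping batch-style semantics, which is exactly the middle ground many of the organizations described above occupy today.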
The problem with traditional data pipelines based on extract, transform, and load (ETL) tools that populate data warehouses and data marts is that power users quickly bump up against their dimensional boundaries. To answer urgent questions, they are forced to download data from data warehouses into Excel or another desktop tool and combine it with data acquired elsewhere. The result is a suboptimal spreadmart. Today, organizations build modern data pipelines to support a variety of use cases. Besides data warehouses, modern data pipelines generate data marts, data science sandboxes, data extracts, data science applications, and various operational systems. These pipelines often support both analytical and operational applications, structure...    See details...
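The fan-out described above, where one pipeline feeds a warehouse, sandboxes, and extracts, can be sketched as a single ingest transformed differently per target. The event shapes and target functions are hypothetical, chosen only to show the pattern of one source feeding both an aggregated reporting target and a raw exploratory one.

```python
# One shared extract of raw events (invented sample data).
raw_events = [
    {"customer": "acme", "amount": 120.0},
    {"customer": "acme", "amount": 80.0},
    {"customer": "globex", "amount": 50.0},
]

def to_warehouse(events):
    """Reporting target: aggregate revenue per customer."""
    totals = {}
    for e in events:
        totals[e["customer"]] = totals.get(e["customer"], 0.0) + e["amount"]
    return totals

def to_sandbox(events):
    """Data science target: pass rows through untouched for exploration."""
    return list(events)

print(to_warehouse(raw_events))     # {'acme': 200.0, 'globex': 50.0}
print(len(to_sandbox(raw_events)))  # 3
```

Keeping both targets downstream of the same governed extract is what prevents the spreadmart problem: analysts explore the raw rows in the sandbox instead of downloading ad hoc copies into Excel.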
The data lake is a repository of corporate data. So how is that different from a data warehouse? Where does a data lake leave off and data warehouse begin, and vice versa? Are these mutually exclusive environments or long-lost companions in a future data nirvana? Using our point-counterpoint approach, a team of seasoned data experts will debate the relationship between data lakes and data warehouses and come to a consensus opinion. This Socratic approach will help you understand: -How a data lake differs from a data warehouse -What workloads belong in a data lake versus a data warehouse -How the dividing line between the two environments is getting fuzzier -How to build a modern analytics ecosystem that leverages the best of data l...    See details...
As the information economy plows forward, successful organizations are harnessing many more sources of increasingly diverse data. The new center of gravity is cloud, with thousands of web-based applications providing highly focused business value, and generating useful data in the process. As such, the modern data warehouse has evolved dramatically: it's faster, more dynamic, and requires a much more agile solution for ingesting data at-scale. Register for this episode of Synthesis to hear Bloor Group CEO Eric Kavanagh examine the evolving role of data in a cloud-enabled world. He'll be joined by Dewayne Washington of Eckerson Group who will outline ways to accelerate the design and deployment of modern data solutions. And Taylor Brown ...    See details...
The information supply chain now extends around the globe, connecting people and organizations in truly remarkable ways. Today's most forward-looking companies are harnessing diverse data sets to create highly contextualized products and services, while optimizing workflows to expedite production and delivery. By blending a critical mass of real world data, these organizations are benefiting from a new kind of intelligence which can lead to tremendous business value. Register for this inaugural episode of Real World Data to hear Analyst Eric Kavanagh explain why this 'new intelligence' will fundamentally change business as usual. He'll be joined by Helen Arnold, President of the SAP Data Network, who will demonstrate how her company's vi...    See details...
The data warehouse promised to deliver a single version of truth. But skeptics abound, saying a single version of truth is a mirage and not necessary. Yet, how can an organization exist as a holistic entity if it doesn’t have a common set of metrics and data for capturing, analyzing, and interpreting business activity? Using our point-counterpoint approach, a team of seasoned data experts will debate the conceptual and technical feasibility of the single version of truth and come to a consensus opinion. This Socratic approach will help you understand: -Why a single version of truth (SVOT) is still required -When an organization can ignore the SVOT imperative -How to balance SVOT and MVOT (multiple versions of truth) within a singl...    See details...
The knowledge economy is upon us. Fully 80% of companies today are considered knowledge-driven. Measuring and valuing knowledge as intellectual capital, however, is often considered a soft science. Balance sheets of organizations are cost-driven, providing a reflection of the past, but not necessarily of the present or the future. So how can companies quantify the tacit knowledge of their organization? Register for this special episode of Hot Technologies to find out! We will discuss: -How tacit knowledge can be turned into explicit knowledge -The 4 forms of Intellectual Capital (Human, Customer, Structural, Alliance) -A 10-step approach to turn knowledge into financial assets -How knowledge and intellectual capital can be valued and...    See details...
Data science provides today’s businesses with an unprecedented opportunity to increase revenues and lower costs by leveraging existing data assets. And companies that seamlessly integrate data science into their operations are taking market share from those that don’t. But data science is not a panacea. It is complex and possesses many challenges. Far too often, models created by data scientists are never deployed and powerful models are incorrectly created or misapplied. This can lead to expensive mistakes. To better understand these challenges and their solutions, we conducted more than thirty in-depth interviews with data science experts from a variety of industries. From those interviews, we compiled the top ten challenges and th...    See details...
The traditional data warehouse was not designed to support the variety, volume, and velocity of data commonplace in the era of big data. Nor was it a great place for data scientists and other power users to explore data in an unfettered manner. These new requirements have freed the data warehouse to do what it does best, while at the same time, new technologies, including Hadoop, the cloud, and virtualization services, promise to shake the data warehouse from its traditional relational roots. Using our point-counterpoint approach, a team of seasoned data experts will debate the shape and dimensions of the modern data warehouse and come to a consensus opinion. This Socratic approach will help you understand: -How a data warehouse fits ...    See details...
Many organizations weigh the benefits and risks of data lakes versus data warehouses. The former can be vast but slow, while the latter requires great effort to achieve the desired speed. But these solutions are no longer mutually exclusive. A new kind of supercharged information architecture can deliver the best of both worlds, enabling the richness of data lake designs, along with the performance and accuracy of enterprise data warehousing. Register for this episode of InsideAnalysis to hear Analyst Eric Kavanagh explain why a host of innovations in data processing and management are enabling truly interactive business intelligence. He'll be joined by Kelly Stirman of Dremio, who will demonstrate how his company's platform enables acce...    See details...
The freewheeling days of data management are heading for the history books, as a new era of accountability arises. The modern enterprise must embrace responsible policies to manage the complete data lifecycle, from cradle to grave. That includes the so-called right to be forgotten, a relatively new mandate via the General Data Protection Regulation (GDPR) that has many organizations looking for answers. Register for this DM Radio Deep Dive to hear Analyst Eric Kavanagh explain why companies must employ comprehensive data governance programs that involve all major stakeholders in a meaningful and defensible manner. He’ll be joined by Matt Hayes of Attunity who will explain why effective management of production data is critical for succ...    See details...
Data catalogs are the new standard for delivering trusted data and are proving themselves as state-of-the-art data management tools. From modest beginnings as data inventory and search tools, they have grown to support business analysts, data scientists, data stewards, data curators, and data engineers. Beyond tactical applications, catalogs have strategic value. CDOs and CAOs view the catalog as strategic for data asset management, data governance, and analytic quality and productivity. Selecting a data catalog is a critical decision with far-reaching implications. With features to support metadata management, data inventory, data discovery, analysis and insight, and data governance, the catalog becomes the centerpiece of modern data man...    See details...

Is Cloud the de facto Data Lake?

On: March 26, 2018
First came the data warehouse, then the data lake. Open-source upstarts Cloudera and Hortonworks jumped on the data lake concept quickly, touting their mastery of Apache Hadoop as key to their ability to effectively build and maintain data lakes that provide business value. But a funny thing happened on the way to the digital forum: Cloud storage underwent a major metamorphosis, with major price-performance improvements. Suddenly, a host of cloud storage vendors were challenging the status quo, throwing down the gauntlet to the early-stage leaders in the data lake game. And savvy analyst Tony Baer of Ovum asked: “Is cloud the de facto data lake?” Register for this episode of Inside Analysis to learn from the veteran analyst and hos...    See details...
There has been a steady drumbeat in some quarters that the venerable data warehouse is dead, even though it is a fixture at most organizations. Some insist it is too slow and cumbersome to support today’s fast-moving and data-hungry organizations, while others claim it’s still a critical piece of data infrastructure that companies can’t live without. Somewhere between these extremes lies the truth. Using our point-counterpoint approach, a team of seasoned data experts will debate the fate of the data warehouse in the era of big data and the cloud and come to a consensus opinion. This Socratic approach will help you understand: -What’s ailing the traditional data warehouse and what to do about it -What parts of the data warehou...    See details...
The next revolution has begun. Businesses everywhere have tasted the value of analytics, and there's no turning back. And while data lakes have hit some hurdles, that has mostly resulted from non-technical users not having the tools they need to effectively navigate these massive repositories. That's now changing, as a new wave of technology enables visual analysis and discovery of nearly any type of data, structured or unstructured, static or streaming. Register for this episode of Hot Technologies to hear veteran Analyst Wayne Eckerson explain why this new wave of analytical technology offers great promise. He'll be joined by Steve Wooledge of Arcadia Data, who will showcase his company's BI and visual analytics platform, designed to r...    See details...
Everyone wants more data these days -- to generate revenue, gain competitive advantage, improve efficiency, or achieve any number of other objectives. But as the importance of data governance grows, and organizations realize the need to take data strategy more seriously, there is one question that remains largely unanswered: Who owns the data? Register for this special Webinar to hear data veteran Eric Kavanagh explain why understanding data ownership can play a central role in helping companies improve their information policies and practices. He'll be joined by Erin Haselkorn of Experian, who will share key insights from a recent research program that sought to better understand the critical issues facing data managers today. Attendees...    See details...
While big data platforms grab the headlines, relational databases still run the vast majority of businesses today. They are the foundation of commerce, which is why it pays to keep them running efficiently. With so much pressure coming from inside and outside the firewall, the task of optimizing database performance calls for thorough use of all the available tools and techniques. Register for this episode of Hot Technologies to learn how your company can optimize database performance, and thus keep users happy throughout the enterprise. Gary Smith of IDERA will demonstrate key features of DBArtisan for identifying and addressing database performance issues, along with several useful utilities for general database management. REGISTER...    See details...

Bloor Group research

Multimedia Research Program by The Bloor Group What's bigger than big data? Artificial intelligence! That's because AI will soon transform practically every business process in the world, in part due to the abundance of big data, and specifically ...
Data may be the new oil, but every sword cuts both ways. Just as with any valuable asset, there are liabilities associated with data. The key to success in today's world is to ensure that you have sound policies in place for managing data, especi...
Database is the new black. Ever the backbone of information management architectures, database technology continually evolves to meet growing and changing business needs. New types of data and applications make the database more important than ever, ...
A sea change is underway. The old constraints are fading, while a bold new vision unfolds. The Data Lake represents more than another type of information reservoir. It embodies a fundamental change in the data management industry itself. Tools, techn...
This primary research project, conducted by David Loshin and Abie Reifer of DecisionWorx in conjunction with The Bloor Group, will examine the real-world use of technologies and solutions that comprise the Hadoop/Big Data Management enterprise enviro...

Infographics