
Accidental Architectures and the Future of Intelligent Networks

Not everything happens for a reason in the world of information management. Not every table or field in a database got where it wound up via some master plan. More often than not, a company’s information architecture has grown and evolved organically, like a sort of digital mycelium, spreading underground for years, ultimately providing the infrastructure for all manner of analytical insights to blossom somewhere down the line.

The obvious casualties of these “accidental architectures” (as companies like EMC and Talend are calling them) are the elusive goals of clarity and certainty. Contrast that with residential construction, where engineers take a vastly more disciplined approach alongside their architect counterparts. You wouldn’t want an accidental architecture for your three-story home, would you? No one in their right mind would.

And yet these hodgepodge information architectures exist everywhere. You could argue that the Internet itself, in all its glory and splendor, is one gigantic accidental architecture. Oh, sure, there are many decidedly engineered pillars that help sustain the massive construct called the World Wide Web, such as the Domain Name System (DNS). But still, only a fool would assert that the entirety of the Internet was carefully designed for optimal efficiency.

There are, however, certain undeniable trends that drive the practical implementations of information systems. By no means is the challenge of heterogeneity new. Movements like Service-Oriented Architecture (SOA) focused on bringing some rhyme and reason to the otherwise scattershot world of information and application landscapes. Today, SOA is not widely discussed, but that’s because its guiding principles now largely rule the roost.

One of the most significant disruptions to the status quo some years back was the concept of a data warehouse. Business analysts had learned the hard way that querying the operational systems of the 1980s and ’90s was simply not effective or efficient. Transactional systems were designed to transact, not analyze. Back in those days, processors were relatively slow, storage was rather expensive, and enterprise software designers were scarce.

Nonetheless, the data warehousing industry was born, and offered great promise. This was conceivably the dawn of the true “information architecture” because now companies were actively extracting data from operational systems, then loading that data into standalone warehouses which were specifically designed to provide a foundation for running reports and enabling analysts to do ad hoc queries.
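
For readers who never lived through it, the core extract-transform-load motion can be captured in a few lines. Here’s a minimal sketch using Python’s standard library; the table names, columns and file paths are invented for illustration, not any particular vendor’s tooling:

```python
# A minimal, illustrative ETL pass using Python's standard library.
# The schemas (orders, order_facts) and file paths are hypothetical;
# a production load would batch, validate, and log every step.
import sqlite3

def run_etl(source_path="ops.db", warehouse_path="warehouse.db"):
    source = sqlite3.connect(source_path)        # operational system
    warehouse = sqlite3.connect(warehouse_path)  # standalone warehouse

    # Extract: pull raw rows from the transactional store.
    rows = source.execute(
        "SELECT order_id, customer_id, amount, order_date FROM orders"
    ).fetchall()

    # Transform: reshape for analysis, e.g. roll dates up to a month bucket.
    facts = [
        (oid, cid, round(amt, 2), odate[:7])  # 'YYYY-MM'
        for oid, cid, amt, odate in rows
    ]

    # Load: write into a schema built for reporting, not transacting.
    warehouse.execute(
        "CREATE TABLE IF NOT EXISTS order_facts "
        "(order_id INTEGER, customer_id INTEGER, amount REAL, month TEXT)"
    )
    warehouse.executemany("INSERT INTO order_facts VALUES (?, ?, ?, ?)", facts)
    warehouse.commit()
```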

But a funny thing happened on the way to that coveted 360-degree view of the enterprise. Consultants, vendors and end users alike eventually realized that the data warehouse is more like a mere mortal than the eternal being some had hoped it might be. Turns out that getting all that data consolidated in one place really does take a great deal of time, effort, process, software, hardware and personnel. And all that Extract-Transform-Load (ETL) work! Ugh!

And then there are the politics. Because of its great expense and its demands on process, personnel and network infrastructure, a data warehouse is a political thing. Getting access to it requires some clout. Managing to get your new data set included in the blessed warehouse takes even more political will. Actually driving the overall direction of the project? That typically happens high in the chain of corporate command, often involving the CIO.

But the demand for insights throughout any organization has the persistence of surface tension: it’s always there, forever pulling. That’s one key reason why data warehouse appliances burst onto the scene back in the mid-aughts. Netezza (the brainchild of the inimitable Foster Hinshaw), Greenplum (a la Luke Lonergan), Vertica (from Dr. Michael Stonebraker) and a variety of other powerful, easy-to-deploy solutions began to proliferate.

You can do the math to figure out why: Too many executives got tired of dealing with the strains of the warehouse and decided to go their own way. Data marts cropped up everywhere, often yielding targeted value for their champions at the expense of enterprise-wide data quality. The silo scenario brought us right back to square one. The coveted strategic view of the company suffered another setback.

Enter the data federation vendors. Pioneers in that space like Composite Software (recently acquired by Cisco, the networking powerhouse — hint, hint) focused on creating what real-time data warehousing visionary Michael Haisten years ago called an “enterprise backplane.” Via data federation, end users could get access to key data sets without disrupting operational systems, and without necessarily needing to connect to the data warehouse.
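
In spirit, a federated query leaves data where it lives and composes one logical view at request time. Here’s a toy sketch of the idea in Python; both source systems are hypothetical stand-ins for live connectors, not any vendor’s actual API:

```python
# A toy picture of data federation: leave data in place, compose a
# single logical record at request time. Both fetchers are hypothetical
# stand-ins for live connectors to operational systems.
def fetch_from_crm(customer_id):
    # In practice: a live, read-only query against the CRM.
    return {"customer_id": customer_id, "name": "Acme Corp"}

def fetch_from_billing(customer_id):
    # In practice: a live, read-only query against the billing system.
    return {"customer_id": customer_id, "balance": 1250.00}

def federated_view(customer_id):
    """Merge representations from multiple systems into one record."""
    record = {}
    record.update(fetch_from_crm(customer_id))
    record.update(fetch_from_billing(customer_id))
    return record

print(federated_view(42))
# {'customer_id': 42, 'name': 'Acme Corp', 'balance': 1250.0}
```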

Formerly referred to as Enterprise Information Integration, this practice hit its stride as organizations tried to rein in all those appliance-borne data marts. Upstarts like Denodo Technologies got into the game, and even the venerable Informatica (which built its revenue streams on ETL) ultimately got the federation religion. Other players, like Quest with its Toad line of products, worked on delivering a virtual switchboard for data management.

Fast forward to today, and another range of innovations has further paved the way for a federated future. The speed of NoSQL engines (DataStax boasts a million writes per second), coupled with parallel processing and multi-core chips, has opened the door to what might just be called a new age in information architectures. This is no small thing.
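
To make that speed concrete: modern NoSQL drivers pipeline writes rather than waiting on each round trip. Here’s a hedged sketch using the DataStax Python driver for Cassandra; the contact point, keyspace and table schema are assumptions made purely for illustration:

```python
# A hedged sketch of pipelined writes with the DataStax Python driver
# (pip install cassandra-driver). The contact point, keyspace and
# events table are assumptions made purely for illustration.
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])
session = cluster.connect("demo_keyspace")

insert = session.prepare("INSERT INTO events (id, payload) VALUES (?, ?)")

# execute_async pipelines requests so the client never waits on a
# single round trip; the driver fans the writes out across the cluster.
futures = [session.execute_async(insert, (i, "event-%d" % i))
           for i in range(10_000)]
for future in futures:
    future.result()  # block at the end, surfacing any write errors
```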

At the cutting edge of this new era, we find companies like EnterpriseWeb and Pneuron. The former has created a fully dynamic, just-in-time data and application fabric, which can be used to create a real-time, flexible information architecture. The latter has developed an exquisite platform for bolstering so-called accidental architectures with as-needed services like data quality, risk management and other staples of IT functionality, all delivered as, when and where they’re needed. EnterpriseWeb’s Dave Duggal calls this kind of approach “late binding.”
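
Stripped to its essence, late binding means the implementation behind a service name is resolved at the moment of the call, so it can be swapped without touching callers. A minimal conceptual sketch in Python (the registry and service names are hypothetical, not a depiction of EnterpriseWeb’s actual mechanism):

```python
# A minimal sketch of late binding as a concept: the implementation
# behind a service name is resolved at call time, so it can be swapped
# without touching callers. The registry and names are hypothetical.
registry = {}

def bind(name, fn):
    """Register, or hot-swap, the implementation behind a name."""
    registry[name] = fn

def invoke(name, *args, **kwargs):
    """Resolve the implementation at the moment of the call."""
    return registry[name](*args, **kwargs)

bind("data_quality.validate", lambda record: "email" in record)
print(invoke("data_quality.validate", {"email": "a@b.com"}))  # True

# Later: a stricter rule replaces the old one; callers are unchanged.
bind("data_quality.validate",
     lambda record: "@" in record.get("email", ""))
print(invoke("data_quality.validate", {"email": "not-an-email"}))  # False
```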

And so, the waning of the warehousing era gives rise to the real-time architecture. This has massive implications for how organizations can and should invest in data management software, hardware, personnel and services. As the former co-host of DM Radio, Jim Ericson, used to say, it’ll be “horses for courses” going forward, meaning each company will create its own unique mix of information systems.

Perhaps the best news out of this whole trend is that we no longer need to worry about force-fitting some root-level normalization onto our information architectures. Sure, they got here largely by accident, but we can now forgo shoving round pegs into square holes. We can start leaving data where it RESTs, call upon it as needed, and focus on getting things done on the business side of the house.
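
In practical terms, that means fetching a resource’s representation on demand, over HTTP, rather than copying it somewhere first. A minimal sketch with Python’s standard library; the endpoint URL is a placeholder, not a real service:

```python
# A sketch of leaving data where it RESTs: fetch a resource's
# representation on demand instead of copying it into a warehouse
# first. The endpoint URL is a placeholder, not a real service.
import json
from urllib.request import urlopen

def current_state(resource_url):
    """Pull the representation only at the moment it's needed."""
    with urlopen(resource_url) as response:
        return json.load(response)

# Hypothetical endpoint; in practice, a system of record's API.
inventory = current_state("https://example.com/api/inventory/42")
```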

It’s a new dawn. It’s a new day.

 

Eric Kavanagh

About Eric Kavanagh

Eric has more than 20 years of experience as a career journalist with a keen focus on enterprise technologies. He designs and moderates a variety of New Media programs, including The Briefing Room, Information Management’s DM Radio and Espresso Series, as well as GARP’s Leadership and Research Webcasts. His mission is to help people leverage the power of software, methodologies and politics in order to get things done.


4 Responses to "Accidental Architectures and the Future of Intelligent Networks"

  • Wayne Kurtz
    September 25, 2013 - 8:59 am Reply

    Eric, great post. This is one of the most insightful observations I’ve read lately. I won’t comment on the individual points of your content. But what amazes me, and you evoke it so well, is the parallel between the evolution of information systems and biological evolution. Many evolutionary biologists and chemists now believe that most of the characteristics of life today, especially the human brain, are an “accidental architecture”, though they don’t use that expression. They talk about forces such as “genetic drift” and “natural selection”, but I think the parallel evolution is very interesting. If you or anyone has an interest in the “biological” side of this analogy, I would recommend “The Accidental Mind” by David J. Linden.

  • Dave Duggal
    September 30, 2013 - 10:05 am Reply

    Hi Eric,

    Great post. I’d simply suggest that Enterprise Architecture – Object Orientation / Service Orientation – evolved independently of Web-style architecture (REST), which also provides the protocols “The Cloud” runs on.

    Enterprise Architecture to date has focused on modular reductionism – risk is managed in components and services, which are encapsulated, static and tightly-coupled. Unfortunately, in this approach, we “can’t see the forest for the trees”. In 2013 it’s clear that, as the world grows increasingly distributed, dynamic and diverse (the three Ds I talk about), the enterprise struggles with systemic risk – the cross-cutting concerns (i.e. enterprise-wide security, governance, compliance, version control, as well as personalization and change management). Control and visibility are hard, and rely on disciplined IT procedures using complex protocols like WS-Coordination. Focusing on pieces leads to silos, islands of innovation, accidental architecture. Everything is great in isolation, yet under-performs as a whole.

    You are right about circling back to REST in your close. The Web-style architecture, as described by Roy Fielding in his dissertation on Representational State Transfer (REST), is an “intentional” architecture – it was designed to provide “affordances” – in other words, change, diversity and interoperability were consciously baked into the style.

    Abstract from W3C – “The World Wide Web uses relatively simple technologies with sufficient scalability, efficiency and utility that they have resulted in a remarkable information space of interrelated resources, growing across languages, cultures, and media. In an effort to preserve these properties of the information space as the technologies evolve, this architecture document discusses the core design components of the Web. They are identification of resources, representation of resource state, and the protocols that support the interaction between agents and resources in the space. We relate core design components, constraints, and good practices to the principles and properties they support.” http://www.w3.org/TR/webarch/

    Roy Fielding “When I say Hypertext, I mean the simultaneous presentation of information and controls such that the information becomes the affordance through which the user obtains choices and selects actions” http://www.slideshare.net/royfielding/a-little-rest-and-relaxation slide 50

    While I absolutely agree no one could have predicted what the Web has become, by removing the rigid, brittle structures and thinking of conventional enterprise software architecture, it was purposely designed to support that organic evolution you describe. As Roy Fielding famously espoused, “Engineer for Serendipity” http://groups.yahoo.com/neo/groups/rest-discuss/conversations/topics/8343

    With EnterpriseWeb we build on REST as a scalable, stable, diverse information architecture and layer on Software Agents as part of a ‘Smart’ Intermediary that leverages a RESTful infrastructure to provide context-enhanced ‘Services’ based on link relations and metadata. By computing responses in real time we can personalize them for interaction-context as well as optimize Security, Compliance and Governance for highly-targeted responses. In this way we’ve fused REST and SOA concepts into a horizontal application layer!

    For the record, we never have the luxury of working on greenfields. We always integrate with LDAP/Active Directory and other security protocols, and connect with ERP and legacy systems, existing service libraries and APIs. While we defer to ERP as ‘systems of record’, we find that they are often not authoritative and that there is more than one source of truth. Our customers often use us to rapidly develop composite business entities and functions, which we can then push back as ‘smart’, data-driven, adaptable APIs (in that way, simply lighter-weight, more capable data virtualization), but customers can also build on these APIs in our system to create ‘smart’ applications and processes.

    Today we’ve got these great horizontal abstractions – Web and Cloud – but we’re using a 20-to-30-year-old 3-tier application model to re-create the same ol’ vertically integrated stacks in the Cloud. Fat VMs, bloated images with long-running stateful threads – this is a disaster… an ‘accidental architecture’. It’s time for something new!

    Sorry for the long response, but you got my juices flowing this morning. Thanks for exhorting people to re-think architecture.

    Best,
    Dave

  • MarcH
    September 30, 2013 - 3:16 pm Reply

    Very interesting indeed. The reference to biology, and implicitly to the concept of “emergence”, reminded me that in 1992, when I heard about the Internet for the first time, I was studying “system theories” and “cognitive sciences” (systemics, connectionism…) within a course on information systems, the Internet being exactly an example of the phenomenon of “emergence”, as described by Eric at the beginning of his article.
    A conceptual evolution was particularly intriguing in the most advanced of these theories: the disappearance of “the representation”. It seems that the real-time reconfiguration of networks and architectures, without preliminary modeling, illustrates one such paradigm. And the notion of “representation” included in REST is not contradictory, insofar as these specific “representations” depend on variable queries and on the state of the resources (though I am not an expert on REST like Dave Duggal). Whatever the intention, the result is somehow “accidental” and temporary.

  • Geoffrey Malafsky
    October 1, 2013 - 4:41 pm Reply

    I agree that it is always a pleasure to read well-elucidated ideas about new topics. The comments are insightful and interesting. Yet I want to bring us down from the orbital party room of belief in self-organizing knowledge and technology systems back into the constrained, practical world of human beings and enormously costly system development. The idea of a corporate environment running on emergent systems and data correlation for its important decision making is never going to happen. Nor should it ever happen. Before you say yes it will, spend the next few weeks running your personal finances in that manner, or arranging your child’s school, sports and leisure-time activities. The short answer: you are fired.

    Deterministic, controlled, iteratively checked, hoarded data is the right approach. The only issue is scope: we know that allowing this in the small scope of a single department is bad; we want it in the larger scope of the enterprise; we definitely do not want it in the even larger scope of the business field, including “my” competitors. Given the growth of laws and regulations requiring accurately compiled data, with the threat of financial and criminal penalties, we will see the predominant role of data being more hoarded, more controlled, more sensitive.

    Yes, there will be a large, very public role for the cheap and easy data, i.e. promiscuous data, which hangs around with everyone, is riddled with errors (aka disease), and is not taken very seriously except by giddy enthusiasts (on natural or induced highs). Please note that Data Science, by virtue of its use of “Science”, considers uncontrolled data anathema.
