
Can open source future-proof IT architecture?

 This post for Bloor Group Inside Analysis is written in association with Pentaho, a commercial open-source (COSS) provider of reporting, analysis, dashboard, data mining and data integration software. 

Would it be a radical proposition to suggest that adopting and embracing open source technology might be a route to future-proofing a firm's IT architecture?

Twenty or even ten years ago, this notion would perhaps have been shouted down as irresponsible. But today we only have to look at the widespread "openness" inherent to cloud computing to see that with an open approach comes interoperability, innovation and interconnectedness.

That is not to say that those three "I"s are exclusive to open source, but they do open the door to other portals and channels; nothing, of course, is ever as closed or locked in as proprietary software.

There is a strong argument to suggest that open standards, and the ongoing innovation that emanates from the developer community, can help future-proof IT architecture. We postulate this based on the fact that developer communities share a wide variety of software resources, from plug-ins to components to deeper kernel-level elements. But the story goes onward from here.


Immeasurable metrics

Going further, open source is more likely to engender and promote experimentation, and this is one of the key "unknown" factors that provides the immeasurable metric of future-proofing.

That is to say, how can we future-proof anyway?

We don't know what is around the corner, so all any firm can ever really do is maximise the value of its current IT investment and try to prepare for the inevitable unknown.

Insightful future-proofing really comes about when the CIO (or other departmental-level IT manager) is capable of understanding and appreciating how employees will work with technology in the future. Is it fair to suggest that there is a clearer path to this insight inside an open tech model? Perhaps yes, if we accept that inside an environment of true openness, users themselves will naturally gravitate towards those applications and tools that prove to be:

a) most productive

b) most usable

c) most efficient

d) most interoperable and

e) most cross-platform

You could add most cost-effective to that list if you want to tilt the argument towards the open model, but we will assume that even inside a commercially licensed open framework the price is lower than that found inside the corporate proprietary enterprise shell.

Three open truths

1) Open paths to future-proofed IT are capable of shrugging off the weight of inefficient legacy systems where they exist.

2) Open paths to future-proofed IT are capable of being custom-shaped to current business requirements and those of the future.

3) Open paths to future-proofed IT are (perhaps more than anything) capable of an innate and intrinsic ability to change, completely if necessary. This is arguably the very antithesis of the locked-in proprietary model, where the word flexible might more often apply to customer payment options than it does to interoperability.

The community speaks

Pentaho's community leader, Pedro Alves, says that the key to future-proofing IT architecture "starts by accepting the uncomfortable truth that we just can't predict what's going to happen, especially today, when new technologies rise and fall at a pace that's impossible for even someone like me, who's part of the industry, to follow."

According to Alves, "Although we can't predict the future, we can be prepared to adapt. A global IT architecture touches a lot of different areas and departments, each with their own requirements, constraints and goals. I'm a huge fan of picking the best tool for the job; what works very well for one job may not work for another. Crucially, even if it's the best tool today, it might not be down the road.

So in my opinion, the best way to future-proof an IT architecture is to treat each individual component as a service provider, abstracting the technology itself and focusing instead on what it’s doing. If at any point it makes sense to swap the technology, all we need to do is make sure that the same service continues.

This is where open source comes into its own. Compared to proprietary tools, open source software is interoperable by design. It comes with the ability to plug into any type of information source today or in the future, offering the best insurance policy against changes in the dynamic and unpredictable world of IT.”
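To make Alves' service-provider idea concrete, here is a minimal sketch in Java. Everything in it is a hypothetical illustration rather than Pentaho code: a ReportSource interface stands in for the "service", and two interchangeable implementations show how the underlying technology can be swapped while the contract stays the same.

// A minimal sketch of the "component as a service provider" idea.
// The ReportSource interface and both implementations are hypothetical.

import java.util.List;

// The service contract: callers depend only on what the component does,
// never on the technology that does it.
interface ReportSource {
    List<String> fetchRows(String query);
}

// Today's technology choice: a flat-file backend (stubbed for brevity).
class CsvReportSource implements ReportSource {
    @Override
    public List<String> fetchRows(String query) {
        // Real code would parse a CSV file here.
        return List.of("row-1-from-csv", "row-2-from-csv");
    }
}

// Tomorrow's technology choice: a different backend can be swapped in
// without touching any caller, because the contract is unchanged.
class WarehouseReportSource implements ReportSource {
    @Override
    public List<String> fetchRows(String query) {
        // Real code would run the query against a data warehouse here.
        return List.of("row-1-from-warehouse");
    }
}

public class ReportDemo {
    public static void main(String[] args) {
        // Swapping the technology is a one-line (or one-config-entry) change;
        // the rest of the architecture only ever sees the ReportSource service.
        ReportSource source = new CsvReportSource();
        source.fetchRows("SELECT * FROM sales").forEach(System.out::println);
    }
}

The design choice doing the work here is the interface: because callers never name a concrete technology, replacing CsvReportSource with WarehouseReportSource (or anything else that honours the contract) is a substitution rather than an architectural rewrite, which is the essence of the future-proofing Alves describes.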

Let us also remember that (within the context of this current discussion) all the major big data technology is coming out of the open source world, from Hadoop to NoSQL and beyond. Nobody yet knows where it is heading or which players will become dominant, so people working with big data technology should be particularly concerned about future-proofing.

The take-away here comes back to one word: open. We could argue that the very nature of open source dictates that we will have to be open-ended about any predictions we make, even when it comes to future-proofing.

There is no definitive answer to architectural future-proofing, so surely the open approach is the best-value insurance policy for today and, of course, for tomorrow.

This post for Bloor Group Inside Analysis is written in association with Pentaho. The firm has exerted zero editorial influence over the content presented here and simply seeks to fuel dialogue and discussion based around its mission to provide a cost-effective business analytics platform that fuels growth and innovation.

About Adrian Bridgwater

Adrian Bridgwater is a freelance journalist and corporate content creation specialist focusing on cross-platform software application development as well as all related aspects of software engineering, project management and technology as a whole. Adrian is a regular writer and blogger with Computer Weekly, Dr. Dobb's Journal and others, covering the application development landscape to detail the movers, shakers and start-ups that make the industry the vibrant place that it is. His journalistic creed is to bring forward-thinking, impartial technology editorial to a professional (and hobbyist) software audience around the world. His mission is to objectively inform, educate and challenge - and through this champion better coding capabilities and ultimately better software engineering.
