Accepted, if apocryphal, wisdom suggests that the term TCO (Total Cost of Ownership) came about somewhere around the end of the 1980s, when it was championed (if not coined) by a well-known IT analyst firm known for dabbling in magic shapes.
The new birth of IT service
Into the 1990s, we started to commoditise PCs and entered the Pentium processor era. Affordable servers came into the workplace, and firms like Compaq entered their heyday as service layers started to add to box sales like never before.
The VAR (or Value Added Reseller) sprang up to offer additional consultancy (and other incremental and ancillary services), and suddenly firms were being urged to look at TCO and consider the fact that not only did you have to buy the equipment, you also had to think about ongoing operational costs.
OK, so that might be a slightly simplistic potted history of TCO, but as an industry term it probably did need to be definitively labelled so that customers would start to think about the bigger picture.
What is really inside TCO?
In its fullest sense or scope, TCO embodies purchasing cost plus:
- servicing and maintenance,
- upgrades and migration project costs,
- incremental additional software and installation costs,
- very importantly — licensing charges, and
- risk assessment costs.
TCO even includes more subtle hidden items, such as the purchasing research firms must do to know what equipment to buy in the first place. Then there is a whole list of operational costs, from electricity to training to insurance.
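To make the breakdown above concrete, here is a minimal sketch of how those cost categories add up over an ownership period. All the figures are invented for illustration only; they are not drawn from any real deployment or vendor price list.

```python
# Illustrative multi-year TCO sketch — every figure here is hypothetical.

# One-off acquisition costs, including the "hidden" purchasing research.
ACQUISITION = {
    "hardware_purchase": 40_000,
    "purchasing_research": 2_000,   # knowing what to buy in the first place
    "installation": 3_000,
}

# Recurring operational costs, paid every year of ownership.
ANNUAL = {
    "servicing_and_maintenance": 5_000,
    "licensing": 8_000,
    "training": 1_500,
    "electricity": 1_200,
    "insurance": 800,
    "risk_assessment": 1_000,
}

# Project costs that land once during the ownership period.
ONE_OFF_PROJECTS = {
    "upgrade_and_migration": 6_000,
}

def total_cost_of_ownership(years: int) -> int:
    """Sum up-front, recurring and one-off project costs over the period."""
    return (sum(ACQUISITION.values())
            + years * sum(ANNUAL.values())
            + sum(ONE_OFF_PROJECTS.values()))

print(total_cost_of_ownership(5))  # purchase price is only part of the story
```

Even with these made-up numbers, the point the term TCO was invented to make is visible: over five years the recurring and project costs dwarf the original purchase price.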
So what of open source and TCO?
As we know, open source software is free as in speech, but not as in beer. Given the rise and penetration of enterprise-level open source, and the impact of open standards and open initiatives such as OpenStack in cloud, isn’t it time to stand back and ask what state open source TCO is in today, in 2014?
If we look at the prevalence of open source in mission-critical enterprise computing environments that demand a measured level of robustness (if not bulletproof-ness), then there are a number of approaches. Straight licensing of commercially supported open source software is the first and most obvious route.
Licensing in this scenario could feature the use of open source components within the truly open LAMP stack (Linux, Apache, MySQL, PHP/Perl/Python) — but backed by a commercial license for support, upgrades and training. Or it could be a more straightforward commercially graded version of an essentially open platform, tool, data management product or single application.
So today, COSS (or Commercial Open Source Software) has a role to play in providing firms with lower upfront purchasing costs, plus what should always still be reasonably priced 24×7 support for critical applications.
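The trade-off described here — lower upfront cost in exchange for a recurring support subscription — can be sketched as a cumulative cost comparison. Both price points below are hypothetical, invented purely to show the shape of the comparison:

```python
# Hypothetical comparison: proprietary licence vs COSS support subscription.
# All figures are invented for illustration; real quotes will differ.

def cumulative_cost(upfront: float, annual: float, years: int) -> float:
    """Up-front purchase cost plus recurring annual charges."""
    return upfront + annual * years

def proprietary(years: int) -> float:
    # Hypothetical: large up-front licence, plus annual maintenance.
    return cumulative_cost(upfront=50_000, annual=10_000, years=years)

def coss(years: int) -> float:
    # Hypothetical: no licence purchase, but an annual 24x7 support contract.
    return cumulative_cost(upfront=0, annual=18_000, years=years)

for year in (1, 3, 5):
    print(f"year {year}: proprietary {proprietary(year):,.0f} "
          f"vs COSS {coss(year):,.0f}")
```

With these invented numbers the subscription stays cheaper for the first several years before the curves cross — which is exactly why the support pricing needs to remain reasonable for the COSS case to hold.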
TCO comes full circle
But here’s where the TCO argument comes full circle with open source. The theory states that because the open code base is inherently non-proprietary, even if customers are using static versions of the products, all customers have the ability to feed back into the knowledge pool more directly and improve, enhance and augment the product itself…
… and so the equation is: more software efficiency = greater TCO rewards.
The theory goes further: the money organisations save on unneeded purchase licence fees can be directed into making investments in areas that will give genuine benefits to end users and IT.
Abandonment or transition to a new technology?
As stated in the http://eu.conecta.it/ paper entitled “Impact of open source in the total cost of ownership”, we need to consider whether we abandon or transition to new technologies to get to improved TCO efficiencies.
“This [abandonment or transition] stage is overlooked in many cases when TCO is calculated, since the abandonment is not really a part of the real use of a technology. However, information technology systems are usually built to replace existing systems, so that the transition process must be streamlined and easy to achieve…
… Open source software helps in this stage by making it possible to create a temporary code base to ease the transition. It also helps in the data transition itself. However, open source software can be ported and adapted to new architectures, and data can be read and translated to new formats in a clean and portable way. If proprietary systems are involved, data loss or recoding (always an expensive process) may be needed when the only platform that runs the software is no longer supported.”
A further argument for TCO savings comes from the fact that there is open source clarity during troubleshooting — and this means developers can more easily look directly at the code itself. Opponents of this argument say that open source software typically comes with less documentation than proprietary software, so it’s an equal trade-off. But that was then and this is now: commercial open source is matching proprietary pound for pound and dollar for dollar in many markets.
It’s a perfect world we’re describing here in a sense, but it’s also a real world that has been evidenced by the growth of enterprise open models.
This post for Bloor Group Inside Analysis is written in association with Pentaho. The firm has exerted zero editorial influence over the content presented here and simply seeks to fuel dialogue and discussion based around its mission to provide a cost-effective business analytics platform that fuels growth and innovation.