Moore’s Law: Checking Out or Changing Direction?

Ten nanometers is probably as good as it gets. That marks the end of the road for the physical gate length of transistors on a chip. And it is a mere five years in the future, according to the 2015 International Technology Roadmap for Semiconductors, released in early July.

Should We Care?

Let’s take a 10,000-foot view of the situation. Intel (primarily), but also IBM, ARM and other parties to the CPU game, have business models that have been refined over decades. The model involves introducing new CPUs (at a premium, of course) that deliver more power and gradually make the previous chips redundant. If you cannot increase the power of a new generation of chips every few years, this business model collapses.

It partially collapsed in 2004. Until then, the CPU guys had been able to improve CPU performance by increasing the clock speed. By 2004, when clock speeds reached 3-4 GHz, the chips generated too much heat for any further increase to be practical. The CPU guys got around this by continuing to miniaturize and adding more processor cores in the extra space this created on the chip. There were consequences. It gradually forced the world into building parallel software, and that is why Hadoop, Spark and their pals gained traction.
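The shape of that parallel software is worth a moment. What Hadoop, Spark and their pals industrialized is, at bottom, the map-reduce pattern: do the bulk of the work in independent pieces, one per core (or machine), and fold the partial results together at the end. Here is a minimal sketch in Python, using only the standard library; the word-count task and the input strings are purely illustrative:

    from collections import Counter
    from functools import reduce
    from multiprocessing import Pool

    def count_words(chunk):
        """Map step: count words in one chunk, independently of all others."""
        return Counter(chunk.split())

    def merge(a, b):
        """Reduce step: fold two partial counts into one."""
        a.update(b)
        return a

    if __name__ == "__main__":
        # Illustrative input; in Hadoop or Spark the chunks would be file
        # blocks spread across disks and machines, not strings in memory.
        chunks = ["the cat sat", "the dog sat", "the cat ran"]

        # Each map runs on its own core; only the final fold is serial.
        with Pool() as pool:
            partials = pool.map(count_words, chunks)
        print(reduce(merge, partials, Counter()))

The point of the pattern is that adding cores (or machines) speeds up the map phase with no change to the code, which is exactly what software needed once clock speeds stopped rising.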

The likelihood is that whatever the CPU guys do will also disrupt the software world. The question is: What might they do?

Possible Outcomes

Chip designs do not change quickly. The industry learned a long time ago that there is only room for a few standard designs. The cost of building chip fabs is high, so volume matters, and backward compatibility with software is a necessity.

The most likely technical direction is to increase CPU density by going into three dimensions. Think of this as a kind of stacking of silicon, but it’s much more complex than simply piling one wafer on top of another, and you have to be able to cool the chip. Nevertheless, this is not a new idea. The memory industry has already turned in this direction to raise NAND Flash capacity, and the CPU guys will learn from their experience.

Although it would be technically possible to miniaturize below 10 nm, going 3D is likely to prove less expensive and to offer a more certain future. In theory this approach will behave like the wafer shrinks of the past, allowing the CPU guys to pursue the same engineering direction far into the future. And that means they’ll be able to continue their regular “doubling” of power.
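A back-of-envelope calculation (my arithmetic, not the roadmap’s) shows why stacking can substitute for shrinking. Planar transistor density scales with the inverse square of the feature size F, so one shrink step of 1/√2 doubles it; with F pinned at 10 nm, doubling the layer count L does the same job:

    D_{\mathrm{2D}}(F) \propto \frac{1}{F^{2}}
    \quad\Rightarrow\quad
    D_{\mathrm{2D}}\!\left(\frac{F}{\sqrt{2}}\right) = 2\,D_{\mathrm{2D}}(F),
    \qquad
    D_{\mathrm{3D}} = L \cdot D_{\mathrm{2D}}

Doubling L each generation (2, 4, 8 layers, and so on) reproduces the familiar cadence. The catch is that the heat per unit of footprint doubles along with it, which is why cooling, as noted above, is the hard part.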

Disruptive Times

The comforting picture I just painted may not be how things pan out. There are other disruptive forces in play: new kinds of transistors and memory devices; new processor ideas such as neuromorphic computing (massively parallel neural chips); and technology convergence, seen in Intel’s move to put an FPGA on the chip and AMD’s move to combine the CPU and GPU.

The dynamics of the industry have changed entirely. Mobile devices have, by their numbers and power, usurped the desktop, and the Internet of Things will no doubt bring its own distortions as it unfolds. There will be, as there always has been, a need for faster processors, but whether we need large volumes of them is moot. Parallelism, which once lived in a high-performance computing monastery, now stalks the whole industry. Cloud computing is another wrench in the works. It is presided over by a small number of big players (Amazon, Microsoft, Google) who have the commercial power to design their own CPUs and other components if they so choose. Maybe they will.

An Unsettling Question

The computer industry has proceeded by momentum for decades. During that time we always knew what a computer was: it was personal, like a PC, or shared, like a server. But we do not inhabit that world anymore. The computers are gradually vanishing into our pockets, into devices and into the cloud. Nevertheless, we still have the components (CPU, memory, SSD, etc.) bequeathed to us by the old world. Perhaps the coming generations of silicon will be built to support this emerging reality. Perhaps they’ll fail if they don’t.

Robin Bloor

About Robin Bloor

Robin is co-founder and Chief Analyst of The Bloor Group. He has more than 30 years of experience in the world of data and information management. He is the creator of the Information-Oriented Architecture, which is to data what SOA is to services. He is the author of several books, including The Electronic B@zaar: From the Silk Road to the eRoad, a book on e-commerce, and three IT books in the Dummies series, on SOA, Service Management and The Cloud. He is an international speaker on information management topics. As an analyst for Bloor Research and The Bloor Group, Robin has written scores of white papers, research reports and columns on a wide range of topics, from database evaluation to networking options and comparisons to the enterprise in transition.
