An era of post-Moore's Law computing?

Every now and then, Computer Weekly asks whether Moore’s Law is still relevant. Gordon Moore presented his vision of how computing would evolve in an article published in April 1965. It is almost 60 years since the co-founder of Intel predicted that the number of transistors on an integrated circuit would double every two years with minimal rise in cost. “Shrinking dimensions on an integrated structure makes it possible to operate the structure at higher speed for the same power per unit area,” he said.

These words have driven the industry to build integrated circuits with more and more transistors, which, given the economics outlined in Moore’s article, means that processing power effectively doubles every two years with little or no increase in cost.

This has also led to what historians will eventually refer to as “excessive consumption”, with little regard for efficiency and sustainability. Knowing they can expect twice the compute power every two years, software developers are enticed to build ever more complex applications, allowing these products to process far greater volumes of data and run increasingly demanding calculations.

A different approach

Moore’s Law predicted the exponential growth of computing. We have now reached a stage where it is getting more and more expensive to increase computing power at the same rate – that is, if Nvidia CEO Jensen Huang is to be believed. His vision is one of accelerated computing, where the general purpose processors that run datacentre servers remain, but the real power comes from graphics processing units (GPUs) that accelerate data processing applications. In effect, the server processors are upgraded less frequently than the GPUs, as it is the GPUs that deliver the performance boost needed for leading-edge enterprise data processing applications.

It is easy to see why Huang wants accelerated computing to succeed. His company’s datacentre business posted $20bn in revenue last quarter, a 154% increase on a year ago. Selling GPUs to datacentre operators is Nvidia’s biggest growth business, and one set to grow even more as organisations increasingly rely on enterprise applications powered by artificial intelligence (AI).
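
What that division of labour looks like in practice can be sketched in a few lines of Python. This is an illustration only, not Nvidia’s actual software stack: it assumes the open source CuPy library as one example of a GPU array interface, and a CUDA-capable GPU to run on.

```python
# A minimal sketch of the accelerated-computing split: the general purpose
# CPU prepares data and orchestrates, while the heavy numeric work is
# offloaded to the GPU. Assumes CuPy and a CUDA-capable GPU are available.
import numpy as np   # CPU arrays
import cupy as cp    # GPU arrays, with a NumPy-like API

data = np.random.rand(4096, 4096)   # data prepared on the CPU

gpu_data = cp.asarray(data)          # copy the data into GPU memory
gpu_result = gpu_data @ gpu_data.T   # matrix multiply runs on the GPU

result = cp.asnumpy(gpu_result)      # copy the result back to the CPU
print(result.shape)
```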

For instance, earlier this year, Meta’s CEO said he expects training and running AI systems to require 10 times more compute capacity each year. This is far more than the doubling every two years predicted by Moore’s Law, and amounts to what Nvidia’s Huang calls “computing inflation”.
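
The gap compounds quickly. A back-of-the-envelope calculation shows how far a 10x-a-year demand curve outruns a Moore’s Law supply curve; the five-year horizon below is arbitrary, chosen only for illustration:

```python
# Compare cumulative growth: compute demand rising 10x per year versus
# Moore's Law supply doubling every two years (about 1.41x per year).
moores_law_per_year = 2 ** 0.5   # doubling every two years
ai_demand_per_year = 10.0        # Meta's estimate for AI workloads

for year in range(1, 6):
    supply = moores_law_per_year ** year
    demand = ai_demand_per_year ** year
    print(f"Year {year}: supply {supply:5.1f}x, demand {demand:8,.0f}x, "
          f"shortfall {demand / supply:10,.0f}x")
```

After five years, demand has grown 100,000-fold while Moore’s Law supply has grown less than sixfold – the “computing inflation” Huang is pointing at.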

Clearly, not every organisation will require 10 times more compute capacity every year – that would simply be unaffordable for the vast majority of businesses. However, Computer Weekly does believe the industry needs to reset its relationship with computational power. We should not expect computing power to rise exponentially forever – we need to prepare for a much slower rate of progress.