Preparing for future AI energy disruption

The explosive growth of AI is set to outstrip the capacity of global power grids, threatening to derail deployments. With datacentres facing power shortages and some countries halting construction, organisations must act now to secure their AI future

The explosive growth of new hyperscale datacentres built to run AI and generative artificial intelligence (GenAI) workloads is creating an insatiable demand for power that will exceed what utility providers can currently supply.

With Gartner forecasting datacentre energy consumption to grow by as much as 160% over the next two years, this rapidly increasing electricity demand threatens to disrupt energy availability and lead to power shortages.

Gartner predicts 40% of existing AI datacentres will be operationally constrained by power availability by 2027. This will create a significant challenge for organisations planning to implement these technologies.

This will also have a significant impact on cloud strategies, with AI-driven energy consumption pushing prices to between 10 and 100 times today’s levels.

New, larger datacentres are being planned to handle the huge volumes of data needed to train and run the rapidly expanding large language models (LLMs) that underpin GenAI applications.

However, some nations are blocking the construction of datacentres. Applications in Europe are being denied because energy demands exceed local supply, and locations such as Ireland and Singapore are severely limiting or halting datacentre growth over power concerns.

In any case, new power transmission, distribution and generation capacity could take years to come online and won’t alleviate current problems. Until there is a solution, organisations must prepare now to mitigate this risk.

Make informed decisions for AI deployments

Organisations must factor the energy consumption of their on-premises and cloud-based AI deployments into their planning. This will help them mitigate the risk of business interruption, or even business closure, should energy demand exceed energy supply.
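As a rough illustration of this kind of planning, the sketch below models the annual electricity cost of a hypothetical on-premises GPU cluster and flags when the facility load would exceed the site’s grid connection. Every figure in it – the GPU count, board power, PUE, utilisation, tariff and site supply – is an assumed placeholder, not Gartner data.

```python
# Back-of-envelope model for the annual energy cost of an on-premises
# AI cluster. All figures below are illustrative assumptions:
# substitute your own hardware counts, tariff and PUE.

GPU_COUNT = 512            # accelerators in the training cluster
WATTS_PER_GPU = 700        # assumed board power of a high-end accelerator
OVERHEAD_FACTOR = 1.5      # PUE: cooling, networking, power losses
UTILISATION = 0.8          # fraction of the year the cluster is busy
TARIFF_PER_KWH = 0.25      # assumed electricity price, $/kWh
SITE_SUPPLY_KW = 1_000     # power the local grid connection can deliver

HOURS_PER_YEAR = 24 * 365

it_load_kw = GPU_COUNT * WATTS_PER_GPU / 1_000
facility_load_kw = it_load_kw * OVERHEAD_FACTOR
annual_kwh = facility_load_kw * HOURS_PER_YEAR * UTILISATION
annual_cost = annual_kwh * TARIFF_PER_KWH

print(f"Facility load: {facility_load_kw:,.0f} kW")
print(f"Annual consumption: {annual_kwh:,.0f} kWh")
print(f"Annual energy cost: ${annual_cost:,.0f}")

# The lower bound of the price rises discussed above:
print(f"Cost at a 10x tariff: ${annual_cost * 10:,.0f}")

if facility_load_kw > SITE_SUPPLY_KW:
    print("Warning: facility load exceeds the site's grid supply")
```

Even this crude model makes the exposure visible: at the 10 to 100 times price rises discussed above, energy becomes a first-order line item in any AI business case.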

The potential for advances in semiconductor performance and new enterprise architecture topologies to increase energy efficiency must also be considered. While Gartner expects breakthroughs in these areas, the next 36 months will be chaotic as maturing players and new entrants compete for market leadership.

It’s also important for organisations to examine different options on the energy supply side. Current expectations are that generation capacity is sufficient to meet growing demand, and history has shown that power plants can be upgraded over time to address greater demand, up to a point.

However, the rising area of concern is the copper infrastructure of the electrical grid itself, which has improved very little since much of it was built in the 1960s. It represents a constraint on electrical distribution that can be temporarily overcome by placing datacentres near nuclear power plants.

Amazon Web Services (AWS) recently did this, locating a 750 MW datacentre near a 2.3 GW nuclear power plant in Pennsylvania to ensure the datacentre’s long-term growth wouldn’t suffer energy disruption.

Of course, this currently isn’t a viable option in Australia, as there are no nuclear plants or plans to build any. The proposed phasing out of coal-fired power plants will, however, require organisations to consider alternative future supply options.

Monetise data and AI services

The inevitable result of impending power shortages is an increase in prices. Organisations should consider offsetting the increasing costs of their growing energy consumption for AI deployments by finding ways to monetise their data and AI services.

Tesla, for example, is attacking the scalability problem by building its Dojo supercomputing facilities to meet the needs of vehicles targeting Level 5 autonomous driving – where cars can drive without any human interaction. This goal couldn’t be accomplished in the vehicle itself, because an individual car doesn’t have enough power or processing capacity. So Tesla is monetising this Level 5 capability via the Dojo facility instead.

In the future, many applications – and portable, medical and security devices – won’t have enough onboard power, processing and memory capacity for organisations to deliver the level of AI services the market envisioned. Moving this processing into their own facilities, or to an external cloud provider, could be a great way to monetise their overall vision while not being constrained by the form factor of the device or platform.

This will require new semiconductor processors optimised for training and inference, as well as new approaches to chip packaging that improve input/output (I/O) speed and cooling. According to Tesla, the Dojo facility processes video streams from its vehicles in real time while consuming 30% less energy than traditional state-of-the-art supercomputers.

Other major chip producers are developing and implementing their own facilities to do something roughly similar. The race to reduce the power consumption and improve the efficiency of AI will produce something quite useful to future generations. However, advances in chip technology alone won’t remove the need to improve AI software engineering, as well as enterprise architecture.

Ensure cloud providers deliver sufficient energy

Before adopting or using cloud infrastructure for AI and GenAI applications, organisations should investigate the cloud provider’s overall energy profile and ask to see tools that show current and available capacity in real time.

Organisations then need to determine their backup plan if capacity runs short, especially given some governments are proactively stopping the construction of datacentres due to a lack of locally available power.
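The sketch below illustrates what such a check might look like in practice. It is purely hypothetical: no major provider exposes exactly this reporting today, and the field names, threshold and sample figures are placeholders for whatever energy and capacity data your provider actually makes available.

```python
# Hypothetical sketch of a capacity check against a cloud region's
# reported power headroom. The response shape, field names and
# threshold are illustrative placeholders, not a real provider API.

HEADROOM_THRESHOLD = 0.2  # invoke the backup plan below 20% headroom

def capacity_headroom(report: dict) -> float:
    """Compute available-versus-total power capacity for a region."""
    return report["available_mw"] / report["total_mw"]

def plan_for_region(region: str, report: dict) -> str:
    headroom = capacity_headroom(report)
    if headroom < HEADROOM_THRESHOLD:
        # In practice: shift batch training to another region, defer
        # non-urgent jobs, or fail over to a second provider.
        return f"{region}: low headroom ({headroom:.0%}), invoke backup plan"
    return f"{region}: headroom OK ({headroom:.0%})"

# Sample report standing in for data fetched from a provider dashboard
# or API; in a real workflow this would be polled on a schedule.
sample = {"available_mw": 45, "total_mw": 300}
print(plan_for_region("ap-southeast-2", sample))
```

The useful part is not the arithmetic but the discipline: a pre-agreed headroom threshold that triggers the backup plan, rather than a scramble when capacity disappears.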

It’s also important to ensure that capacity changes don’t disrupt the environmental, social and governance (ESG) policy for the datacentre. The old mantra used to be: place datacentres anywhere you want and connect the power to them. Now we need a new one: find the power first, then build datacentres nearby.

Nations, states and cities that seek to retain a competitive advantage in datacentres will now need to invest in new generation capacity, as well as new distribution infrastructure.

Jorge Lopez is distinguished vice-president analyst at Gartner
