It’s going to take AI to power AI

This is a guest post by Robert Linsdell, managing director for Australia and New Zealand at critical infrastructure company Vertiv

We’re living in an ever-evolving digital world, one increasingly led by technology giants. Governments across Asia-Pacific are chasing their tails to keep up with the speed of innovation, and artificial intelligence (AI) is the latest technology boom reminding us to expect the unexpected.

Businesses thought they had seen their appetite for data consumption and storage peak. Even after the pandemic – when cloud adoption surged, gaming and streaming took off, and video conference calls chewed the network – our hunger for data expanded once again.

The latest AI systems rolling out – particularly in the generative space – are far more compute-intensive than older versions of the technology. This latest AI boom is set to weigh on the digital infrastructure we’ve spent the better part of the last three years making more resilient and scalable.

According to Tirias Research, by 2028 an additional 4,250 MW of datacentre power will be required to drive these emerging AI systems. AI appears to be rewriting the rulebook on data demand.

And if this is not daunting enough, the architecture of the datacentre will likely change, and latency requirements will certainly push users towards more nearby edge datacentres.

We’re going to see the industry ramp up, again. And as it resets to keep pace with AI, there is a golden opportunity to apply the very same technology causing the reset to datacentres themselves.

Nvidia, once known mainly for its gaming chips, is already doing this. It pivoted to the datacentre market during the pandemic; in its previous financial year, datacentre chips accounted for more than 50% of its revenue as it rocketed towards becoming a near-trillion-dollar company at the centre of the AI boom. The company argues AI in the datacentre can help reduce energy consumption, running workloads about 25 times more efficiently.

Energy optimisation is pertinent here. Datacentres are the essence of our burgeoning digital world, and since the pandemic the industry has been in the firing line for its growing share of grid energy consumption.

Datacentre operators put significant effort into balancing optimum efficiency against resilience. The analytics involved are vast, the consequences of failure are catastrophic, and it is all for a small, yet continuous, saving.

As AI becomes more common in the datacentre, we expect to see it capture real savings – savings that stack up considerably over the years – industry-wide.

Picture this – by analysing vast amounts of real-time energy usage data and monitoring environmental factors, AI will be able to make fine-tuned adjustments to primary power and cooling systems. In essence, effective energy optimisation means the infrastructure operates efficiently without putting the operation itself at risk.

For datacentre technicians, instead of sweating over tuning cooling systems by hand, this would mean peace of mind, with AI effectively keeping the now more energy-efficient lights on.
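To make the idea concrete, here is a minimal sketch of that kind of closed-loop adjustment – the function name, target temperature, gain and limits are all illustrative assumptions, not any vendor’s control logic: read the rack-inlet temperatures, then nudge the cooling setpoint proportionally rather than running chillers flat out.

```python
TARGET_INLET_C = 24.0  # illustrative target rack-inlet temperature
GAIN = 0.5             # proportional gain: setpoint change per degree of error


def adjust_cooling(setpoint_c: float, inlet_readings_c: list[float]) -> float:
    """Return a new cooling setpoint based on averaged rack-inlet readings."""
    avg_inlet = sum(inlet_readings_c) / len(inlet_readings_c)
    error = avg_inlet - TARGET_INLET_C        # positive means too warm
    new_setpoint = setpoint_c - GAIN * error  # cool harder when too warm
    # Clamp to a safe operating band so the controller can never push
    # the plant outside its rated range.
    return max(16.0, min(27.0, new_setpoint))
```

A real system would learn the gain and targets from historical data rather than hard-coding them, but the shape – continuous small corrections bounded by hard safety limits – is the point.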

Another huge opportunity we expect to see with AI in the datacentre is mitigating resource congestion through predictive surge period analysis. Capacity planning is a productivity challenge weighing on the industry today – for a human alone, it requires heavy data crunching and forecasting.

AI can help take away the guesswork, analysing patterns to make accurate predictions about future capacity needs. In a peak, this would help datacentres ensure they have sufficient resources to meet customer demand; in a trough, it would reduce over-investment in capacity that may go unused.
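A toy illustration of that pattern analysis – not any operator’s actual method – is to fit a simple trend to historical daily utilisation and project it forward, flagging when capacity will be exhausted:

```python
def forecast_utilisation(history: list[float], days_ahead: int) -> float:
    """Least-squares linear trend, projected days_ahead past the last sample."""
    n = len(history)
    x_mean = (n - 1) / 2
    y_mean = sum(history) / n
    # Slope and intercept of the best-fit line through (day, utilisation).
    slope = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(history)) / \
            sum((x - x_mean) ** 2 for x in range(n))
    intercept = y_mean - slope * x_mean
    return intercept + slope * (n - 1 + days_ahead)
```

Production forecasting would account for seasonality and uncertainty bands, but even this sketch shows how a machine turns raw usage history into a forward-looking capacity number a human would otherwise have to estimate.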

Predictive capabilities also prove useful in maintenance. AI-powered anomaly detection will be able to diagnose problems in a system immediately, alerting technicians to potential failures and spurring action before equipment goes offline. The result would be improved efficiency, increased reliability, and reduced risk of costly downtime.
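In its simplest form, that anomaly detection means flagging sensor readings that sit far outside their recent statistical norm so a technician is alerted before the fault becomes an outage. The threshold below is an illustrative assumption, not a value from any real product:

```python
import statistics


def anomalies(readings: list[float], z_threshold: float = 3.0) -> list[int]:
    """Return indices of readings more than z_threshold std devs from the mean."""
    mean = statistics.fmean(readings)
    stdev = statistics.pstdev(readings)
    if stdev == 0:
        return []  # perfectly steady signal: nothing to flag
    return [i for i, r in enumerate(readings)
            if abs(r - mean) / stdev > z_threshold]
```

Real deployments use richer models trained per component, but the principle is the same: the system learns what “normal” looks like and raises the alarm on deviations.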

Security, too, is now a top concern for any organisation running or using digital services. Today, that is just about everyone, and it is no secret cyber criminals are continuing to use AI to sharpen their tools.

AI in the datacentre can act as an extra pair of eyes, keeping a keen watch on every aspect of the facility to detect and prevent threats. Analysing data from sources such as access logs and network traffic would allow AI systems to spot and alert organisations to breaches in seconds. Further, we are heading towards AI-powered sensors that could apply body-temperature checks and facial recognition to monitor for physical intrusions.
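As a toy version of that log analysis – with a hypothetical log format, window and threshold – a monitor might count failed logins per source address in a sliding window and flag bursts that look like credential-stuffing:

```python
def suspicious_sources(failed_logins: list[tuple[float, str]],
                       window_s: float = 60.0,
                       max_failures: int = 5) -> set[str]:
    """Return sources with more than max_failures failures in any window.

    failed_logins: (timestamp_seconds, source_address) pairs, time-ordered.
    """
    flagged: set[str] = set()
    by_source: dict[str, list[float]] = {}
    for ts, src in failed_logins:
        times = by_source.setdefault(src, [])
        times.append(ts)
        # Keep only timestamps still inside the sliding window.
        by_source[src] = [t for t in times if ts - t <= window_s]
        if len(by_source[src]) > max_failures:
            flagged.add(src)
    return flagged
```

The AI-driven systems described above go further, correlating many such signals at once, but the sliding-window pattern is the basic building block.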

Ultimately, AI will have the opportunity to tune datacentres to operate like well-oiled machines, making sure all components work in harmony to deliver the highest level of performance in our AI-hungry world – a world pressurised by a cost-of-energy crisis and expanding cyber security threats.

While the reality is more nuanced, put plainly, it is going to take AI to power AI.

In fact, Gartner estimates that half of all cloud datacentres will use AI by 2025. It is going to be a productive couple of years for the industry as it develops one of the fastest-growing technologies and rolls it out in a way that ensures trust.

Although the technology is creating demand for more hard-working datacentres, it is simultaneously set to make the very infrastructure AI needs to operate smarter, more efficient, more reliable, and more secure.