IT Sustainability Think Tank: How cloud AI technologies can help enterprises go green
The hype around AI is increasingly being matched by discussion of how the technology's adoption will affect the environment. So what can IT leaders do to keep the companies they work for at the forefront of innovation without compromising the environment, or their firm's own corporate sustainability agenda?
The UK government has sought to be proactive in its approach to addressing climate change, committing to ambitious targets such as reducing greenhouse gas emissions by at least 68% by 2030 compared to 1990 levels and achieving net zero by 2050.
In support of these goals, sustainability obligations will come to the fore as organisations develop and consume artificial intelligence (AI) services in partnership with cloud providers, which currently host most AI technology solutions.
While cloud AI can accelerate the delivery of AI capabilities, the demands of generative AI (GenAI) in particular can consume vast amounts of energy and other resources. So, in addition to assessing cloud providers on their AI capabilities, make their sustainability posture a top priority when selecting one.
The baseline is to partner only with cloud providers that have a demonstrated commitment to sustainability – but that is just the start. When deploying GenAI applications, a range of sustainability and optimisation best practices can mitigate many of the environmental impacts of using cloud AI platforms.
Ramping up renewable power use
Where possible, only use cloud-sourced GenAI services that are powered by renewable energy. Sustainable cloud providers share renewable energy statistics for each cloud region and specific cloud datacentres. Beware of greenwashing that obfuscates the sources of energy powering cloud datacentres, though.
Some cloud providers achieve their “100% renewable energy” goals through the use of renewable energy certificates. Only use these cloud datacentres as a fallback option, not as a primary solution to accessing renewable energy.
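As an illustration, the sketch below ranks candidate cloud regions by their published renewable-energy share, treating certificate-backed figures as a fallback rather than a first choice. The region names and percentages are hypothetical placeholders, not real provider data.

```python
from dataclasses import dataclass

@dataclass
class Region:
    name: str                 # cloud region identifier
    renewable_share: float    # provider-published renewable energy share (0.0-1.0)
    via_certificates: bool    # True if the figure relies on renewable energy certificates

# Hypothetical figures for illustration only; check your provider's published data.
candidates = [
    Region("region-north", 0.95, via_certificates=False),
    Region("region-east", 1.00, via_certificates=True),
    Region("region-west", 0.70, via_certificates=False),
]

# Prefer regions powered directly by renewables; treat certificate-backed
# "100% renewable" regions as a fallback rather than a primary choice.
ranked = sorted(candidates, key=lambda r: (r.via_certificates, -r.renewable_share))

for region in ranked:
    label = "(certificate-backed)" if region.via_certificates else "(direct renewables)"
    print(region.name, f"{region.renewable_share:.0%}", label)
```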
Minimise energy consumption
Energy-aware workload placement and job scheduling can ensure that cloud AI workloads are running in datacentres most likely to operate sustainably. Also, select cloud datacentres that use energy efficiently.
Check the power usage effectiveness (PUE) rating for the cloud services generally and the cloud datacentres specifically. PUE ratings should be as close to 1.0 as possible. Most cloud datacentres have PUE ratings between 1.1 and 1.5.
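PUE is the ratio of total facility energy to the energy delivered to IT equipment, so a rating of 1.2 means 20% of extra energy is spent on cooling and power distribution rather than computing. A quick worked example:

```python
# PUE = total facility energy / IT equipment energy
it_energy_kwh = 1_000          # energy consumed by servers, storage and networking
facility_energy_kwh = 1_200    # total energy, including cooling and power distribution

pue = facility_energy_kwh / it_energy_kwh
overhead_kwh = facility_energy_kwh - it_energy_kwh

print(f"PUE: {pue:.2f}")                       # 1.20
print(f"Overhead energy: {overhead_kwh} kWh")  # 200 kWh not spent on computing
```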
Right-size cloud resource consumption
Drive better cloud optimisation through monitoring and reporting service usage; controlling unauthorised or unintended use; right-sizing cloud resources; and scaling resources up and down as required. Using GenAI techniques such as API-based access to large language models (LLMs) is also an effective optimisation measure.
However, be aware that the improved accessibility and affordable cost of AI, and more specifically GenAI, could lead to overuse. Optimisation of cloud resources must include governance policies to manage the prudent and responsible use of GenAI technologies.
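A minimal sketch of what such a governance check could look like in practice, assuming usage metrics (here, tokens consumed per team) are already being collected from provider usage reports; the team names and budgets are illustrative only.

```python
# Hypothetical monthly token budgets per team, used to flag overuse of GenAI services.
token_budgets = {"marketing": 2_000_000, "engineering": 10_000_000, "support": 5_000_000}

# Illustrative usage figures; in practice these come from provider usage reports.
tokens_used = {"marketing": 3_400_000, "engineering": 7_100_000, "support": 5_050_000}

for team, budget in token_budgets.items():
    used = tokens_used.get(team, 0)
    if used > budget:
        print(f"{team}: {used:,} tokens used, exceeds budget of {budget:,} - review workloads")
    else:
        print(f"{team}: {used:,} of {budget:,} tokens used")
```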
Adopt AI-optimised hardware
Cloud providers increasingly use specialised hardware for AI workloads, and this hardware is often energy-optimised as well. Examples include NVIDIA’s DGX systems, Amazon Web Services’ (AWS) Trainium and Inferentia processors, Google’s Tensor Processing Units and the recently announced Microsoft Azure Maia chipsets.
Leading cloud providers offer this special-purpose hardware as designated AI instance types, which, when selected, can provide many benefits, including better price-performance and lower energy consumption.
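As one hedged example, the boto3 call below requests an AWS Inferentia-based instance type rather than a general-purpose one for an inference workload; the AMI ID is a placeholder and the instance type shown is just one of the AI-optimised options providers offer.

```python
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

# Request an Inferentia-based, inference-optimised instance type instead of a
# general-purpose GPU instance. The AMI ID below is a placeholder.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder: use an AMI with the Neuron SDK installed
    InstanceType="inf2.xlarge",        # AWS Inferentia2-based instance type
    MinCount=1,
    MaxCount=1,
)

print(response["Instances"][0]["InstanceId"])
```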
Manage data storage accordingly
Data storage is inexpensive and easy to use, which has led to a broad proliferation and replication of data. Deleting unneeded data will therefore have some sustainability benefits, because keeping data in active storage and under management consumes energy unnecessarily.
A sustainable strategy includes eliminating data that holds no value for the organisation and selecting the most efficient types of storage technology for different forms of data. It also includes using offline storage when possible; implementing data governance policies; storing data close to the applications and processes accessing it; and minimising data replication.
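A minimal sketch of such a policy, using AWS S3 lifecycle rules as one example of tiering and expiring ageing data; the bucket name, prefix and retention periods are placeholders to be replaced with values from your own data governance policy.

```python
import boto3

s3 = boto3.client("s3")

# Move ageing training data to colder storage tiers and delete it when it no longer
# holds value. Bucket name, prefix and day counts are illustrative placeholders.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-ai-training-data",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-and-expire-training-data",
                "Filter": {"Prefix": "training/"},
                "Status": "Enabled",
                "Transitions": [
                    {"Days": 90, "StorageClass": "GLACIER"},        # rarely accessed
                    {"Days": 365, "StorageClass": "DEEP_ARCHIVE"},  # offline-style archive
                ],
                "Expiration": {"Days": 1095},                       # delete after three years
            }
        ]
    },
)
```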
Minimise data movement
Moving large sets of data between datacentres increases network usage and, with it, energy consumption. The best approach to minimising this is to co-locate AI-supporting data with the AI modelling processes and applications.
Cross-cloud networking capabilities, including the use of cloud provider networking services, can minimise data movement. When data has to be transported across networks, the most sustainable approach is to ensure only the necessary data is sent.
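For example, trimming a dataset down to the columns and rows the model actually needs before it leaves the source region reduces both the transfer volume and the downstream storage footprint. The sketch below uses pandas; the file names and column names are illustrative.

```python
import pandas as pd

# Read the full dataset where it already lives, but only carry forward the columns
# and rows the downstream AI workload actually needs. Names are illustrative.
needed_columns = ["customer_id", "event_type", "event_timestamp"]

df = pd.read_parquet("local_events.parquet", columns=needed_columns)
df = df[df["event_type"] == "purchase"]

# Write a compressed, trimmed-down file for transfer to the AI platform's region.
df.to_parquet("events_for_training.parquet", compression="zstd")
```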
Establish sustainable application architectures
Applications will increasingly incorporate GenAI capabilities, but not without the potential risk of introducing inefficiencies. When designing applications, be aware that, over an application's lifetime, AI inference activity will typically consume more energy than AI model training.
To optimise AI inference, use smaller inference models to reduce the memory footprint, leverage inference-optimised hardware and accelerators, and use distributed access points to place AI models closer to the point of application consumption.
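As an illustration of the smaller-model point, a distilled model served through the Hugging Face transformers library; the model name is only an example, and the right trade-off depends on the accuracy the application actually needs.

```python
from transformers import pipeline

# A distilled model (here distilgpt2) has a fraction of the parameters of its parent
# model, so it needs less memory and less energy per request. On GPU hardware, loading
# weights in half precision or quantised form reduces the footprint further.
generator = pipeline("text-generation", model="distilgpt2")

result = generator("Sustainable cloud AI means", max_new_tokens=30)
print(result[0]["generated_text"])
```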
Adopt out-of-hours IT
GenAI processes are energy-intensive, so in addition to clean energy sources, consider overall energy availability. Some geographic regions are already energy-constrained, and adding further energy-intensive operations may overburden local grids.
Cloud datacentres may have greater access to less expensive, green energy during off-peak times. Employ AI workload placement strategies to take advantage of energy availability in different locations.
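A minimal sketch of deferring a non-urgent batch job until an off-peak window; the window times and the submit_training_job function are placeholders for whatever scheduler or pipeline tool is actually in use.

```python
from datetime import datetime, time

# Hypothetical overnight window when the target region has cheaper, greener energy.
OFF_PEAK_START = time(23, 0)
OFF_PEAK_END = time(6, 0)

def in_off_peak_window(now: datetime) -> bool:
    """Return True if the current time falls in the overnight off-peak window."""
    t = now.time()
    return t >= OFF_PEAK_START or t <= OFF_PEAK_END

def submit_training_job() -> None:
    print("Submitting energy-intensive training job")   # placeholder for the real scheduler call

def maybe_run_batch_job(now: datetime) -> None:
    if in_off_peak_window(now):
        submit_training_job()
    else:
        print("Deferring batch job until the off-peak window")

maybe_run_batch_job(datetime.now())
```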
Tuning up AI models
Cloud providers are currently investing billions in foundational large language models to support general-purpose GenAI solutions. While some LLMs may need to be built from scratch, aim first to use these foundational models.
Techniques such as fine-tuning can help achieve higher accuracy with existing LLMs. Others, such as prompt engineering, can also improve resource utilisation while requiring less energy.
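As a hedged illustration of the prompt engineering point, the sketch below uses the openai client library to send a compact, structured prompt and cap the response length, so fewer tokens are processed per request. The model name is only an example, the report text is a placeholder, and an API key is assumed to be configured in the environment.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# A compact, structured prompt and a hard cap on output length mean fewer tokens are
# processed per request, which translates directly into less compute and energy.
prompt = (
    "Summarise the attached incident report in exactly three bullet points, "
    "each under 15 words.\n\nReport: <report text here>"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",              # example model name only
    messages=[{"role": "user", "content": prompt}],
    max_tokens=120,
)

print(response.choices[0].message.content)
```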