
Datacentre liquid cooling: What needs to happen for it to become commonplace in colocation?

As demand for denser racks and compute-heavy workloads grows within datacentres, could this lead to a wholesale shift in preference from air- to liquid-based cooling systems? If so, at what cost to the environment?

The deployment of liquid cooling in datacentres is far from a novel concept, but the use cases for the technology remain niche, and largely limited to exascale and high-performance computing (HPC) environments.

Fresh air cooling remains the preferred (and most prevalent) means of keeping the temperatures of datacentres across the world in check, but that may not always be the case, says Paul Finch, CEO and chief operating officer of Essex-based colocation facility Kao Data.

The company has set itself a goal of becoming the leading provider of datacentre capacity to the UK’s life science community. It also claims to be the first wholesale colocation facility in the UK to make use of indirect evaporative cooling without mechanical refrigeration, whereby the hot air generated by the servers is released into the atmosphere, rather than outside air being pumped in to keep the equipment cool.

Even so, its flagship, multi-tenanted facility in Harlow is designed to accommodate a future whereby liquid cooling could become more commonplace within the colocation space. The way Finch sees it, developing hybrid facilities – containing a mix of air and liquid cooling systems – is a good way of laying the groundwork for that.

“What is certain is that liquid-cooled processor technology has reached a stage where if you are in the datacentre colocation market, and not planning for your facility to implement liquid-cooled compute capability, then you are way behind the curve,” he says.

The physical space requirements and layout of a liquid-cooled facility demand a different type of datacentre design from that of an air-cooled one.

“Datacentre buildings evolve, and if liquid cooling is the next step, then there are considerations that must be taken into account prior to the construction stage, such as structural floor loading, ceiling heights and logistic routes, including lift design,” adds Finch.

Colocation providers that fail to take all of the above into account when planning their facilities will struggle to accommodate workloads that require liquid cooling, which – in turn – could put them at a competitive disadvantage further down the line, he says.

“Without this flexible building design, this radical infrastructure change will mean legacy datacentres at best will struggle to remain competitive, and at worst discount themselves at the customer proposal stage,” says Finch.

This is particularly true given that many in the datacentre industry expect the number of compute-heavy workloads running in traditional colocation facilities to increase, which will demand more efficient and targeted cooling than traditional air-based systems can provide.

As an example, Finch cites the growing demand for denser, higher-performing compute capacity in datacentres to support deep learning and artificial intelligence (AI) workloads, fuelled by advances in processor technology.

“The technology to process the massive datasets required for real-time or near real-time applications means these chips are processing billions of instruction operations per second, which generates heat that must be removed in the most efficient process possible,” he says.

“Liquids offer a far higher heat capacity than air, [given] water is 4,000 times more efficient than air. With rack density estimates north of 50kW, this technology offers opportunities to achieve increased energy efficiency across the datacentre compared with air-cooled sites.”
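
As a rough sanity check on that figure – using standard textbook values for density and specific heat at room temperature, rather than numbers supplied by Kao Data – the volumetric heat capacities of water and air can be compared in a few lines of Python:

```python
# Rough comparison of how much heat a given volume of water versus air
# can absorb per degree of temperature rise (volumetric heat capacity).
# Values are standard textbook approximations at around 20C.

AIR_DENSITY = 1.2           # kg/m^3
AIR_SPECIFIC_HEAT = 1005    # J/(kg*K)

WATER_DENSITY = 998         # kg/m^3
WATER_SPECIFIC_HEAT = 4186  # J/(kg*K)

air_vol_heat = AIR_DENSITY * AIR_SPECIFIC_HEAT        # ~1,200 J/(m^3*K)
water_vol_heat = WATER_DENSITY * WATER_SPECIFIC_HEAT  # ~4.2 MJ/(m^3*K)

ratio = water_vol_heat / air_vol_heat
print(f"Water absorbs ~{ratio:,.0f}x more heat per unit volume than air")
```

On those values the ratio comes out nearer 3,500:1 than 4,000:1, but the order of magnitude bears out the point: a given volume of water soaks up thousands of times more heat than the same volume of air.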

This is perhaps why the operators of hyperscale datacentres are said to be among the user groups that are most actively exploring the use of liquid cooling in their facilities, as it could enable them to cram much more kit into a smaller amount of space.

This is also why the roll-out of 5G connectivity, and its reliance on edge-compute environments, is often cited as another potential use case, says David Craig, CEO of liquid cooling systems manufacturer Iceotope Technologies.

“5G can only be enabled by adopting edge compute, and with the anticipated volume of infrastructure expected to be high, we have to ensure those datacentres are designed and cooled efficiently from the start, or risk negative impacts on the environment,” he says.

Relying on air-based cooling systems in edge deployments within built-up, urban areas can lead to issues whereby poor air quality compromises efforts to keep facilities cool, says Craig.

“As we move forward, air cooling will be inadequate for high-density multicore-processor systems. There are a number of environmental concerns driving adoption already, including poor air quality and the overuse of water in urban environments, but overall, I expect hyperscale datacentres to be the next adoption space for liquid cooling,” he says.

“Next will come the industrial edge computing and urbanised edge applications, followed eventually – once the technology is mature, proven and easily accessible – by telcos, enterprise IT and colocation customers.”

Plotting the shift from air to liquid

Not everyone, however, is convinced that a wholesale shift in preference from air cooling to liquid cooling in datacentres is necessarily on the horizon, with independent datacentre consultant Mark Acton saying such a change would be decades in the making.

“My feeling is liquid cooling is a solution looking for a problem at the moment, but if we’re talking longer term, things may change. People ask me, ‘What does the datacentre of 2025 or 2030 look like?’, and my typical response is, ‘Exactly the same as it does now’.


“The datacentres we’re building today have a 20- to 30-year lifespan. So [those who think we’re going to have] completely different datacentres in 2030 are probably not thinking everything through. Yes, we might be building something slightly different by then, but we’ll certainly be using the existing stuff.”

That includes air-based cooling systems, until the density of compute within each rack increases to such an extent that liquid cooling starts to make better economic sense, he adds.

“As we get more dense with compute, which is a continuing trend, there will come a point where we cross a threshold and liquid will become more economic and more viable, but to reach that we need to have far more dense compute environments,” he says.

“You know the number crunching-type systems where they’re operating 40-50kW a cabinet. When you’re genuinely getting to that sort of scale, it starts to make more sense.

“If you’ve just got one or two racks, you could still cool that with air, but if you’ve got a deployment of scale with that sort of density, liquid becomes a much more viable option.”
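
To put an illustrative number on that threshold – the assumptions here, a 50kW cabinet and a 10°C coolant temperature rise, are mine rather than Acton’s – a back-of-the-envelope calculation shows how much air versus water has to flow through a rack to carry the heat away:

```python
# Coolant flow needed to remove 50kW from one cabinet, assuming an
# illustrative 10C temperature rise across the rack.
# Governing relation: Q = rho * V * c_p * dT  =>  V = Q / (rho * c_p * dT)

HEAT_LOAD_W = 50_000  # a dense, "number-crunching" cabinet
DELTA_T_K = 10        # assumed coolant temperature rise

def required_flow(density: float, specific_heat: float) -> float:
    """Volumetric flow in m^3/s needed to absorb HEAT_LOAD_W at DELTA_T_K."""
    return HEAT_LOAD_W / (density * specific_heat * DELTA_T_K)

air_flow = required_flow(1.2, 1005)    # ~4.1 m^3/s of air
water_flow = required_flow(998, 4186)  # ~0.0012 m^3/s of water

print(f"Air:   {air_flow:.1f} m^3/s (~{air_flow * 3600:,.0f} m^3/h)")
print(f"Water: {water_flow * 1000:.1f} litres/s")
```

Moving roughly 15,000 cubic metres of air an hour through a single cabinet is the kind of demand that makes a litre-a-second water loop look attractive.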

The environmental viability of liquid cooling

But what are the environmental implications if liquid cooling does become more prevalent in datacentres worldwide? It could be a positive move, claim some.

Mark Seymour, chief technology officer (CTO) at London-based datacentre engineering simulation software provider Future Facilities, says one of the reasons why liquid cooling is so popular in HPC environments is that it is far more energy efficient than traditional air-cooling setups.   

“This is primarily because the high-power densities are challenging, if not impossible, to cool with air-cooled systems,” he tells Computer Weekly. “The majority of these installations report that the energy consumption is much lower than it would have been using air. This also results in both reduced operational costs and a consequent environmental benefit.”

At the same time, water is much better than air for transporting waste heat away from datacentres, which opens up new opportunities for this by-product of their operations to be re-used, he says.

The warm air ejected from datacentres by their cooling systems is, particularly in the UK, usually just vented to the atmosphere because of the difficulties operators face when trying to re-use it.

While many Nordic countries have district heating systems that can make use of lightly warmed, piped-in air from datacentres to heat homes and offices, the UK does not.

This warmed air is usually generated in huge volumes, which makes it costly and cumbersome to transport over long distances, so datacentres effectively need a suitable consumer for it on their doorstep to put it to use.

Heated water, however, is far easier to transport, meaning liquid cooling opens up the potential for waste heat from datacentres to be re-used, potentially cutting the amount of energy needed to heat nearby homes and offices, says Seymour.

Making the economics of liquid cooling work

However, it is “incredibly tough” at the moment for datacentre operators that are not running compute-heavy workloads in their facilities to justify the costs and complexities involved in deploying liquid cooling systems.

This is particularly so in the face of the work that hyperscale cloud and internet giants, such as Facebook and Google, are doing to bolster the energy efficiency of their air-cooled systems, resulting in power usage effectiveness (PUE) scores below 1.1, adds Seymour.

“Taken at face value, that means the potential saving is less than 10% for a perfect system. This makes it incredibly tough for users without high-power densities to justify the perceived additional complexity and capital cost associated with liquid cooling,” he says.
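
The arithmetic behind that sub-10% figure is simple enough to sketch. PUE is total facility energy divided by IT equipment energy, so a hypothetical facility running at a PUE of 1.1 has very little overhead left for liquid cooling to eliminate:

```python
# Why a sub-1.1 PUE leaves less than 10% on the table for liquid cooling.
# PUE = total facility energy / IT equipment energy.

IT_LOAD_KW = 1000  # hypothetical IT load
PUE = 1.1          # the air-cooled hyperscale figure cited above

total_kw = IT_LOAD_KW * PUE          # 1,100 kW drawn by the whole facility
overhead_kw = total_kw - IT_LOAD_KW  # 100 kW on cooling, power losses, etc.

# Even a "perfect" system (PUE = 1.0) can only eliminate that overhead:
max_saving = overhead_kw / total_kw
print(f"Maximum possible saving: {max_saving:.1%}")  # -> 9.1%
```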

“With the better transport properties of liquid and the ability to keep chips cooler even when power densities are higher, the datacentre industry has the opportunity to make a significant contribution to saving the planet by reduced impact on the environment – we just need to hurdle the barriers to liquid cooling becoming mainstream.” 

In HPC environments, however, the environmental benefits of liquid cooling are easier to quantify, says Chris Roberts, head of datacentre and cloud at the Goonhilly Earth Station in Cornwall.

The site is home to a datacentre powered by renewable energy. It is marketed as a collaboration and data processing hub for enterprise and academic organisations focused on creating machine learning and AI services.

The facility predominantly makes use of free air cooling, but also has a relatively small-scale Submer liquid immersion cooling deployment on-site to support some of its clients’ HPC workloads.

This setup involves submerging the datacentre servers in a thermally, but not electrically, conductive dielectric coolant, with any heat generated removed via heat exchangers.

“It reduces the cost of cooling, and although we’ve only been running it a few months now, the data suggests it is making our cooling 45% more efficient,” he says.

“It’s reducing the amount of power we consume by increasing the efficiency of the cooling, which is obviously still a significant factor in the electricity costs.”
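
How far a gain in cooling efficiency feeds through to the overall electricity bill depends on how big a slice of the total draw cooling represents. The figures in the sketch below are assumed purely for illustration – they are not Goonhilly’s operational data:

```python
# Illustrative effect of making cooling 45% more efficient.
# All figures here are assumed for the example; they are not Goonhilly's.

TOTAL_DRAW_KW = 400    # hypothetical facility draw
COOLING_SHARE = 0.25   # assume cooling accounts for 25% of that draw
COOLING_SAVING = 0.45  # the efficiency gain reported above

cooling_kw = TOTAL_DRAW_KW * COOLING_SHARE  # 100 kW spent on cooling
saved_kw = cooling_kw * COOLING_SAVING      # 45 kW saved
new_total_kw = TOTAL_DRAW_KW - saved_kw     # 355 kW

print(f"Facility draw falls from {TOTAL_DRAW_KW} kW to {new_total_kw:.0f} kW, "
      f"a {saved_kw / TOTAL_DRAW_KW:.1%} overall reduction")
```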

This is important, as the energy consumption habits of the datacentre sector are increasingly a front-of-mind issue for environmental lobbyists, as well as IT sustainability researchers and campaign groups, including the United Nations-backed International Telecommunication Union (ITU).

The datacentre sector’s energy usage comes under scrutiny in the ITU’s Frontier technologies to protect the environment and tackle climate change report, which also features input from several other UN-related groups.


“Datacentres are consuming vast amounts of electricity and are leaving significant environmental footprints,” the report states. “Unless these datacentres begin to move towards utilising renewable energy sources in their operations, they will remain responsible for a significant portion of global emissions.”

And if liquid cooling technologies bring about energy efficiency improvements in datacentres as their use becomes more prevalent, that can only be a good thing.

The ITU is currently weighing up the environmental pros and cons of liquid cooling, as part of its wider commitment to putting guidelines in place that make sustainability a key consideration in the adoption and development of any new and emerging technology.

In the case of liquid cooling, the organisation is working on the creation of standards to ensure that, if adoption of the technology were to take off, the IT industry would put it to use in an environmentally sustainable way, ITU counsellor Cristina Bueti tells Computer Weekly.  

“The problem we face is that with traditional datacentres, how the datacentre is built, the way the datacentre is installed and, most importantly, how it is maintained is very clear, but the same cannot be said for [datacentre builds featuring] liquid cooling,” she says.

“What we’re trying to do is put in place a set of specifications which proves that there is a possibility for liquid cooling to be considered as an alternative to the traditional [air-cooled] datacentre.

“Water re-use could be an additional value that we can put on the table for customers that consider the environmental impacts to be a key requirement [of any datacentre build].”

These standards and specifications are still some way off, and it is difficult to say with any degree of accuracy now what they will contain, but Bueti is hopeful that a first draft will be available by the end of the year.

Even so, the general consensus across the datacentre industry is that widespread adoption of liquid cooling is still some way off, and – in the meantime – Kao Data’s Finch is of the view that many operators will probably be running facilities cooled by a mix of air and liquid for some time to come.

This is particularly the case as there are new air-cooled server products coming to market now with 40-50kW load capabilities, which means the technology is unlikely to be swapped out in favour of liquid-based alternatives in a hurry.

“For most enterprise customers of datacentres, there is an opportunity to investigate liquid-cooled processing. But when considering the broader information technology equipment architectures – comprising compute, memory, storage and networking – we are likely to see the deployment of far more complex mechanical engineering infrastructures adopting a hybrid strategy capable of supporting both air-cooled and water-cooled information technology equipment,” he says.

Read more about liquid cooling in datacentres

  • With machine learning fuelling demand for graphics processing units in the datacentre, Schneider Electric’s CTO claims this will require a drastic rethink of how operators manage their sites, paving the way for liquid cooling to hit mainstream adoption.
  • As datacentres across the globe continue to grow, the IT industry is under ever more urgent pressure to devise and promote more sustainable ways to satisfy organisations’ ever-increasing demands for processing power.
