Liquid Cooling for Data Centers – The Future of Cooling



For data center operators, choosing the correct cooling system helps a facility stay ahead of the technological curve. Many providers, however, are reluctant to embrace change, as the data center sector has seen in recent years. Liquid cooling is required by facilities with exceptionally high-density racks, typically exceeding 30 kW, where air cooling alone is insufficient to preserve the stability of IT systems.

All data center operators can look forward to this future. Rack power requirements are approaching 20kW in many facilities, and many businesses are aiming to implement racks with requirements of 50kW or more as chip density and building costs continue to climb. To improve the efficiency of their facilities, all operators, regardless of size, will soon need to be aware of more effective cooling technologies.

Benefits Of Liquid Cooling For Data Centers



There are numerous advantages to using liquid cooling systems, but we’ll focus on four major ones:

Improved Reliability and Performance

Liquid cooling not only provides the reliability that all facilities aspire to, but it also improves infrastructure performance. CPU performance is throttled back to avoid thermal runaway when CPUs near the boundaries of their acceptable operating temperature, which is quite likely with air cooling. So even if the hardware isn’t broken, you’re not getting the most out of your infrastructure simply because you don’t have the most efficient cooling solution in place. This is a problem that liquid cooling can solve.

Improved Energy Efficiency

When compared to air, liquid has far better thermal transfer properties. Add in the fact that facilities can remove the fans previously needed to circulate air in the data center, and you get a large reduction in energy consumption. Pumps used in liquid cooling systems use significantly less energy than the fans used in traditional cooling technologies.
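The physics behind this claim can be sketched with a rough calculation: moving the same heat with water instead of air takes thousands of times less volumetric flow, because water is both far denser and has a higher specific heat. The figures below are textbook room-temperature property values, not measurements from any particular facility.

```python
# Coolant flow needed to remove one rack's heat: Q = rho * V_dot * c_p * dT
def volumetric_flow_m3_per_s(heat_w, density_kg_m3, cp_j_kg_k, delta_t_k):
    """Volumetric flow rate required to carry away heat_w watts of heat."""
    return heat_w / (density_kg_m3 * cp_j_kg_k * delta_t_k)

HEAT_W = 30_000   # one exceptionally dense rack, in watts
DELTA_T = 10.0    # allowed coolant temperature rise, in kelvin

# Approximate room-temperature properties: density (kg/m^3), c_p (J/kg.K)
air = volumetric_flow_m3_per_s(HEAT_W, 1.2, 1005, DELTA_T)
water = volumetric_flow_m3_per_s(HEAT_W, 997, 4186, DELTA_T)

print(f"Air:   {air:.2f} m^3/s")          # roughly 2.5 m^3/s
print(f"Water: {water * 1000:.2f} L/s")   # well under 1 L/s
print(f"Ratio: about {air / water:.0f}x more air volume needed")
```

The same gap explains why pumps moving liquid draw so much less power than the fans they replace.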


A movement is presently underway to make data centers more ecologically friendly. As a result, data center operators in drought-stricken areas must seek ways to lower their carbon footprint and use less energy and valuable resources like water. Liquid cooling not only helps facilities cut their energy consumption but also allows them to repurpose captured heat more effectively than air-based cooling systems since liquid-to-liquid heat transfer is more efficient.

Maximizing Data Center Space 

Liquid cooling for data centers allows them to have a density that is impossible to achieve with typical air-cooling technologies. This enables institutions to make better use of data center space while also avoiding the need for additional construction or expansion. Liquid cooling allows operators to build smaller facilities while still being able to accommodate processing-intensive edge deployments, even if physical space is restricted.

Recent Advances In Liquid Cooling Technology

Immersion Cooling 

Data centers have traditionally relied on room air to keep servers and other hardware cool. This strategy, however, isn’t particularly effective. Immersing hardware in a dielectric fluid with an open bath design is one of the best techniques to improve a facility’s cooling efficiency, since it eliminates the need for pressure connectors, vessels, and seals. Green Revolution Cooling is the name given to this procedure.
Racks for immersion cooling are designed specifically so that servers are installed vertically. The immersion tank holds the servers in a dielectric bath and circulates the cooling fluid through a coolant distribution unit (CDU) to remove heat.

Cold-Plates Method

Cold plates provide cooling to infrastructure at a local level by transferring heat to a liquid, which flows to a remote heat exchanger and dissipates it into another liquid. Tubed cold plates consist of an aluminum plate with a copper tube running through it. This method is very efficient and cost-effective at maintaining low temperatures. To keep the device cool, the coolant tube is placed against the device’s base.
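The energy balance of a cold-plate loop is a one-line formula: the heat carried away equals the coolant mass flow times its specific heat times its temperature rise. A minimal sketch, with illustrative figures for a single high-power processor:

```python
CP_WATER = 4186.0   # specific heat of water, J/(kg*K)

def heat_removed_w(mass_flow_kg_s, t_in_c, t_out_c, cp=CP_WATER):
    """Heat carried away by the coolant loop, in watts: m_dot * c_p * dT."""
    return mass_flow_kg_s * cp * (t_out_c - t_in_c)

# A 0.05 kg/s water loop warming from 30 C to 45 C across the cold
# plate absorbs about 3.1 kW -- roughly one high-power processor.
q = heat_removed_w(0.05, 30.0, 45.0)
print(f"{q / 1000:.2f} kW")   # 3.14 kW
```

Sizing the pump and heat exchanger is then a matter of choosing a flow rate and temperature rise that cover the worst-case chip load.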


Standards In Liquid Cooling

Some servers are built to the OCP open accelerator interface (OAI) specification, which includes liquid cooling standards. This has a number of advantages for all methods of liquid cooling.
For starters, it means that other vendors can join the ecosystem confident that their servers will fit into other vendors’ tanks, and that users will be able to mix and match equipment in the long run.
Furthermore, the very existence of a standard should persuade cautious data center operators that the technology is safe to use – if only because standardized systems are tested with all the components that might be used, ensuring that customers can get replacements and refills for years to come.
When it comes to standards, especially in hardware design, it is critical to avoid materials that could be incompatible with the various dielectric fluids, whether Novec or fluorocarbon, mineral oil, or synthetic oil. That is where OCP is currently making a significant contribution.
OCP examines all of these areas, including the safety and compatibility of connectors, as well as the overall physical criteria.
By standardizing and spreading this information, OCP helps the market adopt multiple liquid cooling systems more rapidly.

Pushing The Technology

Vendors of single-phase liquid cooling emphasize the simplicity of their products. Some propellers may be required to move the fluid around in the immersion tanks, although convection does most of the work. There is no vibration produced by bubbling, so companies like GRC and Asperitas say equipment will last longer.
People tend to talk about immersion in a single stroke without distinguishing between single-phase and two-phase. Single-phase is the current mainstream immersion cooling approach, while two-phase allows for increased density and may be able to go further than present units.

Although server manufacturers are beginning to customize their products for liquid cooling, they have only taken the first steps of shedding unnecessary components and packing things closer together. Beyond that, equipment may eventually be designed that would simply not work outside a liquid environment.

Hardware design has not caught up to two-phase immersion cooling. At 3 kW per RU, the OAI server is quite intriguing; however, two-phase tanks have already demonstrated the capacity to cool up to 5.25 kW.

Beyond Measurement


The industry’s efficiency metrics are not well-prepared for the arrival of liquid cooling in large quantities. Power usage effectiveness (PUE), the ratio of total facility power to IT power, has long been used to assess data center efficiency. However, because liquid cooling simplifies the technology, it undermines how that measurement is made.
According to Uptime, “direct liquid cooling solutions achieve a partial PUE of 1.02 to 1.03, beating the most efficient air-cooling systems by a low single-digit percentage.” However, PUE does not account for the majority of DLC’s energy gains.

Because traditional servers have fans that are powered from the rack, their power is included in the “IT power” section of PUE. They are considered to be part of the data center’s payload.

When liquid cooling replaces those fans, it saves energy and improves real efficiency, but because the saved fan power comes out of the “IT power” side of the ratio, the reported PUE can actually look worse.
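This blind spot is easy to show with arithmetic. The numbers below are illustrative, not measurements: removing server fans cuts total energy use, yet the PUE figure rises because the savings land in the IT-power denominator.

```python
def pue(total_facility_kw, it_kw):
    """Power usage effectiveness: total facility power over IT power."""
    return total_facility_kw / it_kw

# Air-cooled baseline: 1000 kW of IT load (including ~80 kW of server
# fans, counted as IT power) plus 300 kW of cooling overhead.
air_it, overhead = 1000.0, 300.0
air_pue = pue(air_it + overhead, air_it)      # 1.30, total 1300 kW

# Liquid cooling removes the 80 kW of server fans. Hold the facility
# overhead fixed to isolate the effect on the metric.
dlc_it = air_it - 80.0
dlc_pue = pue(dlc_it + overhead, dlc_it)      # ~1.33, total 1220 kW

print(f"Air:    PUE {air_pue:.2f}, total {air_it + overhead:.0f} kW")
print(f"Liquid: PUE {dlc_pue:.2f}, total {dlc_it + overhead:.0f} kW")
# Total energy fell by 80 kW, yet the reported PUE got worse.
```

A real deployment would also shrink the cooling overhead, but the example isolates why the fan savings are invisible to PUE.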

However, there is another issue to consider. Even when silicon chips are idle, they heat up and waste energy owing to leakage currents. This is one of the reasons why data center servers consume nearly the same amount of power when idle as when busy, a staggering level of waste that goes unnoticed because the PUE calculation ignores it.

Liquid cooling can create a more regulated environment with reduced leakage currents, which is beneficial. With truly reliable cooling tanks, the electronics may potentially be redesigned to take advantage of this, allowing processors to restart their power-efficiency gains.

That’s a positive thing, but it raises the question of how these gains will be quantified. If the promise of widespread DLC adoption materializes, PUE in its current form may be approaching the end of its usefulness.

Reducing Water

The low PUE is a significant reason why people are choosing two-phase immersion cooling. It has about double the heat rejection capacity of cold plates or single-phase. However, the fact that liquid cooling for data centers does not consume water may prove to be a larger lure.
When situations call for it, such as when the outside air temperature is too high, data centers with traditional cooling systems may often switch on some evaporative cooling. This entails passing chilled water from the data center through a wet heat exchanger that is cooled by evaporation.

On the other hand, without utilizing water, two-phase cooling may reject heat. This could be a consideration for LiquidStack’s most high-profile client: Microsoft. At Microsoft’s Quincy data center, a LiquidStack cooling system is installed beside an older cooling system.

Microsoft has pledged to lower its water use by 95% by 2024 and to become “water-positive” by 2030, creating more clean water than it uses.

One option to achieve this is to run data centers hotter and use less water for evaporative cooling, although switching workloads to liquid cooling without the use of water could also be beneficial. And the only way to get there is to use high-temperature working fluid technology.

Industry Interest

The surge in hot chips was the first hint of the need for high-performance liquid cooling. The semiconductor activity really began around eight to nine months ago. And that was immediately followed by a high amount of interest and participation from the major hardware manufacturers.

Bitcoin mining continues to consume a lot of this capacity, and recent attempts in China to tone down the Bitcoin craze have sent some crypto facilities to regions like Texas, where the climate is simply too hot for air cooling of mining rigs.

However, there are clear indications that customers outside of the expected markets of HPC and crypto-mining are paying attention.

One surprising thing is the pickup in colocation. Colocation was expected to be a laggard market for immersion cooling, as traditional colos do not really drive the hardware specification. Yet a number of projects have now emerged in which colos aim to use immersion cooling technology for HPC applications.

It is also a surprise that some operators are deploying two-phase immersion cooling in self-built data centers and colocation sites, which suggests that hyperscalers are looking to move to the market, perhaps even faster than anticipated.

Edge Cases

The edge is another area where micro-facilities are projected to serve data close to applications. Liquid cooling wins here because it allows for compact systems that do not require air conditioning.

By 2025, a significant amount of data will be generated at the edge. Compactness is becoming more critical as micro data centers and edge data centers proliferate. Single-phase cooling should suffice in this situation, but two-phase clearly has the advantage.

With a single-phase immersion system, you need a fairly large tank because you’re moving the dielectric fluid around. With a two-phase immersion system, you can actually place the server boards within two and a half millimeters of one another.

How Far Will This Go?

It’s evident that liquid cooling will become more prevalent, but how far will it spread over the globe? The short answer is that technology and chipsets will dictate how quickly the market transitions from air to liquid cooling.

Another consideration is whether the technology will be installed in new buildings or retrofitted into existing data centers, because a liquid-cooled system, whether single-phase or two-phase, will be heavier than its air-cooled counterpart.

It’s possible that older data centers were not designed to sustain a high number of immersion tanks.

If you have a three-story data center and your second and third floors were designed for 250 pounds per square foot of floor loading, it could be difficult to deploy immersion cooling on all three floors.

However, because you can dramatically increase the power density per tank, you may not need those second and third stories. You might be able to achieve on the ground-floor slab what would have taken three or four air-conditioned stories.
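A back-of-the-envelope check makes the floor-loading concern concrete. The tank weight and footprint below are illustrative assumptions, not figures from any vendor:

```python
FLOOR_LIMIT_PSF = 250.0    # upper-floor design load, pounds per sq ft

# Assumed figures for a filled immersion tank: hardware, dielectric
# fluid, and the tank itself, spread over its footprint.
tank_weight_lb = 3500.0
footprint_sqft = 10.0

loading = tank_weight_lb / footprint_sqft    # 350 psf
print(f"Tank loading: {loading:.0f} psf vs {FLOOR_LIMIT_PSF:.0f} psf limit")
if loading > FLOOR_LIMIT_PSF:
    print("Exceeds the upper-floor limit: deploy on the ground-floor slab")
```

Under these assumptions the tank overloads a 250 psf upper floor, which is exactly why the ground-floor slab is the natural home for immersion cooling in older buildings.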

Some data centers may eventually have liquid cooling on the concrete slab base of the ground floor, with any remaining air-cooled systems in the upper floors.

New buildings, however, may be built with liquid cooling in mind.

Increased awareness of data center water use could hasten adoption. If additional hyperscalers come out with ambitious water reduction targets like Microsoft has, then liquid cooling adoption will speed even quicker.

If liquid cooling captures a major piece of the market, say 20%, it will trigger an unprecedented transformation. It is difficult to estimate whether that horizon will arrive in five or ten years, but if water scarcity and increasing chip power continue as trends, we will see liquid cooling in more than half of data centers.

Get Your Data Center Liquid Cooling Ready With AKCP

Liquid cooling in a data center does not end with the installation of the system; the more crucial work comes at the next stage. Once the design is complete, operators should monitor the facility consistently. Data can be retrieved from sensors placed throughout the system and facility, letting operators detect potentially dangerous situations ahead of time. A liquid-cooled data center is well served by a dependable monitoring solution like AKCP.

AKCP Monitoring Solutions offers monitoring and alarms for temperatures, flow, pressures, and leak detection, as well as the ability to report into data center management software suites.

Temperature Sensors

A wireless temperature sensor or K-type thermocouple records and measures the temperatures of the server surface, the liquid bath, and the coolant at the coil inlet and outlet.

Power Monitoring Sensor



A power meter that can monitor and record real-time power usage must be used to monitor the cooling unit’s power. The AKCP Power Monitor Sensor provides critical data and allows you to monitor power remotely, obviating the need for manual power audits and delivering instant notifications of possible problems. Power meter readings can also feed live PUE calculations in SensorProbe+ and AKCPro Server to assess the efficiency of power use in your data center. The built-in graphing tool can display data gathered over time by the Power Monitor Sensor.
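As a rough illustration of what a live PUE calculation does with meter readings, here is a minimal sketch. The `read_meter_kw` helper and its meter names are hypothetical placeholders standing in for whatever interface your monitoring platform exposes; this is not an actual AKCP API.

```python
def read_meter_kw(meter_id: str) -> float:
    """Hypothetical stand-in for a monitoring platform's meter read."""
    sample_readings = {"utility-feed": 1150.0, "it-load": 1000.0}
    return sample_readings[meter_id]

def live_pue() -> float:
    """Compute PUE from the latest facility and IT power readings."""
    return read_meter_kw("utility-feed") / read_meter_kw("it-load")

print(f"Live PUE: {live_pue():.2f}")   # 1.15 with the sample readings
```

Polling this on a schedule and graphing the result is all a live PUE dashboard fundamentally does.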

Wireless Pipe Pressure Monitoring

Tank pressure is monitored by an automatic pressure relief valve fitted with a pressure sensor. A digital pressure gauge monitors all kinds of liquids and gases, with remote monitoring via the internet and alerts and alarms when pressures fall outside pre-defined parameters. It can also upgrade existing analog gauges.

Wireless Valve Motor Control

Wireless, remote monitoring, and control of motorized ball valves in your water distribution network. Check status and remotely actuate the valves. Receive alerts when valves open and close, or automate the valve based on other sensor inputs, such as pressure gauges or flow meters.
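The sensor-driven automation described above boils down to a simple control rule: compare a reading against its pre-defined window and actuate the valve accordingly. A minimal sketch with hypothetical thresholds and a placeholder decision function, not the real AKCP control API:

```python
# Pre-defined pressure window, in bar (illustrative values)
PRESSURE_MIN_BAR = 1.5
PRESSURE_MAX_BAR = 4.0

def desired_valve_state(pressure_bar: float, valve_open: bool) -> bool:
    """Return whether the motorized valve should be open, given the
    latest pressure reading and the valve's current state."""
    if not (PRESSURE_MIN_BAR <= pressure_bar <= PRESSURE_MAX_BAR):
        return False   # out of the window: close the valve and alert
    return valve_open  # in the window: leave the valve as it is

print(desired_valve_state(4.6, valve_open=True))   # False: close it
print(desired_valve_state(2.8, valve_open=True))   # True: no change
```

A production system would add hysteresis and alerting around this rule so the valve does not chatter near the thresholds.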
