Liquid immersion cooling relief for ultra-dense data centers


Direct liquid immersion cools servers and paves the way for the next generation of ultra-dense systems. It's not air, and it's not water -- the future is oil.

As data centers get denser, they get hotter. Liquid immersion cooling could replace conventional air cooling methods for greater server and data center density.

Modern servers use a multitude of fans to direct greater volumes of air toward hot components along carefully ducted routes within each enclosure. But fans increase system power consumption, generate significant noise and raise the potential for system problems due to fan failures. While modern processor designs spread the work over larger numbers of low-power processors to mitigate heat, it's clear that air cooling effectively limits the potential density of processors and systems.


Designers are hoping to overcome the limitations of air cooling by directly submerging servers or other systems in a liquid. This direct immersion cooling is an emerging technology poised to revolutionize server and data center design, but there are important tradeoffs to consider. Let's take a closer look at how the immersion cooling data center works, the pros and cons, and the requirements needed to support it.

One big heat sink

Liquids provide an attractive cooling medium because they are much denser than air, and denser mediums generally conduct heat energy better. Circulated chilled water has been a staple of cooling for years in some data center heat exchangers, traveling through cabinet doors to remove heat from internal rack spaces, or moving through metal heat sinks to cool processors directly.

But water and electricity don't mix. Water is both conductive and corrosive, and breaches in the water loop can be devastating to systems and facilities. The prospect of water in the data center has given major operators pause.

"We have had cooling events in some of our regional data centers that resulted in impacts to services," said Tim Noble, IT director and advisory board member from ReachIPS Inc., a cloud IT services provider based in Cupertino, Calif. "Removing liquid from the environment prevents the risks from flooding or leaks."

The trick to immersion cooling lies in the choice of liquid. Ordinary water is replaced with a non-conductive and non-corrosive liquid, such as mineral oil, an engineered fluid like 3M's Novec or a blend of different mineral oils such as Green Revolution Cooling's GreenDEF. A suitable fluid allows direct immersion of hot components (and entire systems) for more effective cooling without damaging components or altering the electromagnetic characteristics of sensitive electronic circuits.

There are two basic approaches to liquid immersion cooling: simple chilling and two-phase cooling. Simple chilling is just what the name implies: servers are completely immersed in a bath of cool liquid. The heat from processors, memory, hard drives and other devices is conducted very effectively into the liquid, which is circulated through an ordinary chiller or other heat exchanger to shed heat and maintain the liquid's temperature. Immersion systems like the CarnotJet from Green Revolution Cooling use this simple and effective process. Some systems circulate coolant through individually enclosed blades rather than immersing the entire rack. One such vendor, LiquidCool Solutions, plumbs the array of enclosed server modules into a common circulation loop.

By comparison, immersion systems like the Immersion-2 from Allied Control employ a two-phase liquid cooling approach. Servers and other hardware reside in a liquid bath, but the preferred non-conductive, non-corrosive liquid has a much lower boiling point -- usually close to 49° Celsius (about 120° Fahrenheit) -- than water or even other immersion media. Heat from the server's processors and other components actually boils the liquid, creating vapor that carries away heat and collects against a chilled coil or other condenser for collection and reuse.

The purported advantage to two-phase cooling is efficiency. First, the liquid does not need to be pumped; circulation is passive and cooler liquid is pulled up as the heated vapor rises. This means no chiller pump is needed to move the mass of immersion fluid, which remains in a containment tank. Heat is removed from the liquid as the vapor condenses on the local condenser, which is cooled through any simple chilled water loop. This consumes far less energy than other designs, yet the liquid-vapor-liquid phase change cycle transfers tremendous amounts of heat energy.
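A rough way to see why the phase change matters: each kilogram of fluid that boils absorbs its full latent heat of vaporization, far more than the same kilogram warming a few degrees in a pumped single-phase loop. The sketch below illustrates the point; the property values are assumptions chosen for illustration (loosely in the range published for engineered fluids), not figures from this article or any vendor:

```python
# Sketch of latent vs. sensible heat transfer in an immersion bath.
# Property values are illustrative assumptions, not vendor data.
LATENT_HEAT_J_KG = 88_000   # assumed heat of vaporization of the fluid
LIQUID_CP_J_KG_K = 1_100    # assumed liquid specific heat

def boiloff_rate_kg_s(heat_w):
    """Mass of liquid vaporized per second to carry away heat_w (two-phase)."""
    return heat_w / LATENT_HEAT_J_KG

def pumped_flow_kg_s(heat_w, delta_t_k):
    """Mass flow a single-phase pumped loop needs at the same load."""
    return heat_w / (LIQUID_CP_J_KG_K * delta_t_k)

if __name__ == "__main__":
    load = 30_000  # 30 kW of IT load
    print(f"two-phase boil-off:      {boiloff_rate_kg_s(load):.2f} kg/s")
    print(f"pumped loop at dT = 5 K: {pumped_flow_kg_s(load, 5):.2f} kg/s")
```

With these assumed properties, passive boil-off moves the same 30 kW with roughly a tenth of the mass flow a pumped loop would need, and without the pump.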

Watching the tide roll

Liquid immersion cooling promises several important benefits for next-generation data centers. The effectiveness of liquid cooling means that server density can far surpass the densities achieved in air-cooled data centers -- without the risk of hot spots caused by uneven, poorly designed or blocked airflow patterns through conventional racks. While current servers are compatible with liquid immersion cooling, the technology has tremendous potential for high-performance computing (HPC) environments that deploy large concentrations of high-end servers -- 30 kW to 100 kW per rack -- where air cooling is impractical.
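To put that density gap in perspective, here is a back-of-the-envelope comparison of the coolant flow needed to carry a 30 kW rack at a 10-degree temperature rise. The fluid properties are rough, generic assumptions (typical room-temperature air, a typical mineral oil), not measurements from any product discussed here:

```python
# Back-of-the-envelope comparison: coolant flow needed to remove a given
# heat load, via Q = rho * V_dot * cp * dT. Properties are assumptions.
AIR = {"density": 1.2, "cp": 1005.0}            # kg/m^3, J/(kg*K)
MINERAL_OIL = {"density": 850.0, "cp": 1900.0}  # assumed typical values

def required_flow_m3_s(heat_w, fluid, delta_t_k):
    """Volumetric flow (m^3/s) to absorb heat_w at a delta_t_k coolant rise."""
    return heat_w / (fluid["density"] * fluid["cp"] * delta_t_k)

if __name__ == "__main__":
    load_w, rise_k = 30_000, 10   # a 30 kW rack, 10 K coolant rise
    air_flow = required_flow_m3_s(load_w, AIR, rise_k)
    oil_flow = required_flow_m3_s(load_w, MINERAL_OIL, rise_k)
    print(f"air: {air_flow:.2f} m^3/s (~{air_flow * 2118.88:.0f} CFM)")
    print(f"oil: {oil_flow * 1000:.2f} L/s")
```

Under these assumptions the oil needs on the order of a thousand times less volumetric flow than air for the same load, which is why immersion scales to rack densities that airflow cannot reach.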




A second benefit is long thermal ride-through. In an air-cooled data center, a cooling system failure or disruption would normally overheat IT equipment in a matter of minutes. Most data centers respond to such situations by migrating workloads or shutting down systems in an orderly manner as long as uninterruptible power supply systems or other backup power lasts. With liquid immersion cooling, the greater thermal mass of the liquid holds far more heat energy and keeps equipment cool for much longer periods in the event of pump failures. Two-phase immersion cooling is passive by nature, requiring no pump for the immersion bath itself.
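The ride-through claim follows directly from the thermal mass. As a minimal sketch, assuming a 500-liter tank of mineral oil (about 425 kg at an assumed 850 kg/m³), an assumed specific heat, and a tolerable 20 K temperature rise:

```python
# How long a coolant reservoir can absorb the IT load after cooling fails.
# Tank size, specific heat and allowed rise are example assumptions.
def ride_through_s(mass_kg, cp_j_kg_k, allowed_rise_k, load_w):
    """Seconds until the coolant warms by allowed_rise_k at a steady load."""
    return mass_kg * cp_j_kg_k * allowed_rise_k / load_w

if __name__ == "__main__":
    # ~500 L of mineral oil (assumed 850 kg/m^3, cp 1900 J/(kg*K)), 30 kW load
    oil_s = ride_through_s(425, 1900, 20, 30_000)
    # the same 500 L volume of air (~0.6 kg, cp 1005 J/(kg*K)), same load
    air_s = ride_through_s(0.6, 1005, 20, 30_000)
    print(f"oil bath: {oil_s / 60:.1f} minutes")
    print(f"same volume of air: {air_s:.2f} seconds")
```

Roughly nine minutes versus a fraction of a second for the same volume; a real room holds far more air than the rack's own volume, but the orders-of-magnitude gap is the point.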

Cost reduction is often cited as a third major benefit of liquid cooling, usually due to a reduction in cooling power expenses, an elimination of computer room air conditioner (CRAC) acquisition and maintenance costs, and simplified server components.

"DLC [direct liquid cooling] suppliers claim up to 40% lower cooling costs," said Andrew Donoghue, European research manager for data center technologies at 451 Research. "This eliminates fans that create noise and vibrations and are often the source of server failures."

But liquid immersion cooling also has a number of drawbacks. Perhaps the biggest issue is that most of today's data center racks don't draw enough power to justify the financial and logistical move from air to liquid cooling -- after all, the CRAC is already bought and running, and today's servers operate just fine with air cooling. The real push for liquid immersion cooling will come with future high-density, high-performance server designs that cannot be adequately air cooled.

Liquid immersion cooling is a highly disruptive data center technology, dramatically changing equipment layouts. Today's data center scalability is based on vertical racks, but with liquid immersion cooling, racks must be placed horizontally into tanks. In addition, the presence of liquid makes adding or exchanging servers messier and more time-consuming.

"It's harder to retrofit existing data centers and harder to service the [submerged] equipment," said Scott Gorcester, CEO of cloud provider VirtualQube in Bothell, Wash. The specter of leaks and spills has also been a major barrier to adoption.

Gorcester and Donoghue also cite potential problems with liquid maintenance and disposal. Even when liquid baths are enclosed, it will be important to filter or clean the liquid to remove environmental and biological contaminants that obstruct liquid flow and possibly even create health hazards.

It's best to use clear, low-viscosity coolants with high heat capacity that are eco-friendly and non-toxic, such as mineral oil, noted David J. Cappuccio, research VP at Gartner. Industry standards do not yet exist for liquid selection, formulation, filtering (cleanliness) or disposal.

Taking the plunge into immersion cooling

Liquid immersion cooling systems are on the market today, but it's important to consider some logistical issues of adoption before embracing this new technology.

Liquid immersion cooling systems will work with current server designs and other data center equipment, so it is possible to adopt immersion cooling in planned stages over time. For example, if a new rack of high-density gear won't fit within the confines of an established containment area, consider a liquid immersion cooling system.

"I see this technology today especially suited for harsh environments and supercomputing applications," said Gorcester.

However, the full benefit of liquid cooling is only realized with servers and racks that cannot be air-cooled. For that reason, many businesses are waiting to deploy liquid cooling until high-density systems are available for next-generation greenfield data center builds.

Any data center retrofit or build to support liquid immersion cooling should include a structural analysis to ensure that the floor loading accommodates the added weight of the liquid in addition to the racks and IT gear. Installation will also require pumps, chillers, filters, piping and sensors, so work closely with the cooling vendor to ensure that liquid cooling installations still meet proper data center design practices. And be aware that liquid immersion uses horizontal racks -- not vertical racks. This will affect the way gear is arranged in the data center.
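For the structural analysis, a quick first-pass estimate of the tank's distributed floor load is easy to sketch. The tank dimensions, oil density and gear weights below are invented example values, not specs from any vendor mentioned here; a real assessment needs the vendor's figures and a structural engineer:

```python
# Rough floor-loading estimate for an immersion tank.
# Dimensions, densities and gear weights are example assumptions.
OIL_DENSITY_KG_M3 = 850.0   # assumed mineral-oil density
G = 9.81                    # gravitational acceleration, m/s^2

def liquid_mass_kg(length_m, width_m, depth_m, density=OIL_DENSITY_KG_M3):
    """Mass of coolant in a rectangular tank."""
    return length_m * width_m * depth_m * density

def floor_load_kpa(total_mass_kg, footprint_m2):
    """Distributed load the filled tank places on the floor, in kilopascals."""
    return total_mass_kg * G / footprint_m2 / 1000.0

if __name__ == "__main__":
    oil = liquid_mass_kg(2.4, 1.2, 0.6)   # example tank with 0.6 m of oil
    total = oil + 800 + 400               # + assumed IT gear + tank shell
    print(f"coolant mass: {oil:.0f} kg")
    print(f"floor load:   {floor_load_kpa(total, 2.4 * 1.2):.1f} kPa")
```

Even this toy example puts well over a tonne of liquid on one footprint, which is why the article's advice to verify floor loading before installation matters.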

Current servers use fans, heat sinks and complicated ductwork to direct airflow across processors and memory components. Although servers are entirely compatible with liquid immersion cooling, all of those air-cooling components must be removed before immersing the server.

"You can put off-the-shelf gear in liquid if you have spare time to remove the heat sinks and clean up the [thermal] grease, but the major benefits come when you build really dense equipment," said Alex Kampl, VP of engineering at Allied Control Limited.

Solid state disk devices can be immersed, but conventional magnetic disk drives often include a pressure-equalizing port, which may pose problems for immersion. Disks may need to be removed and relocated to an external (dry) storage subsystem or storage-area network unless they are specifically rated or tested for immersion.

Optical cabling can pose another problem. While electrical cables like Category 6a Ethernet cables work fine when immersed, the presence of liquid can change the index of refraction in an optical interface (where a fiber cable and port meet), and this may diminish network performance -- even a small effect can impair 10/40/100 GigE backbone connections, high-performance Fibre Channel connections or other ports where fiber cabling is needed.

Review the system warranties and maintenance contracts before you reach for that screwdriver. Many vendors resist equipment retrofits and refuse to honor service agreements for submerged systems. After all, each server is designed, built, tested and shipped a certain way. Pulling out the cooling parts and sinking the system in liquid probably hasn't been tested (or even envisioned) by the vendors. While those very same vendors may indeed produce systems designed for immersion in the future, it's important to verify (in writing) that they will continue to service the current systems after retrofit and immersion. If not, you may need to forestall adoption until high-density next-generation systems become available.

And never forget about the potential for leaks and mess. Direct liquid immersion cooling changes the way that IT professionals work with system hardware. Sure, the liquid is non-toxic and environmentally friendly, but nobody wants to spend a day at the office covered in mineral oil. And it's a bad idea to pull a server out of a liquid bath only to have it drain all over the floor while technicians try to work on it. You'll need protective clothing, workspaces that can retain residual liquid (for example, workbenches with lips along the edges), and cleanup gear for spills. Liquid cooling systems should incorporate double tanks to contain leaks, and comprehensive monitoring should detect and report even minor drips in the facility.

Don't hold your breath

The demand for more computing from fewer systems drives more processor sockets and memory onto high-end servers. As those servers and racks become too dense to cool with air, vendors will introduce immersion cooling to at least some models, and this will open the figurative floodgate for liquid immersion cooling in the data center.

Direct immersion liquid cooling systems are currently available, and they can be integrated into existing data centers with air-cooled servers. But adoption is extremely slow, considering the existing heavy investment in mechanical cooling methodologies and continually improving power efficiency of modern systems, which has slowed the upward spiral in data center heat densities. Consequently, analysts are reluctant to comment on adoption timeframes. But the potential benefits are enticing, and Donoghue's mid-2014 research expects more vendor entrants into the direct liquid cooling market in the next 18 months.

This was first published in October 2014
