From niche to mainstream: can direct liquid cooling make the jump?

Nov 1, 2018

Liquid cooling has historically been reserved for R&D supercomputers and mainframes, yet increasing demands, heavier workloads and the rise of machine learning are forcing data centre managers to rethink the technology. An announcement from Google earlier this year once again raised the question of when direct liquid cooling will become more prominent in the data centre engineering community.

The Alphabet company said its Tensor Processing Unit (TPU) chips are so powerful that, for the first time, it would have to introduce liquid cooling into its data centres.

Liquid cooling has been promising to take over from air as the king of data centre cooling for some time. Yet air-based cooling (typically air-to-water-to-air) continues to provide the lion’s share of cooling applications.

Steven Hammond, director of computational science at the US National Renewable Energy Laboratory (NREL), believes a lack of standardisation could be responsible.

“At the moment, it’s all a bit ad hoc how the industry does direct liquid cooling,” he says. “It would be nice if the solutions were a bit more standardised. Right now, it feels like a ‘mom and pop’ type industry, with many solutions being one-offs.

“There’s a natural technology maturation process that goes on and we’re starting to see some bigger players. There’s the cost aspect too: we have to get the cost to cool per server down, because the margins in this are so low.”

Global ambition

NREL, funded by the US Department of Energy, set out in 2006 to build the most energy-efficient data centre in the world. Direct liquid cooling was chosen as part of a “holistic energy efficient design”. The facility was commissioned in 2012, and the results speak for themselves.

With an annual average power usage effectiveness (PUE) of 1.04, the Colorado facility opened with 1MW of IT load capacity, and plans are underway to double this. Furthermore, as the building recycles waste heat, NREL claims the set-up saves a whopping US$1 million per year in energy costs compared with an equivalent air-cooled installation.
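PUE is simply total facility energy divided by the energy delivered to the IT equipment, so a figure of 1.04 means only around 4% of the site’s power goes on cooling and distribution overhead. The sketch below gives a rough sense of why that matters; the air-cooled comparison value of 1.8 and the electricity tariff are illustrative assumptions, not NREL figures.

```python
# Illustrative only: what a PUE of 1.04 means in energy terms.
# The air-cooled comparison PUE of 1.8 and the $0.07/kWh tariff are
# assumptions for this sketch, not NREL figures.
IT_LOAD_KW = 1_000        # 1MW of IT load, as at the NREL facility
HOURS_PER_YEAR = 8_760

def annual_facility_energy_kwh(pue: float) -> float:
    """PUE = total facility energy / IT energy, so facility energy = IT energy * PUE."""
    return IT_LOAD_KW * HOURS_PER_YEAR * pue

saved_kwh = annual_facility_energy_kwh(1.8) - annual_facility_energy_kwh(1.04)
print(f"Overhead energy avoided per year: {saved_kwh:,.0f} kWh")
print(f"Rough saving at $0.07/kWh: ${saved_kwh * 0.07:,.0f}")
```

On those assumed numbers the overhead saving alone runs to several hundred thousand dollars a year; NREL’s US$1 million figure also reflects the value of the recycled waste heat.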

In terms of the cooling technology, multiple vendors bid on the project, but Hewlett Packard Enterprise won with its Apollo 8000 system, which uses warm-liquid cooling. Called Peregrine, the installation was the first of its type for the company, and the system has petascale computing capability, with a peak performance of 2.25 quadrillion calculations per second.

Direct liquid cooling testbed

Aside from serving NREL’s own computing needs, the facility has also acted as a ‘test hub’ for vendors, with liquid cooling solutions from five different suppliers now active, including cold plates and clam-shell immersion cooling. These include technology from Asetek, Aquila, LiquidCool Solutions and MotivAir chilled doors.

The systems can be run side by side, and visitors, including potential customers, can see the technologies in action. NREL’s role is to provide objective feedback on their effectiveness: how well they work and what percentage of heat is captured to liquid.

Looking ahead, Hammond believes the evolution of computing demands will help drive the adoption of liquid cooling solutions.

“Whether it’s cryptocurrency or machine learning, the proliferation of GPU systems is rapid and they’re very power dense. Where this is headed is that liquid cooling will become more commonplace, out of necessity,” he says.

Going modular

British company Iceotope believes that direct liquid cooling can no longer be seen as a niche technology.

With a strong story, not to mention a high-performance computing (HPC) flagship installation at the University of Derby, this raises the question of why such liquid cooling technologies have not become commonplace.

“The fundamental job description of most of the people who run and manage data centres is don’t screw up,” says David Craig, CEO of Iceotope. “Can you think of more terrifying words for somebody who’s risk-averse than ‘disruptive technology’?”

He believes that wider education is needed in the industry on the benefits of direct liquid cooling.

“There is definitely an educative part of the process, there’s no doubt about that,” he says. “This will also tie in with Corporate Social Responsibility (CSR) basics. The industry is using more energy and more water. There will be a point when there’s a legislative shift; if we don’t change ourselves, we will be forced to change whether we like it or not.”

Iceotope’s Ku:l system provides a “non-submersive immersion” technology: a sealed 1U chassis houses the electronics and uses a dielectric coolant which the company says is “non-oily, non-solvent, non-toxic, leaves no residue and does not affect the electronics”.

Without the need for chilled air, industrial fans or pumps, the direct cooling system runs silently, according to the organisation. A gentle recirculating coolant flow inside the sealed chassis enables the harvested heat to be transferred to a building circuit via a plate heat exchanger.
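As a rough illustration of the heat-transfer arithmetic involved (generic physics rather than Iceotope figures; the flow rate and temperature rise below are assumed values), the heat a water loop can carry to the building circuit is set by its flow rate and temperature rise.

```python
# Generic heat-balance sketch: Q = m_dot * c_p * delta_T.
# The flow rate and temperature rise are illustrative assumptions, not vendor data.
SPECIFIC_HEAT_WATER = 4_180    # J/(kg*K), approximate
flow_rate_kg_per_s = 0.06      # roughly 3.6 litres per minute through the loop
temperature_rise_k = 15        # water returns 15 degrees C warmer than it left

heat_removed_w = flow_rate_kg_per_s * SPECIFIC_HEAT_WATER * temperature_rise_k
print(f"Heat carried to the building circuit: {heat_removed_w / 1000:.1f} kW")  # ~3.8 kW
```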

The CEO sees hyperscale data centre developments, as well as edge, as potential growth areas for the business.

SmartPod and synthetic liquid

Diarmuid Daltun from Submer Technologies agrees that education is needed in the industry but says he is witnessing a change.

“People don’t know what they’re afraid of. Early applications of direct liquid cooling used mineral oil, which was bad for the hardware in the long term. Earlier experiences of the technology faced several challenges.”

He adds: “Furthermore, the pain point with air cooling has not been large enough to get people to change.”

Submer Technologies was launched in March 2018 from an R&D project. The company has made rapid progress and now reports a 20-strong order book, with interest stretching from Asia, including Vietnam, Hong Kong and Singapore, through to Europe, where the UK, Italy and Switzerland are among the early adopters.

A “synthetic liquid” is used in the SmartPod process, which he says is “specially crafted” for this application. The synthetic dielectric coolant doesn’t conduct electricity, presenting zero risk of electrocution. Specific additives are added to the liquid to help prevent the destabilization or loss of electrons. Part of the company’s pitch is that you “could submerge multiple toasters, hairdryers, stick both your hands in the liquid and only enjoy the moisturising effect”.

Daltun says another factor that historically held back the switch from air to liquid was that operators cared less about efficiency. Now, however, he says PUE can improve from around 2.5 down to 1.03 by making the switch to immersion liquid cooling; in other words, cooling and power overhead falls from one and a half times the IT load to roughly 3% of it.

Looking ahead, Daltun believes supercomputing will provide further traction and predicts these applications will opt for immersion-based direct liquid cooling.

Losing the leaks

Elsewhere, other technologies are gaining traction. In September, German company Cloud & Heat secured a €10 million investment from ETF Partners. Established in 2010, the Dresden-based company has developed a modular, water-cooled ‘data centre in a box’. The process uses direct-to-chip cooling: warm water is circulated through pipes and heatsinks attached directly to the CPUs and RAM, the hottest elements of the server, and the heated water can then be reused for heating.

Another noteworthy technology is Californian company Chilldyne’s direct-to-chip approach to liquid cooling, which uses a Cool-Flo pistonless pump to deliver coolant under negative pressure. It is this negative-pressure element which the company believes addresses another historic concern with liquid cooling: leakage.

Chilldyne says that if a tube is severed, air will flow into the system instead of coolant leaking out. A demonstration video on the company’s website shows a fluid line being cut with pliers: instead of flowing out, the coolant simply retracts back up the line.

Common interface

Such innovations show that progress is being made in direct liquid cooling, whether through cold plates, immersion or direct-to-chip approaches. NREL’s director of computational science encourages these developments but strongly believes a common requirement for connectors is needed.

He likens the situation to getting into your car and “not knowing which side the steering wheel is on”.

“We need standardisation at the rack level – as data centre owners and operators, let’s decide on a common interface. Then all we must think about is how many connections there are and where they sit, as opposed to what types of connections and what’s going where. That would allow the facility to plan for that, and then allow the computing industry and the companies selling cooling solutions to innovate, so long as there’s an agreed-upon interface.”

UK company Future-tech has worked on and provided mechanical and electrical (M&E) infrastructure designs for several direct liquid cooling projects and applications since 2012.

“The technology has clear thermodynamic advantages when compared to air-based systems,” says James Wilman, CEO of Future-tech. “However, air provides a simple and universal interface. On top of this, the cost per kW delivered to the IT equipment is a vitally important metric, and while there is cost and interface uncertainty, the switch from air is much more difficult, particularly for wholesale co-location providers.”

Wilman adds: “Ultimately, any switch will be client-led and will start in owner-operator facilities. Direct liquid cooling’s acceptance into the mainstream is almost certainly going to be linked with its adoption by the hyperscalers. Once these organisations are successfully deploying the technology into their own facilities, at scale, it will open the door and put pressure on their co-location supply chain to facilitate the technology. Even then, the crossover is likely to take many years before direct component cooling becomes the norm.”

According to the CEO, co-location facility designers “need to think about buildings and infrastructure fabrics that are able to accommodate a shift to direct liquid in the future, while still providing a cost-effective air to air platform today”.
