
The showers and baths keeping data centre tech cool



Chris Baraniuk, Technology Reporter

Data centres can't function without cooling systems (The Washington Post via Getty Images)

They work 24/7 at high speeds and get searingly hot, but data centre computer chips get plenty of pampering. Some of them basically live at the spa.

"We'll have fluid that comes up and [then] shower down, or trickle down, onto a component," says Jonathan Ballon, chief executive at liquid cooling firm Iceotope. "Some things will get sprayed."

In other cases, the industrious gizmos recline in circulating baths of fluid, which ferries away the heat they generate and enables them to run at higher-than-standard speeds, a practice known as "overclocking".

"We have customers that are overclocking at all times because there is zero risk of burning out the server," says Mr Ballon. He adds that one client, a hotel chain in the US, is planning to use heat from hotel servers to warm guest rooms, the hotel laundry and swimming pool.

Without cooling, data centres fall over. In November, a cooling system failure at a data centre in the US sent financial trading tech offline at CME Group, the world's largest exchange operator. The company has since put in place additional cooling capacity to help protect against a repeat of this incident.

Demand for data centres is currently booming, driven partly by the growth of AI technologies. But the huge amounts of energy and water that many of these facilities consume make them increasingly controversial. More than 200 environmental groups in the US recently demanded a moratorium on new data centres in the country.

Some data centre firms say they want to reduce their impact, and they have another incentive. Data centre computer chips are becoming increasingly powerful. So much so that many in the industry say traditional cooling methods, such as air cooling, where fans constantly blow air over the hottest components, are no longer sufficient for some operations.

Mr Ballon is aware of rising controversy around the construction of energy-devouring data centres. "Communities are pushing back on these projects," he says. "We require significantly less power and water. We don't have any fans whatsoever – we operate silently."

Iceotope says its tech can cut the cost of cooling by up to 80% (Iceotope)

Iceotope says its approach to liquid cooling, which can soothe multiple components in a data centre, not just the processing chips, may reduce cooling-related energy demands by up to 80%. The company's technology uses water to cool down the oil-based fluid that actually interacts with the computer tech. But the water remains in a closed loop, so there is no need to continually draw more of it from local supplies.

I ask whether the oil-based fluids in the firm's cooling system are derived from fossil fuel products, and Mr Ballon says some of them are, though he stresses that none contain PFAS, also known as "forever chemicals", which are harmful to human health.
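
That two-loop arrangement can be pictured as a pair of coupled circuits: an oil-based dielectric loop that touches the electronics, and a sealed water loop that carries the heat out of the building. A minimal sketch of the idea in Python, using hypothetical temperatures and heat-exchanger behaviour rather than Iceotope's actual design parameters:

```python
# Sketch of a two-loop liquid cooling arrangement: an oil loop touches the
# servers, a closed water loop hauls the heat away. All numbers illustrative.

def transfer(source_temp: float, sink_temp: float, effectiveness: float = 0.5) -> float:
    """Heat exchanger: move a fraction of the temperature difference."""
    return effectiveness * (source_temp - sink_temp)

oil_temp, water_temp, ambient = 45.0, 30.0, 20.0  # degrees C
chip_heating = 5.0  # degrees C added to the oil per cycle by the servers

for step in range(3):
    oil_temp += chip_heating                      # servers heat the oil
    dt = transfer(oil_temp, water_temp)           # oil-to-water exchanger
    oil_temp -= dt
    water_temp += dt
    water_temp -= transfer(water_temp, ambient)   # water rejects heat outside
    print(f"step {step}: oil {oil_temp:.1f} C, water {water_temp:.1f} C")

# The same water recirculates every cycle; none is drawn from local supplies.
```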

Some liquid-based data centre cooling technologies use refrigerants that do contain PFAS. Not only that, many refrigerants are themselves highly potent greenhouse gases, which threaten to exacerbate climate change.

Two-phase cooling systems use such refrigerants, says Yulin Wang, a former senior technology analyst at IDTechEx, a market research firm. The refrigerant starts out as a liquid, but heat from server components causes it to evaporate into a gas, and this phase change soaks up a lot of energy, making it an effective way of cooling things down.

In some designs, data centre tech is fully immersed in large quantities of PFAS-containing refrigerant. "Vapours can get out of the tank," adds Mr Wang. "There could be some safety issues." In other designs, the refrigerant is piped directly to only the hottest components, the computer chips. Some companies that offer two-phase cooling are currently switching to PFAS-free refrigerants.
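
To see why a liquid-to-gas phase change is such an effective heat sink, compare the energy absorbed by simply warming a fluid with the latent heat absorbed when it evaporates. A rough back-of-envelope sketch, with approximate property values chosen for illustration rather than figures from the article:

```python
# Why a phase change "soaks up a lot of energy": evaporating a fluid absorbs
# its latent heat of vaporisation, typically far more than merely warming
# the liquid by a few degrees. Property values below are approximate.

mass_kg = 1.0

# Sensible heating: warming liquid water by 10 C.
SPECIFIC_HEAT_WATER = 4.18  # kJ/(kg*K), approximate
sensible_kj = mass_kg * SPECIFIC_HEAT_WATER * 10.0  # ~42 kJ

# Phase change: evaporating the same mass of an engineered dielectric
# refrigerant. Latent heats of roughly 100-200 kJ/kg are typical for such
# fluids (water's, for comparison, is ~2260 kJ/kg).
LATENT_HEAT_REFRIGERANT = 150.0  # kJ/kg, illustrative value
phase_change_kj = mass_kg * LATENT_HEAT_REFRIGERANT  # 150 kJ

print(f"Warming water by 10 C absorbs    {sensible_kj:.0f} kJ/kg")
print(f"Evaporating refrigerant absorbs  {phase_change_kj:.0f} kJ/kg")
```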

Yulin Wang warns of safety issues with some cooling chemicals (Yulin Wang)

Over the years, firms have experimented with wildly different approaches to cooling, in a race to find the best means of keeping data centre gadgets happy. Microsoft famously sank a tube-like container full of servers into the sea off Orkney, for example. The idea was that cold Scottish seawater would improve the efficiency of the air-based cooling systems inside the device.

Last year, Microsoft confirmed that it had shuttered the project. But the company had learned much from it, says Alistair Speirs, general manager of global infrastructure in the Microsoft Azure business group. "Without [human] operators, less things went wrong – that informed some of our operational procedures," he says. Data centres that are more hands-off appear to be more reliable.

Initial findings showed the subsea data centre had a power usage effectiveness, or PUE, rating of 1.07, suggesting it was far more efficient than the vast majority of land-based data centres. And it required zero water. But in the end, Microsoft concluded that the economics of building and maintaining subsea data centres weren't very favourable.

The company is still working on liquid-based cooling ideas, including microfluidics, where tiny channels of liquid flow through the many layers of a silicon chip. "You can think of a liquid cooling maze through the silicon at nanometre scale," says Mr Speirs.
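
The PUE figure is straightforward to unpack: it is a facility's total energy draw divided by the energy that actually reaches the computing equipment, so 1.07 means only about 7% overhead went on cooling and other support systems. A minimal sketch of the arithmetic, using illustrative power figures rather than any reported by Microsoft:

```python
# Power usage effectiveness (PUE) = total facility power / IT equipment power.
# A PUE of 1.0 would mean zero overhead; everything above 1.0 is cooling,
# power distribution and other losses. Figures below are illustrative only.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Return the power usage effectiveness ratio."""
    return total_facility_kw / it_equipment_kw

# The subsea data centre's reported PUE of 1.07 implies ~7% overhead:
print(pue(1070.0, 1000.0))  # 1.07
# Many conventional facilities run closer to 1.5, i.e. ~50% overhead:
print(pue(1500.0, 1000.0))  # 1.5
```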