Henkel Adhesive Technologies

Your data center is under water and hotter than ever


Anyone who’s jumped into a cold pool on a hot day understands the effectiveness of liquid cooling. It is certainly not a new concept: automobile radiators, one of the first industrial examples of liquid cooling, have existed for nearly 125 years. Nor is liquid cooling – among other temperature-reduction methods – a novel idea for the data center, where uncontrolled heat from servers, high-powered processors and myriad other electronics can cause serious problems. While data center liquid cooling has so far been limited largely to specific high-heat systems, the compute- and data-intensive workloads of new server configurations continue to drive temperatures higher, increasing interest in more broadly applied liquid thermal management approaches.

[Image: a data center]

The physical design of the data center prioritizes heat reduction because, for electronic systems, high temperature is a performance killer and a potential system destroyer. According to some industry recommendations, the absolute environmental temperature limit of the data center is 82 °F, while the ideal range falls somewhere between 73 °F and 75 °F. [1] As you can imagine, that makes for a big air conditioning bill, not to mention a significant drain on the power grid. Keeping server systems (and other electronics) within tight temperature limits is therefore critical not only for the uninterrupted storage and processing of data, but for more energy-efficient, lower-cost operations as well.

In addition to active air cooling (air conditioning, fans, etc.) and optimized structural designs such as raised floors that maximize airflow, cooling electronic systems – particularly server racks – from the inside out is equally vital. Thermal interface materials (TIMs), heat sinks and liquid cooling systems within high-density server electronics are the primary methods for achieving in-application temperature reduction. Across all of these approaches, massive innovation is underway, as the data volumes and speeds driven by AI, data mining and analytics only get more intense – and hotter.

[Image: line card heat sink with mTIM]

Go with the Flow, Only Faster

Liquid cooling in the data center takes many forms – from cooling pipes attached to cooling plates or chassis between PCBs/modules, to full immersion cooling systems that submerge entire racks. The idea is relatively straightforward: a liquid coolant (water or a dielectric coolant) circulates through pipes or other structures, cooling the metal interface that acts as a heat collector for high-performance computing chips. In the case of immersion cooling, components are completely submerged [2] in tanks, where a dielectric coolant that does not harm the components circulates to reduce operational heat. While cooling plates for server boards/racks and immersion cooling have been used only in select areas of the data center, this temperature control method is projected to grow at a 20% CAGR from 2022 to 2028 [3] as data volume and intensity continue to rise.
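
To put liquid’s appeal in perspective, the heat a coolant loop carries away scales with mass flow rate, specific heat and temperature rise. The figures below are illustrative assumptions, not Henkel data:

$$ Q = \dot{m} \, c_p \, \Delta T $$

For water ($c_p \approx 4186\ \mathrm{J/(kg \cdot K)}$) flowing at an assumed $0.05\ \mathrm{kg/s}$ and warming by $10\ \mathrm{K}$, a single loop removes roughly $0.05 \times 4186 \times 10 \approx 2.1\ \mathrm{kW}$. Air’s volumetric heat capacity is more than three orders of magnitude lower than water’s, which is why air cooling runs out of headroom as rack power density climbs.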

[Image: a data center room]

Data center operators are driven not only to optimize performance through heat reduction, but also to run more sustainable, energy-efficient data factories. There is only so much air cooling can achieve when power densities are high. Liquid cooling extends the temperature-reduction equation, and does so with recirculated (i.e., waste-reducing) water or coolants that are highly effective, significantly lowering the drain on power. The question is: can the positive effects of liquid cooling be made even more substantial?

In the past, attempts have been made to further improve the heat dissipation of liquid cooling systems by placing TIMs between the component and the metal liquid cooling pipes/plates/chassis, accelerating heat transfer across what would otherwise be metal-to-metal contacts. The idea has merit. Unfortunately, the available thermal interface materials – pads, adhesives, gels and liquids – are either unsuitable for the application (gels and liquids) or cannot withstand the friction (pads and adhesives) induced when pluggable components are inserted into the housing and/or the PCB is slid into partitioned, liquid-cooled plate structures. The materials are pushed or scraped off and essentially become ineffective.
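
The underlying physics is easiest to see as a thermal resistance calculation. Two nominally flat metal surfaces touch only at microscopic high points, leaving air-filled gaps, and air conducts heat poorly ($k \approx 0.026\ \mathrm{W/(m \cdot K)}$). The conductive resistance of any gap or layer is

$$ R_{th} = \frac{t}{k \, A} $$

where $t$ is thickness, $k$ is thermal conductivity and $A$ is contact area. Using illustrative rather than measured values: a $10\ \mu\mathrm{m}$ air gap over $1\ \mathrm{cm^2}$ contributes $R_{th} \approx 3.8\ \mathrm{K/W}$; fill that same gap with a TIM of an assumed $k = 3\ \mathrm{W/(m \cdot K)}$ and it falls to roughly $0.03\ \mathrm{K/W}$ – a hundredfold improvement, but one that survives only if the material stays in place.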

[Image: thermal grease material on a server rack]

However, recent TIM innovation is showing potential as a liquid cooling enhancement. Proven to deliver noteworthy heat reduction for transceiver pluggable optical modules (POMs), a durable micro-thermal interface material (mTIM) applied in an ultra-thin layer accelerates heat dissipation by providing a thermally conductive interface superior to metal-to-metal heat transfer. Currently employed for OSFP 400 GbE POMs in data centers, this solution has demonstrated significant temperature reductions versus a metal-to-metal interface, and the collective effect of that per-POM reduction (one line card can hold up to 32 POMs) is even more substantial. While investigation of mTIM’s performance in both piped and immersion liquid cooling designs is at an early stage, the properties of the durable coating – compatibility with multiple metals, an ultra-thin 25 µm (±5 µm) layer and high durability – suggest it may offer considerable cooling acceleration for liquid cooling structures.
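
Applying the same $R_{th} = t / (k A)$ relationship to the stated $25\ \mu\mathrm{m}$ coating gives a sense of why such a thin layer matters. Assuming, purely for illustration, a conductivity of $3\ \mathrm{W/(m \cdot K)}$ (the source does not quote a figure), a $25\ \mu\mathrm{m}$ layer over $1\ \mathrm{cm^2}$ adds only about $\frac{25 \times 10^{-6}}{3 \times 10^{-4}} \approx 0.08\ \mathrm{K/W}$ of resistance, while displacing the far larger air-gap resistance of a bare metal-to-metal joint.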

As data center server racks get hotter and even the most effective liquid cooling solutions are pushed to their limits, thermal control innovations like mTIM may help enable even more sustainable, high-performance operations.

[Image: transparent transceiver with mTIM]

Resources

  • Phase change interface materials for next-gen data center ICs

    This case study looks at how a low-pressure, low thermal impedance, phase change thermal interface material provides a much-needed solution for next-gen data center ICs.

  • Durable TIM coating reduces heat and improves data center switch performance

    This case study looks at how a durable, thin thermal interface coating reduces heat and improves data center switch performance.

  • Heat dissipating gel for 5G infrastructure systems

    This case study looks at how an environmentally stable, high thermal conductivity, heat-dissipating gel delivers critical cooling for 5G infrastructure systems.

  • BERGQUIST® LIQUI-BOND® delivers efficient solution for data center power supply

    Learn how a manufacturer of an AC/DC power supply leverages a robust thermal management solution for its compact design.

  • Automation-friendly liquid gap filler delivers on thermal control

    Learn how a manufacturer of a power converter used thermal management materials to create a more efficient product.

  • The heat is on

    Today, network performance, reliability and durability are critical to datacom and telecom performance around the world. And when network performance is largely determined by power and cooling, the role of thermal management is only going to increase.

  • Look small to go big

    In today’s world of unprecedented network and infrastructure expansion, the need for increased performance and stability is accelerating. This rapid expansion is further challenged by the need to process more data at faster speeds while also accommodating emerging technology developments.

  • The 2023 data center pulse report

    With an insatiable demand for faster networking speeds and throughput performance within the data center, 800 Gigabit Ethernet (GbE) is gaining momentum as the next big trend in networking, providing capacity for ever-growing customer demands.

  • The 2024 data center pulse report

    The influence of innovation and technology on the need to transition from 800G to 1.6T.