Henkel Adhesive Technologies

What’s holding AI back? Power and heat

AI accelerators drawing over a kilowatt each push rack power beyond 60 kW, making power delivery and cooling the real challenges. Traditional air cooling and power rails can’t keep up. Liquid cooling, high-voltage distribution, and advanced materials are essential to sustain performance and reliability. True AI power lies in the infrastructure supporting the chips, not just the chips themselves.

Tom Sicilian
Key Account Manager - Data/Telecom

5 min.

AI accelerators like NVIDIA’s H100 and GB200 are pushing power consumption to unprecedented levels—700 to 1,200 watts per chip, with the upcoming GB300 reaching 1,400 watts. In dense deployments like the GB200 NVL72 rack—housing 72 GPUs across 18 compute trays—rack-level power demands can exceed 60 kW.
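A quick back-of-envelope check makes the scale concrete. The GPU count and per-chip wattage in the sketch below come straight from the figures above; the overhead allowance for CPUs, NICs, switches, and fans is an assumption added purely for illustration.

```python
# Rough rack-power estimate: GPU figures from the article, overhead is an assumption.
GPUS_PER_RACK = 72          # 72 GPUs across 18 compute trays in an NVL72-class rack
WATTS_PER_GPU = 1_200       # upper end of the 700-1,200 W range cited above
OVERHEAD_FRACTION = 0.25    # assumed allowance for CPUs, NICs, switches, fans/pumps

gpu_power_kw = GPUS_PER_RACK * WATTS_PER_GPU / 1_000
rack_power_kw = gpu_power_kw * (1 + OVERHEAD_FRACTION)

print(f"GPU power alone:      {gpu_power_kw:.0f} kW")   # ~86 kW from GPUs alone
print(f"Estimated rack power: {rack_power_kw:.0f} kW")  # well beyond 60 kW
```

Even before counting cooling and conversion losses, the GPUs alone push well past the 60 kW mark.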

Legacy air cooling, originally designed for 200–300 W CPUs, is struggling to keep pace, especially in dense 1U and 2U server formats common at the edge and in telecom deployments. Limited airflow and fan capacity cause thermal gradients, hotspots, and throttling, reducing reliability and uptime.

Liquid cooling has evolved from a niche technology to a necessity. Direct Liquid Cooling (DLC) using cold plates reduces thermal resistance and supports higher densities. Immersion cooling is emerging in hyperscale AI clusters, pushing racks beyond 100 kW with power usage effectiveness (PUE) below 1.2. These liquid solutions lower thermal deltas by 10–15°C compared to air cooling, which is critical for preserving hardware performance and longevity.
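Since the article cites PUE below 1.2, a minimal numeric sketch of the metric may help: PUE is total facility power divided by IT equipment power. The rack size and overhead figures below are illustrative assumptions, not measurements from any particular deployment.

```python
# PUE = total facility power / IT equipment power.
# Illustrative assumptions only, to show what "PUE below 1.2" implies.
it_power_kw = 100.0          # e.g. one ~100 kW immersion-cooled rack
overhead_kw = 18.0           # assumed cooling, power conversion, and lighting load

pue = (it_power_kw + overhead_kw) / it_power_kw
print(f"PUE = {pue:.2f}")    # 1.18 -> under the 1.2 figure cited above
```

In other words, a PUE under 1.2 means less than 0.2 W of cooling and distribution overhead for every watt delivered to the IT hardware.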

Power delivery is equally challenging. Modern GPUs require over 600 amps at voltages below 1 V, placing extreme demands on onboard voltage regulator modules (VRMs) that must maintain over 90% efficiency under rapid transient loads. Power delivery networks (PDNs) must use low-inductance interconnects and effective decoupling to control voltage droop and ripple.
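To see why those numbers are so demanding, consider a simple Ohm's-law sketch. The power, voltage, and PDN resistance values below are assumptions chosen to match the ranges mentioned above, not specifications for any particular GPU.

```python
# Why sub-1 V rails mean huge currents, and why PDN resistance and inductance matter.
# All values are illustrative assumptions, not specs for a specific device.
core_power_w = 600.0               # assumed power drawn on the core rail
core_voltage_v = 0.85              # assumed core voltage, below 1 V

current_a = core_power_w / core_voltage_v     # I = P / V  ->  ~706 A
pdn_resistance_ohm = 100e-6                   # assumed 0.1 mOhm of PDN resistance
droop_v = current_a * pdn_resistance_ohm      # resistive droop alone, ignoring L*di/dt

print(f"Rail current:    {current_a:.0f} A")
print(f"Resistive droop: {droop_v * 1_000:.0f} mV "
      f"({droop_v / core_voltage_v:.0%} of the rail)")
```

Even a tenth of a milliohm in the delivery path eats several percent of the rail voltage, which is why low-inductance interconnects and tight decoupling are non-negotiable.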

At the rack level, many operators outgrow traditional 12 V or 48 V power rails. Hyperscalers are evaluating ±400 VDC distribution to reduce current, minimize cable bulk, and free internal chassis space. This transition affects PSU design, insulation standards, and EMI management, presenting new design frontiers.
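The appeal of higher-voltage distribution follows from the same arithmetic. The sketch below compares an assumed 100 kW rack fed at 48 V versus a nominal ±400 VDC bus (treated here as 800 V rail-to-rail, which is an assumption about the architecture), over an arbitrary 1 mΩ distribution path.

```python
# Higher distribution voltage -> lower current -> thinner cables and lower I^2*R loss.
# Rack power, path resistance, and the 800 V reading of +/-400 VDC are assumptions.
rack_power_w = 100_000.0
path_resistance_ohm = 1e-3         # assumed resistance of the rack distribution path

for bus_voltage_v in (48, 800):    # 48 V rail vs. +/-400 VDC (800 V rail-to-rail)
    current_a = rack_power_w / bus_voltage_v        # I = P / V
    loss_w = current_a ** 2 * path_resistance_ohm   # P_loss = I^2 * R
    print(f"{bus_voltage_v:>4} V bus: {current_a:7.0f} A, {loss_w:8.1f} W conduction loss")
```

The order-of-magnitude drop in current is what frees up cable bulk and chassis space, at the cost of new insulation, PSU, and EMI considerations.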

Telecom and edge: Unique constraints demand new thinking

Unlike hyperscale datacenters, telecom and edge deployments face tighter space and power constraints, often requiring compliance with NEBS standards and operation in ambient temperatures exceeding 50°C, all within legacy 19” rack formats. Deploying AI inference or 800G switching in these environments necessitates innovative thermal and electrical strategies.

Co-packaged optics (CPO) illustrate these challenges. By integrating optics with high-bandwidth switch ASICs, CPO reduces latency and power per bit but increases thermal density and raises EMI and mechanical integrity issues. Managing heat, vibration, and long-term reliability demands thermal interface materials such as gap fillers, along with low-modulus adhesives that maintain mechanical integrity. Incorrect mechanical layering risks early-life failures.

This complex environment means thermal and power systems must be co-designed with compute hardware. Mechanical, electrical, and packaging teams can no longer operate in silos. Every watt saved and degree reduced translates into greater performance, uptime, and deployment flexibility.

As a Key Account Manager collaborating with system architects and hardware teams, I witness firsthand how integrated system design becomes the true differentiator. Success is not just about delivering powerful chips—it’s about enabling those chips to run at full capacity 24/7 in real-world conditions.

In today’s AI-driven landscape, compute power is abundant; the real scarcity lies in delivering and cooling that power efficiently and reliably at scale—especially in telecom and edge contexts. The bottleneck is not the chip but the infrastructure around it.

Resources

  • Phase change interface materials for next-gen data center ICs

    This case study looks at how a low-pressure, low thermal impedance, phase change thermal interface material provides a much-needed solution for next-gen data center ICs.

    10 min.

  • Durable TIM coating reduces heat and improves data center switch performance

    This case study looks at how a durable, thin thermal interface coating reduces heat and improves data center switch performance.

    10 min.

  • The heat is on

    Today, network performance, reliability, and durability are critical to datacom and telecom performance around the world. And when network performance is largely determined by power and cooling, the role of thermal management is only going to increase.

  • Look small to go big

    In today’s world of unprecedented network and infrastructure expansion, the need for increased performance and stability is accelerating. This rapid expansion is further challenged by the need to process more data at faster speeds while also accommodating emerging technology developments.

  • The 2023 data center pulse report

    With an insatiable demand for faster networking speeds and throughput performance within the data center, 800 Gigabit Ethernet (GbE) is gaining momentum as the next big trend in networking, providing the capacity to meet ever-growing customer demands.

  • The 2024 data center pulse report

    The influence of innovation and technology on the need to transition from 800G to 1.6T.
