Henkel Adhesive Technologies

800G/1.6T optics powering AI revolution

Explore how 800G and 1.6T optics are helping turn AI evolution into revolution by enhancing data center speed and efficiency. Discover how Henkel is driving innovation to meet the growing demands of AI applications.

Farida Jensen
Market Strategy Manager Growth

5 min.

With Artificial Intelligence (AI) hitting the headlines, from crystal-ball gazing about AI’s potential impact on how we work, to the latest ins and outs of the AI corporate world, it seems a good time to reflect on how we got here, and on what the future holds for AI.

In this blog, we leave the big questions to the experts and futurists, and focus instead on what AI is, how it developed, and the practical question of how datacenter technologies are innovating to meet the explosive growth in demand for AI capacity.

Specifically, it considers the role of 800G/1.6T optics in providing the speed and bandwidth that datacenter networks need to support AI because, no matter how sophisticated it gets, the AI revolution can go nowhere without the network to carry it.


What is AI, and how did it get here?

Anyone familiar with the movie ‘2001: A Space Odyssey’ will know that the concept of AI goes back at least 56 years, to the HAL 9000 computer, which did so much to spoil the astronauts’ day.

The writer, Arthur C. Clarke, claimed the acronym HAL didn’t come from shifting the letters ‘IBM’ back one place in the alphabet, as rumored, but stood for ‘Heuristically Programmed Algorithmic Computer’, which happens to be a pretty good starting point for defining an AI machine.

A traditional computer is programmed using fixed algorithms and instructions. AI sets itself apart by combining these with heuristics, or the ability to learn for itself and make decisions or create content based on what it has learned, thus mimicking human intelligence. 

The history of artificial intelligence goes back long before the fictional HAL 9000.

  • Automata, human-like dolls and machines that seemed to move of their own will, existed as long ago as the 17th century, as highlighted by a CNET article [1], ‘Before they were robots’.
  • In 1955, John McCarthy, a mathematics professor and pioneer in artificial intelligence, coined the term in his proposal for a workshop held at Dartmouth College the following summer, and it came into vogue through the late 1950s.
  • Fast forward to 1966, when Joseph Weizenbaum created the first chatbot, ELIZA, inspired by Eliza Doolittle in Pygmalion. ELIZA was a simulated psychotherapist which used natural language processing (NLP) to converse with humans.
  • In the 1980s, Japan’s Fifth Generation Computer Project (FGCP) set out to create a fifth generation of computers focused on inference-based processing and AI capabilities. While it is little discussed now, it represents an early recognition of the potentially transformational role of AI.
  • Chess grandmaster Garry Kasparov was famously and stunningly defeated by IBM’s Deep Blue computer in 1997.
  • A few years later, NASA’s Mars Rovers Spirit and Opportunity started navigating the red planet on their own. Today, there are two Mars Rovers, Curiosity and Perseverance, active on Mars. Both are equipped with the AEGIS AI system, using deep learning, computer vision and neural networks to investigate geological targets.
  • Apple Siri launched in 2011, with Amazon Alexa coming to market in 2014, and they have been helping us pick the best take-out on Friday night ever since.
     

These examples trace what was largely the evolutionary development of AI until November 30, 2022, when the launch of ChatGPT, and the first widely available generative AI capabilities, heralded what can be seen as the start of the AI revolution.

Whatever this revolution means for society, it is certain to have a big impact on the computing and datacenter industry, and innovations like 800G/1.6T optics are helping that industry rise to the challenge.

Dealing with data-hungry AI in the datacenter

Generative AI and deep learning training models are extremely data intensive. AI learns through an iterative process, repeatedly reviewing large datasets to build its understanding, which requires significant compute and storage resources. A 2023 datacentral.com article highlights that GPT-4 has 1.8 trillion parameters and a dataset that exceeds a petabyte for text-based generative AI, and this grows with each iteration as the model continues to learn.
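
To make that iterative process concrete, here is a minimal training-loop sketch in Python using PyTorch. It is purely illustrative: the tiny model, random dataset and hyperparameters are placeholders, not a description of any production generative AI workload; the point is simply that training means many full passes over the data.

    # Minimal, illustrative sketch of an iterative training loop (not a real AI workload).
    import torch
    from torch import nn
    from torch.utils.data import DataLoader, TensorDataset

    # Toy dataset standing in for the petabyte-scale corpora cited above.
    features = torch.randn(1024, 64)
    labels = torch.randint(0, 2, (1024,))
    loader = DataLoader(TensorDataset(features, labels), batch_size=32, shuffle=True)

    model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 2))
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    # Each epoch is one complete pass over the dataset; real training repeats this
    # many times, which is why so much data keeps moving through the network.
    for epoch in range(5):
        for batch_features, batch_labels in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(batch_features), batch_labels)
            loss.backward()
            optimizer.step()
        print(f"epoch {epoch}: last batch loss {loss.item():.4f}")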

The Graphics Processing Units (GPUs) in AI servers, especially in 800G clusters, are well equipped to handle computationally intensive AI training and inference tasks. The very reason companies like Nvidia are doing so well is that their GPUs and other products are crucial to advancing AI.


Why 800G/1.6T optics are key in the AI datacenter network

Generative AI training models and datasets pose significant challenges in terms of large-scale data processing and low-latency demands on AI datacenter networks. Articles from Juniper Networks [2] and Network World [3] detail how two major datacenter network providers are addressing these challenges, while a 2023 Keysight blog discusses the vital role of 1.6T networking for emerging technologies like AI.

To meet the growing bandwidth and speed demands of AI datacenter networking, both 800G Ethernet and 800G/1.6T optics are required, with 800G Ethernet currently in development. Various technology options exist, from pluggable form factors such as Linear-drive Pluggable Optics (LPO) transceivers to Co-Packaged Optics (CPO). While each of these options has its own pros and cons, they all play a part in the overall movement to 800G/1.6T.

Priyank Shukla of Synopsys, a member of the Ethernet Alliance, predicts that AI datacenter networks will be the first adopters, deploying 800G Ethernet by 2025 and 1.6T Ethernet by 2027. 800G optical transceivers have been released by many vendors, and 1.6T transceivers have also been demonstrated. A January 2024 Ethernet Alliance blog [4] discusses these and other developments in more depth.

To support AI datacenter networks, equipment providers are bringing 800G-capable 51.2 Tbps switches to market to provide AI server connectivity. These switches can support 32 x OSFP800 or QSFP-DD800 ports, enabling the use of 800G/1.6T optics. 800G/1.6T optical transceivers address the challenges of the AI datacenter by providing the increased bandwidth and speed needed to process data efficiently and reduce latency for AI applications.
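
To put those line rates in perspective, the short back-of-envelope calculation below (an illustrative sketch only, ignoring protocol overhead, parallelism across links, and real traffic patterns) estimates how long a single link at 400G, 800G and 1.6T would need to move a petabyte-scale dataset like the one cited earlier.

    # Rough, illustrative estimate of single-link transfer time for a ~1 PB dataset.
    # Real AI clusters stripe traffic across many links and add protocol overhead,
    # so these figures only show how line rate scales, not actual job times.
    DATASET_BYTES = 1e15  # roughly one petabyte

    for label, gbps in [("400G", 400), ("800G", 800), ("1.6T", 1600)]:
        seconds = (DATASET_BYTES * 8) / (gbps * 1e9)  # bits divided by bits per second
        print(f"{label}: {seconds / 3600:.1f} hours to move ~1 PB over a single link")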

How Henkel adds value in the AI datacenter revolution

As a market leader in materials technology and specialist in materials solutions for datacenter applications, Henkel has a key role to play in building 800G/1.6T capability for the AI datacenter.

For example, datacenters consume 1-2% of the world’s energy, and AI is an energy-intensive application. Designers and manufacturers of 800G/1.6T optical transceivers have to accelerate their efforts to design products with lower power consumption, and Henkel is working closely with leading optical transceiver manufacturers on lower-power module designs.

Pursuing 1.6T - data's new north star
Read the 2024 Pulse Report here to see more insights on the state of data center technology

Resources

  • Phase change interface materials for next-gen data center ICs

    This case study looks at how low-pressure, low thermal impedance, phase change thermal interface material provides a much-needed solution for next-gen data center ICs.

    10 min.

  • Durable TIM coating reduces heat and improves data center switch performance

    This case study looks at how durable, thin thermal interface coating reduces heat and improves data center switch performance.

    10 min.

  • The heat is on

    Today, network performance, reliability, and durability are critical to datacom and telecom performance around the world. And when network performance is largely determined by power and cooling, the role of thermal management is only going to increase.
  • Look small to go big

    In today’s world of unprecedented network and infrastructure expansion, the need for increased performance and stability is accelerating. This rapid expansion is further challenged by the need to process more data at faster speeds while also accommodating emerging technology developments.
  • The 2023 data center pulse report

    With an insatiable demand for faster networking speeds and throughput performance within the data center, 800 Gigabit Ethernet (GbE) is gaining momentum as the next big trend in networking to provide capacity to ever-growing customer demands.
  • The 2024 data center pulse report

    The influence of innovation and technology on the need to transition from 800G to 1.6T.
