AI workloads transforming data centre infrastructure

29 April 2025

The rapid expansion of artificial intelligence (AI) workloads is significantly reshaping data centre network infrastructure, according to a recent study commissioned by Ciena.

Data centre experts around the world forecast a substantial rise in interconnect bandwidth requirements over the next five years, a sign that the transformation will be felt by organisations everywhere.

The survey polled more than 1,300 data centre decision makers across 13 countries. It found that 53% of respondents expect AI workloads to place the most significant demands on data centre interconnect (DCI) infrastructure over the next two to three years, ahead of traditional drivers such as cloud computing (51%) and big data analytics (44%).

To meet AI's burgeoning requirements, respondents anticipate that 43% of new data centre facilities will be dedicated to AI workloads. Given the unprecedented data movement involved in AI model training and inference, experts predict a major increase in bandwidth needs: 87% of survey participants say they will require fibre optic capacity of 800Gbps or higher per wavelength for effective DCI.

"AI workloads are reshaping the entire data center landscape, from infrastructure builds to bandwidth demand," said Jürgen Hatheier, Chief Technology Officer, International at Ciena. "While network traffic has historically expanded at a rate of 20-30% per year, the rise of AI is set to accelerate this growth significantly, prompting operators to rethink their architectures in order to meet this demand sustainably.”

The survey also highlighted growing acceptance of pluggable optics as a way to manage bandwidth demands while easing power and space constraints: 98% of data centre experts identified pluggable optics as essential for reducing both the power consumption and the physical footprint of their network infrastructure.

As AI computing requirements continue to evolve, the training of Large Language Models (LLMs) is expected to become increasingly distributed. The study found that 81% of respondents believe LLM training will likely take place across distributed data centre facilities, necessitating interconnected DCI solutions.

When asked about the key factors shaping where AI inference is deployed, participants prioritised maximising utilisation of AI resources over time, reducing latency by placing inference compute closer to users at the edge, meeting data sovereignty requirements, and securing strategic locations for key customers. A majority of participants (67%) also said they would prefer to use Managed Optical Fiber Networks (MOFN) rather than deploy dark fibre, indicating a shift toward carrier-operated, high-capacity networks for long-haul data centre connectivity.

"The AI revolution transcends mere compute power—it fundamentally alters connectivity," said Hatheier. "To fully realize AI's potential, operators must ensure their DCI infrastructure is equipped to handle a future dominated by AI-driven traffic."