Transforming Data Centers: Meeting AI’s Demands for the Next Decade

March 14, 2025

As we look ahead to 2025 and beyond, it’s clear that data center construction is going through a major shift. Unlike the facilities of a decade ago, today’s data centers must be built for the complex demands of artificial intelligence (AI) workloads, a point David Ibarra brings to light in his recent writing. At the core of this change is the need to support AI tasks, which span both training and inference operations. These tasks depend heavily on graphics processing units (GPUs), which handle parallel computations far better than traditional central processing units (CPUs).

The arrival of AI solutions like ChatGPT in late 2022 has sped up this evolution, pushing tech companies to rethink their infrastructure strategies. AI training is power-hungry, requiring GPU arrays to work together on large datasets. This results in significant power consumption, ranging from 90 to 130 kW per rack, which calls for advanced cooling systems. Inference tasks, however, use less power, generally between 15 and 40 kW per rack. To give you some context, a single ChatGPT query uses about four times the energy of a standard Google search.
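
A quick back-of-envelope calculation makes the gap between training and inference racks concrete. The sketch below uses the approximate figures above (all of them estimates, not measured values) to compare typical per-rack power draw:

```python
# Back-of-envelope comparison of AI rack power densities,
# using the approximate ranges cited above (estimates, not measurements).

TRAINING_KW_PER_RACK = (90, 130)   # training racks: ~90-130 kW each
INFERENCE_KW_PER_RACK = (15, 40)   # inference racks: ~15-40 kW each

def midpoint(power_range):
    """Return the midpoint of a (low, high) range."""
    low, high = power_range
    return (low + high) / 2

training_kw = midpoint(TRAINING_KW_PER_RACK)    # 110 kW
inference_kw = midpoint(INFERENCE_KW_PER_RACK)  # 27.5 kW

print(f"Typical training rack:  ~{training_kw:.0f} kW")
print(f"Typical inference rack: ~{inference_kw:.0f} kW")
print(f"Training draws roughly {training_kw / inference_kw:.1f}x "
      f"more power per rack than inference")
```

At the range midpoints, a training rack draws roughly four times the power of an inference rack, which is why the two workload types lead to very different cooling and electrical designs.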

To keep up with these demands, modern data centers are becoming large-scale facilities: individual buildings now need around 100 MW of power, while entire campuses can consume up to 1 GW. This shift has driven the adoption of more efficient liquid cooling systems over traditional air-based methods. Future data centers must decide whether they will primarily handle training or inference tasks. Either way, the infrastructure must support high initial power requirements, exceeding 100 MW per building, with scalability up to 1 GW, and higher-voltage distribution systems are crucial to manage these loads and address the thermal limitations of power cables.
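
To see what those megawatt figures mean on the data hall floor, a rough capacity estimate helps. The sketch below divides a building’s power budget across the rack densities discussed earlier; the PUE (power usage effectiveness) value is an assumed illustrative figure, not one cited in the article:

```python
# Rough rack-capacity estimate for a 100 MW data center building.
# The PUE of 1.2 is an assumed illustrative value, not from the article.

BUILDING_MW = 100
PUE = 1.2  # assumed ratio of total facility power to IT power

it_power_kw = BUILDING_MW * 1000 / PUE  # kW left for IT load after overhead

for label, kw_per_rack in [("training", 110.0), ("inference", 27.5)]:
    racks = it_power_kw // kw_per_rack
    print(f"{label:>9}: ~{racks:,.0f} racks per {BUILDING_MW} MW building")
```

Under these assumptions, a 100 MW building hosts only about 750 high-density training racks, versus roughly 3,000 inference racks, and a 1 GW campus multiplies both figures by ten.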

Cooling systems must evolve to support complex IT environments that mix GPUs, CPUs, storage, and networking components, which calls for a hybrid approach combining air and liquid cooling. At the same time, growing fiber requirements are consuming facility space and adding structural weight. Data halls are changing too, with extra vertical space reserved for infrastructure layers such as busways, cable trays, and water piping. The race for efficiency also demands shorter design and construction cycles, with prefabrication used to improve both speed and safety.

Existing data centers face challenges adapting to AI, especially for inference workloads: retrofitting often means upgrading electrical systems and integrating liquid cooling, much like earlier data center transitions. Training facilities, by contrast, may require entirely new sites to meet their power and networking needs. And despite Nvidia’s advances in GPU cost and performance, overall power consumption continues to rise as usage grows, echoing the Jevons paradox: efficiency gains lower the cost of computation, which in turn drives even more demand. The AI industry’s growth parallels Moore’s Law, turning entire data centers into what amount to single, massive GPU systems. This rapid growth is reshaping energy markets, shifting them from steady growth to exponential increases in demand.

Industry adaptations include siting AI data centers in energy-rich remote areas, repurposing old power plants, and developing dedicated power sources. Partnerships between utilities and tech companies are crucial for investing in future power technologies, including nuclear. The industry’s expansion still faces hurdles such as manufacturing capacity limits and shortages of builders and skilled workers. Yet there is optimism, driven by AI’s potential to innovate and meet new demands. The evolution of data center infrastructure is vital to AI’s broader development, and it will take collaboration among tech firms, utilities, and construction experts to keep pace with this rapidly growing sector.
