
The Energy Challenge: Can AI Transform Efficiency Despite Growing Power Needs?

March 25, 2025

As artificial intelligence (AI) becomes more common in our daily lives, it raises big questions about energy use. Did you know that a single ChatGPT query uses about ten times the electricity of a typical Google search? This gap highlights a larger issue: the growing energy demands of digital technology, which already account for 2% to 4% of global carbon emissions, putting the sector on par with the aerospace industry.

Let’s take a closer look at data centers. Around 11,000 of them worldwide consumed as much energy in 2022 as France did—about 460 TWh. So, what does the rise of generative AI mean for these numbers? Manuel Cubero-Castan from EPFL argues that using generative AI efficiently requires understanding its full costs. He suggests we consider everything from mineral extraction to the disposal of electronic waste. This broader perspective uncovers environmental challenges that extend beyond just the energy use of data centers.

Currently, most information about digital technology’s energy consumption focuses on data centers. According to the International Energy Agency (IEA), these centers used between 240 TWh and 340 TWh in 2022, making up 1% to 1.3% of global consumption. Despite a 4% annual increase in the number of centers, improvements in energy efficiency kept overall power use steady from 2010 to 2020. However, things are changing with the widespread adoption of generative AI.

Generative AI relies on large language models (LLMs) and consumes energy during both training and operation. Historically, training was the bigger energy hog, but now the operational phase, especially responding to prompts, demands more power—accounting for 60% to 70% of energy use, according to Meta and Google data. If we compare ChatGPT to Google, the differences are stark: a ChatGPT query uses about 3 Wh, whereas a Google search requires just 0.3 Wh. If all Google searches suddenly switched to ChatGPT, it could mean an extra 10 TWh annually.
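The "extra 10 TWh" figure can be checked with back-of-the-envelope arithmetic. The sketch below uses the per-query figures from the article (3 Wh for ChatGPT, 0.3 Wh for a Google search); the assumed search volume of roughly 9 billion queries per day is an outside estimate, not a figure from the article.

```python
# Rough check of the "extra 10 TWh per year" claim.
# Per-query energy figures come from the article; the daily
# search volume is an assumption for illustration only.

WH_PER_CHATGPT_QUERY = 3.0    # Wh per ChatGPT query (article)
WH_PER_GOOGLE_SEARCH = 0.3    # Wh per Google search (article)
SEARCHES_PER_DAY = 9e9        # assumed daily Google search volume

extra_wh_per_query = WH_PER_CHATGPT_QUERY - WH_PER_GOOGLE_SEARCH
extra_wh_per_year = extra_wh_per_query * SEARCHES_PER_DAY * 365
extra_twh_per_year = extra_wh_per_year / 1e12  # 1 TWh = 1e12 Wh

print(f"Extra energy: ~{extra_twh_per_year:.1f} TWh/year")
```

Under these assumptions the result lands near 9 TWh per year, in the same ballpark as the article's rounded 10 TWh estimate.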

Goldman Sachs Research predicts a 160% increase in data center electricity use over the next five years, potentially doubling carbon emissions. By 2028, AI might make up around 19% of data centers’ energy consumption. However, this forecast could change with innovations from companies like DeepSeek, which has developed a less energy-intensive AI model.

Another challenge is the availability of resources for chip production. Nvidia, which dominates 95% of the AI chip market, saw its H100 chips consume 13.8 TWh of electricity in 2024, a figure that could rise to between 85 and 134 TWh by 2027. But can Nvidia keep up this level of production? There’s also the question of whether existing power grids can handle increased loads, as many are already strained. Data centers often cluster in specific regions, complicating distribution. For example, they account for 20% of Ireland’s and over 25% of Virginia’s power consumption. “Locating data centers where resources are scarce isn’t sustainable,” cautions Cubero-Castan.

There’s also the financial aspect: for Google to handle generative AI queries, it would need 400,000 more servers, costing around $100 billion—a scenario that would wipe out profits. While the rising power consumption of generative AI is concerning, potential AI benefits could offset this. AI might drive energy innovation, helping users optimize consumption, utilities manage grids more effectively, and engineers advance in modeling and research. Realizing these benefits depends on consumer adoption and regulatory frameworks.

Future data centers are being designed for greater energy efficiency and capacity flexibility. Nvidia is working on enhancing chip performance to reduce power needs, and quantum computing holds promise. Currently, 40% of data center energy goes to cooling, another 40% to running servers, and 20% to other components. Initiatives like EPFL’s Heating Bits explore new cooling methods and renewable integration.

Clearing data clutter offers another energy-saving route. Most of the 1.3 trillion gigabytes generated daily becomes unused “dark data,” contributing significantly to carbon emissions. Loughborough Business School suggests 60% of current data is dark, equating to the emissions of three million London–New York flights. “While generative AI could offer energy gains, without reducing usage and improving infrastructure efficiency, overall emissions will remain high,” warns Cubero-Castan.

Though AI’s energy impact is currently small globally, it’s adding to the substantial power needs of digital technology. Video streaming, online gaming, and cryptocurrency are major data traffic drivers. Meanwhile, economic growth, electric vehicles, and manufacturing remain primary power demand factors, with fossil fuels still the dominant energy source.
