Thursday, July 17, 2025

Future AI Energy Costs are Hard to Predict

Much has been made of a Goldman Sachs analysis of the impact of artificial intelligence on data center power demand, which leans on the oft-quoted claim that an AI query requires 10 times the electricity of a conventional search query.


The widely cited claim that AI queries require roughly 10 times the electricity of traditional search queries rests primarily on estimates from the International Energy Agency (IEA) and other sources suggesting that a single ChatGPT query consumes about 2.9 watt-hours (Wh), compared with about 0.3 Wh for a Google search.
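It is worth noting that the headline multiple is simply the quotient of those two point estimates. A minimal back-of-envelope check, treating the cited figures as assumptions rather than measurements:

```python
# Back-of-envelope check of where the "10x" figure comes from. The two
# per-query numbers are the estimates cited above, not measurements of any
# particular deployment.

CHATGPT_WH_PER_QUERY = 2.9  # estimated Wh per ChatGPT query
SEARCH_WH_PER_QUERY = 0.3   # estimated Wh per Google search

ratio = CHATGPT_WH_PER_QUERY / SEARCH_WH_PER_QUERY
print(f"AI query vs. search query: ~{ratio:.1f}x")  # ~9.7x, rounded up to "10x"

# The ratio is sensitive to both inputs: halve the AI figure, or double the
# search figure, and the headline multiple drops to roughly 5x.
```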


But model efficiency, hardware improvements, workload variability, and the specific nature of queries can all significantly affect energy use, which suggests the 10x figure does not hold universally.


Hardware and software efficiency is improving. For example, the efficiency of AI-related computer chips roughly doubles every 2.5 to 3 years.
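To see what that doubling cadence would imply for per-query energy, here is a rough projection sketch; the 2.9 Wh starting point, the five-year horizon, and the assumption that per-query energy falls one-for-one with chip efficiency are illustrative assumptions, not figures taken from the sources above.

```python
# Rough projection of per-query energy for a fixed AI workload, assuming chip
# efficiency doubles every 2.5-3 years and per-query energy falls in step.
# The 2.9 Wh starting point and the fixed-workload assumption are illustrative.

def projected_wh(start_wh: float, years: float, doubling_period_years: float) -> float:
    """Per-query energy after `years`, if efficiency doubles every doubling period."""
    doublings = years / doubling_period_years
    return start_wh / (2 ** doublings)

for period in (2.5, 3.0):
    wh = projected_wh(2.9, years=5, doubling_period_years=period)
    print(f"Doubling every {period} years -> ~{wh:.2f} Wh per query after 5 years")
# Roughly 0.7-0.9 Wh per query, i.e., a 3-4x reduction from efficiency gains alone.
```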


Software optimizations, such as MIT's Clover tool, can reduce carbon intensity by 80 percent to 90 percent by shifting workloads to off-peak times or by using lower-quality models for less critical tasks.
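The mechanism behind savings of that kind is easy to sketch in the abstract. The snippet below is not the Clover tool itself and does not use its API; it is a hypothetical illustration of carbon-aware scheduling, where the threshold, model choices, and job names are placeholders.

```python
# Hypothetical sketch of carbon-aware scheduling, in the spirit of tools like
# Clover. This is NOT Clover's actual interface; the threshold, model names,
# and carbon-intensity figures are placeholders for illustration.

from dataclasses import dataclass

@dataclass
class Job:
    name: str
    critical: bool  # critical jobs always run on the full-quality model

HIGH_CARBON_THRESHOLD = 400  # gCO2/kWh, illustrative cutoff

def choose_plan(job: Job, grid_carbon_gco2_per_kwh: float) -> str:
    """Decide how to run a job given the current grid carbon intensity."""
    if grid_carbon_gco2_per_kwh < HIGH_CARBON_THRESHOLD or job.critical:
        return "run now on full-quality model"
    # Non-critical work during a high-carbon period: degrade or defer.
    return "run on smaller, lower-energy model (or defer to off-peak)"

print(choose_plan(Job("batch summarization", critical=False), grid_carbon_gco2_per_kwh=520))
print(choose_plan(Job("production inference", critical=True), grid_carbon_gco2_per_kwh=520))
```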


Also, AI queries vary widely in computational intensity, with some tasks (e.g., text generation) being far less energy-intensive than others (e.g., video generation).


So do search queries. Complex searches involving real-time data or multimedia can approach the energy use of simpler AI queries. And some estimates suggest that even if every Google search were replaced with a large language model (LLM)-powered search, the global increase in energy demand would be modest: an additional 10–29 TWh annually, compared with roughly 460 TWh consumed by all data centers in 2022.
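For perspective, a crude version of that scaling calculation might look like the sketch below. The query volume (roughly 8.5 billion Google searches per day) and the per-query figures are assumptions for illustration; the cited analysis uses its own, higher per-query estimates, which is why its published range (10–29 TWh) sits above this result.

```python
# Rough, illustrative estimate of the extra electricity if every Google search
# ran through an LLM. The inputs (8.5 billion searches/day, 2.9 Wh vs. 0.3 Wh
# per query) are assumptions; the cited analyses use higher per-query figures,
# which is why they land at 10-29 TWh.

SEARCHES_PER_DAY = 8.5e9  # assumed daily Google search volume
LLM_WH = 2.9              # assumed Wh per LLM-powered query
SEARCH_WH = 0.3           # assumed Wh per conventional search

extra_wh_per_year = SEARCHES_PER_DAY * 365 * (LLM_WH - SEARCH_WH)
extra_twh = extra_wh_per_year / 1e12      # 1 TWh = 1e12 Wh
share_of_datacenters = extra_twh / 460    # vs. ~460 TWh for all data centers (2022)

print(f"~{extra_twh:.0f} TWh/year extra, about {share_of_datacenters:.1%} of 2022 data center use")
```

Even under these rough assumptions, the increase is a single-digit percentage of existing data center consumption, consistent with the "modest" characterization above.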


Some studies suggest AI energy demands actually are minimal. 


Table of Studies Challenging the 10x Claim

| Study/Source | Publication Date | Key Findings | How It Challenges the 10x Claim |
| --- | --- | --- | --- |
| IEA: Energy and AI | April 9, 2025 | Data centers account for a small share of global electricity demand growth; efficiency of AI chips has doubled every 2.5–3 years, reducing per-query energy use. | Suggests that efficiency gains and task variability (e.g., text vs. video) mean not all AI queries are 10x more energy-intensive. |
| MIT Sloan: AI Data Center Energy Costs | January 6, 2025 | Software tools like Clover reduce carbon intensity by 80–90%; simple steps can cut 10–20% of data center energy demand. | Demonstrates that optimized AI workloads can significantly lower energy use, narrowing the gap with search queries. |
| University of Wisconsin: The Hidden Cost of AI | August 20, 2024 | Microsoft improved chatbot servers to use 10x less energy; efficient generalization techniques reduce energy needs. | Shows that specific AI implementations can have much lower energy use, challenging the blanket 10x estimate. |
| Sustainability by Numbers: Impact of AI on Energy Demand | November 17, 2024 | If all Google searches used LLMs, energy demand would increase by 10–29 TWh, a modest fraction of total data center use. | Indicates that per-query energy differences are less dramatic when scaled, due to efficiency and workload factors. |
| Harding & Moreno-Cruz: Watts and Bots | 2024 | AI's energy demand increase, including economic spillovers, is very small. | Suggests that AI's per-query energy impact is not as significant as claimed, due to broader efficiency trends. |
| Masanet et al.: Recalibrating Data Center Energy Use | 2020 | Data center energy use is lower than projected due to efficiency improvements. | Implies that AI's contribution to energy demand is moderated, reducing the per-query energy gap. |
| Nature: AI Energy Demands | March 4, 2025 | Lack of transparency leads to simplistic extrapolations; calls for more precise data on AI energy use. | Questions the reliability of the 10x claim due to insufficient granular data. |
| Breakthrough Institute: Unmasking AI Energy Demand | July 9, 2024 | Energy intensity per computation has decreased 20% annually since 2010; AI's energy impact may be overstated. | Argues that efficiency gains significantly reduce AI's per-query energy use, challenging the 10x figure. |


The point is that the energy consumption of AI queries depends heavily on the model, hardware, and task complexity, and the same applies to search queries, which can vary in intensity.

