Wednesday, January 21, 2026

Energy is an Issue for AI, but Not Existential

High-performance computing remains an energy-intensive business, but we must remain alert to non-linear change, as suppliers will have huge incentives to reduce power consumption.


A common mistake is to assume that “if AI usage grows 10×, electricity consumption must grow 10×.” And, to be sure, consumption will grow as usage grows.

 

source: NextEra

But energy requirements have never grown linearly with demand. Instead, computing has followed a consistent pattern:

  • Demand grows exponentially

  • Energy per unit falls faster than demand rises

  • Total energy grows—but sub-linearly.
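To make the sub-linear point concrete, here is a toy calculation. The inputs are assumptions for illustration, not forecasts: compute demand doubling every year, and energy per unit of compute halving every 19 months (roughly the historical efficiency trend).

```python
# Illustrative sketch with assumed numbers (not a forecast): demand
# doubles each year; energy per unit of compute halves every 19 months.
years = 10
demand_growth_per_year = 2.0                  # assumed: 2x demand per year
energy_halving_months = 19                    # assumed historical halving period
efficiency_per_year = 0.5 ** (12 / energy_halving_months)

demand = 1.0            # relative units
energy_per_unit = 1.0   # relative units
for _ in range(years):
    demand *= demand_growth_per_year
    energy_per_unit *= efficiency_per_year

total_energy = demand * energy_per_unit
print(f"After {years} years: demand ~{demand:.0f}x, "
      f"total energy ~{total_energy:.1f}x")
```

Under these assumptions, demand grows roughly 1,000× over a decade while total energy grows only about 13×: exponential demand, sub-linear energy.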


Era                        Energy per computation
Mainframes                 Extremely high
Minicomputers              ↓ ~10×
PCs                        ↓ ~100×
Mobile SoCs                ↓ ~1,000×
Specialized accelerators   ↓ another 10–100×

Domain-specific silicon, the tuning of inference operations to the level of processing actually required, gains in model efficiency, and the shift toward less-energy-intensive inference will all play a part.


When any key input, such as power, becomes the bottleneck, innovation follows. 


It is also worth noting that every new computing wave has produced fears that energy supply would not keep up. Some argued, for example:


Between 1966 and 1972, the U.S. Department of Commerce warned that centralized computing facilities could require “significant regional generation capacity.” IBM internal studies from the late 1960s likewise examined power-density limits in computer rooms. Neither fear proved out.


Instead, computing power consumption per operation has been cut in half about every 19 months, a trend known as Koomey's law.
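That 19-month halving compounds quickly. A quick back-of-the-envelope calculation:

```python
# How much does a halving every 19 months compound over a decade?
months_per_decade = 10 * 12
halvings = months_per_decade / 19    # ~6.3 halvings per decade
improvement = 2 ** halvings          # energy per operation shrinks ~80x
print(f"~{improvement:.0f}x less energy per operation after one decade")
```

At that rate, each decade delivers roughly an 80× reduction in energy per operation.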


Sure, AI operations will consume more power. But that growth will be neither linear nor unmanageable.


