High-performance computing remains an energy-intensive business, but we should stay alert to non-linear change, because suppliers have huge incentives to reduce power consumption.
A common mistake is to assume that “if AI usage grows 10×, electricity consumption must grow 10×.” And, to be sure, consumption will grow as usage grows.
But energy requirements have never grown linearly with demand. Instead, computing shows a clear pattern:
Demand grows exponentially
Energy per unit falls steeply, offsetting most of that growth
Total energy grows, but sub-linearly (a simple sketch of the arithmetic follows)
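A minimal numeric sketch makes that arithmetic concrete. The growth rates below are assumptions chosen purely for illustration, not measured industry figures:

```python
# Illustrative sketch only: the rates below are assumptions chosen to show
# the shape of the curve, not measured industry figures.

years = 10
demand_growth = 1.40      # operations demanded grow 40% per year (assumed)
efficiency_factor = 0.75  # energy per operation falls 25% per year (assumed)

demand = 1.0              # normalized operations per year
energy_per_op = 1.0       # normalized energy per operation

for year in range(1, years + 1):
    demand *= demand_growth
    energy_per_op *= efficiency_factor
    total_energy = demand * energy_per_op
    print(f"year {year:2d}: demand {demand:5.1f}x, "
          f"energy/op {energy_per_op:5.3f}x, total energy {total_energy:4.2f}x")
```

With those assumed rates, demand rises roughly 29-fold over the decade while total energy grows only about 1.6-fold; different rates change the numbers, but not the shape of the curve.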
Domain-specific silicon, tuning inference to the level of processing each task actually requires, model efficiency gains and the shift toward less energy-intensive inference operations will all play a part.
When any key input, such as power, becomes the bottleneck, innovation follows.
And we might also note that every new computing wave has produced fears that energy supply would not keep up, as some argued:
Mainframes would overwhelm power grids
PCs would consume too much electricity
The internet would break energy systems
Data centers would cause permanent shortages
Crypto would absorb global power
Between 1966 and 1972, the U.S. Department of Commerce warned that centralized computing facilities could require “significant regional generation capacity.” IBM internal studies from the late 1960s likewise examined power density limits in computer rooms. Those fears did not materialize.
Instead, the energy consumed per computing operation has been cut in half about every 19 months.
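To see what that halving rate implies, here is a quick back-of-the-envelope calculation; the ten-year window is an arbitrary choice for illustration:

```python
# Back-of-the-envelope: if energy per operation halves roughly every 19
# months, how much does efficiency improve over a decade?

months = 10 * 12          # one decade, in months (arbitrary illustrative window)
halving_period = 19       # months per halving, per the trend cited above

halvings = months / halving_period
improvement = 2 ** halvings   # factor by which energy per operation shrinks

print(f"{halvings:.1f} halvings -> about {improvement:.0f}x less energy "
      f"per operation after ten years")
```

That compounding, roughly an 80-fold reduction in energy per operation per decade, is why total consumption has historically grown far more slowly than computing demand.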
Sure, AI operations will consume more power. But that growth will be neither linear nor unmanageable.