It seems clear that the amount of AI processing is going to grow substantially, assuming AI proves to be as useful as most of us now believe. At the high end of estimates, some believe AI operations will consume as much as 80 percent of data center power loads by about 2040.
Other estimates are also significant. DigitalBridge CEO Marc Ganzi believes AI will eventually create a new or additional market about the size of the entire public cloud computing market.
If the public cloud now represents about 13 gigawatts of capacity, AI might eventually require 38 gigawatts, Ganzi says.
The whole global data center installed base might represent something on the order of 700 gigawatts, according to IDC, while estimates by the Uptime Institute suggest capacity is on the order of 180 GW.
According to a report by Synergy Research Group, the global public cloud computing industry now represents 66.8 gigawatts (GW) of capacity.
According to a study by the Lawrence Berkeley National Laboratory, AI-driven data center electricity consumption could increase by 50 percent to 200 percent by 2040, posing new challenges for data center operators trying to limit carbon emissions and electricity consumption.
Of course, data center operators will also continue to seek ways to reduce that impact.
But there seems little doubt that AI model training and inference will become a much bigger part of data center compute activity, and therefore of energy load. In part, that is because bigger models require more data ingestion during the training process.