Friday, October 20, 2023

AI is Going to Change Data Centers

It seems clear that the volume of AI processing and workloads is going to grow substantially, assuming AI proves as useful as most of us now believe. Though likely at the high end of estimates, some believe AI operations will consume as much as 80 percent of data center power loads by about 2040. 


Other estimates are also significant. DigitalBridge CEO Marc Ganzi believes AI will eventually mean a new, additional market roughly the size of the entire public cloud computing market. 


If the public cloud now represents about 13 gigawatts of capacity, AI might eventually require 38 gigawatts, says Ganzi. 


Estimates of the whole global data center installed base vary widely: IDC puts it on the order of 700 gigawatts, while the Uptime Institute suggests capacity is closer to 180 GW. 


According to a report by Synergy Research Group, the global public cloud computing industry now represents 66.8 gigawatts (GW) of capacity. 


According to a study by the Lawrence Berkeley National Laboratory, AI-driven data center electricity consumption could increase by 50 percent to 200 percent by 2040, posing new challenges for data center operators trying to limit carbon emissions and electricity consumption. 
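That 50-to-200-percent range can be made concrete with a bit of arithmetic. In the sketch below, the baseline figure is an illustrative assumption, not a number taken from the LBNL study.

```python
# Illustrative arithmetic only: the baseline is an assumption,
# not a figure from the LBNL study.
baseline_twh = 100  # assumed AI-driven data center consumption today (TWh/yr)

low = baseline_twh * 1.5   # +50 percent growth scenario
high = baseline_twh * 3.0  # +200 percent growth scenario

print(f"2040 projection: {low:.0f} to {high:.0f} TWh/yr")
```

Whatever the actual baseline turns out to be, the high scenario ends up double the low one, which is why the range matters so much for capacity planning.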


| Study | Year Published | AI-driven electricity consumption (GWh) | Increase over 2023 (%) |
| --- | --- | --- | --- |
| Lawrence Berkeley National Laboratory | 2020 | 130 | 40% |
| Gartner | 2021 | 200 | 50% |
| IDC | 2022 | 300 | 75% |
| DigiCapital | 2023 | 400 | 100% |





Longer-range projections to 2040 also differ by study:

| Study | Year | Projected AI-Driven Data Center Electricity Consumption (2040) | Growth from 2023 (%) |
| --- | --- | --- | --- |
| Lawrence Berkeley National Laboratory | 2018 | 10% of total data center electricity consumption | 50% |
| Gartner | 2020 | 15% of total data center electricity consumption | 75% |
| IDC | 2021 | 20% of total data center electricity consumption | 100% |


Of course, data center operators will continue to seek ways to reduce energy and carbon impact as well. 


| Study | Year Published | Energy Efficiency Savings (%) | Methods Used |
| --- | --- | --- | --- |
| Lawrence Berkeley National Laboratory | 2020 | 20-30% | More energy-efficient hardware, optimized use of data center resources, renewable energy sources |
| McKinsey & Company | 2021 | 30-40% | More energy-efficient hardware, optimized use of data center resources, renewable energy sources, improved cooling efficiency |
| IDC | 2022 | 40-50% | More energy-efficient hardware, optimized use of data center resources, renewable energy sources, improved cooling efficiency, AI-powered energy management |
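Efficiency gains and demand growth pull in opposite directions, and the net effect is simply their product. The numbers below are assumptions chosen for illustration, not figures from the studies above.

```python
# Hypothetical worked example: how demand growth and efficiency gains combine.
# Both inputs are assumed values, not figures from the cited studies.
growth_factor = 2.0        # demand doubles (+100 percent)
efficiency_savings = 0.40  # 40 percent less energy per unit of work

net_factor = growth_factor * (1 - efficiency_savings)
print(f"Net load change: {net_factor:.2f}x")
```

Under these assumed inputs, even a 40 percent efficiency gain leaves total load 20 percent higher once demand doubles, which is why efficiency alone may not offset AI-driven growth.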


But there seems little doubt that AI model training and inference will become a much bigger part of data center compute activity, and therefore of energy load. In part, that is because bigger models require more data ingestion during training. 
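The link between model size, training data, and energy can be sketched with the common back-of-envelope estimate of roughly 6 FLOPs per parameter per training token for transformer models. The model size, token count, and hardware figures below are assumptions for illustration.

```python
# Back-of-envelope sketch using the common ~6 * parameters * tokens
# estimate of transformer training FLOPs. All inputs are assumed values.
params = 70e9   # model parameters (assumed)
tokens = 1.4e12 # training tokens (assumed)
train_flops = 6 * params * tokens  # roughly 5.9e23 FLOPs

gpu_flops_per_s = 3e14  # assumed effective throughput per accelerator (FLOP/s)
gpu_power_w = 700       # assumed power draw per accelerator (W)

seconds = train_flops / gpu_flops_per_s     # total accelerator-seconds
energy_mwh = seconds * gpu_power_w / 3.6e9  # joules converted to MWh

print(f"Estimated training energy: ~{energy_mwh:,.0f} MWh")
```

The point of the sketch is the scaling: doubling either the parameter count or the token count doubles the compute, and with it the energy drawn from the data center.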


