Thursday, December 26, 2024

Energy Consumption Does Not Scale with Workloads as Much as You Think

Most observers will agree that data center energy efficiency (and its carbon and other emissions footprint) is an important issue, if for no other reason than compliance with government regulations. And with cloud computing and data center compute cycles trending higher (more and larger data centers, additional artificial intelligence workloads, more cloud computing, more content delivery), energy consumption and its supply will remain important issues. 


source: Goldman Sachs 


Perhaps the good news is that energy consumption does not scale linearly with the increase in compute cycles, storage or heat dissipation, though some might argue that data center energy consumption estimates are too low.  


From 2010 to 2018, data center computing cycles increased dramatically:

  • Data center workloads increased more than sixfold (a more than 500 percent increase).

  • Internet traffic increased tenfold (a 900 percent increase).

  • Storage capacity rose by 25 times (a 2,400 percent increase).


But data center energy consumption grew by only about six percent over the same period. 
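
To make that non-linearity concrete, here is a minimal back-of-envelope sketch in Python, using only the growth multiples cited above (the figures, not the code, come from the source):

# Back-of-envelope: how far did energy use per unit of work fall, 2010 to 2018?
workload_growth = 6.0    # workloads grew more than sixfold
energy_growth = 1.06     # energy consumption grew only about six percent

energy_per_workload = energy_growth / workload_growth
print(f"Energy per unit of workload, 2018 vs. 2010: {energy_per_workload:.2f}x "
      f"(about a {1 - energy_per_workload:.0%} reduction)")
# -> roughly 0.18x, an approximately 82 percent drop in energy per unit of work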


All of that might be worth keeping in mind, as data center computing operations seem destined to keep increasing in volume. And though efficiency gains will continue, it will be difficult for them to fully offset the impact of that growing compute volume. 


It is also worth noting that computing workloads happen on end user devices of all types, including AI inference operations on smartphones. 


Assume that in 2020 the information and communication technology sector as a whole, including data centers, networks and user devices, consumed about 915 TWh of electricity, or four percent to six percent of all electricity used in the world. If data centers specifically consumed less than two percent of world electricity (perhaps one percent to 1.8 percent), then the other parts of the ecosystem (networks, cell towers and so forth, as well as end user devices and the software running mostly "at the edge") might have consumed roughly two percent to four percent of world electricity. 
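
That allocation is easy to sanity-check. The sketch below simply re-runs the shares cited above; the implied world electricity totals are a byproduct of those shares, not independent figures:

# Back-of-envelope split of 2020 ICT electricity use, from the shares cited above.
ict_twh = 915.0                      # total ICT electricity use, TWh
ict_share_of_world = (0.04, 0.06)    # ICT as 4 to 6 percent of world electricity
dc_share_of_world = (0.01, 0.018)    # data centers as roughly 1 to 1.8 percent

for ict_share, dc_share in zip(ict_share_of_world, dc_share_of_world):
    world_twh = ict_twh / ict_share      # world total implied by the ICT share
    dc_twh = world_twh * dc_share        # implied data center consumption
    rest_twh = ict_twh - dc_twh          # networks, towers, end user devices
    print(f"ICT at {ict_share:.0%} of world: data centers ~{dc_twh:,.0f} TWh, "
          f"rest of ICT ~{rest_twh:,.0f} TWh ({rest_twh / world_twh:.1%} of world)")
# -> the non-data-center remainder lands at roughly 3 to 4 percent of world electricity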


Still, many end user devices, especially smartphones, actually consume very little energy, even assuming inference operations are added to the processing load. Charging a phone once a day uses about 0.035 kilowatt-hours (kWh) of electricity per week, roughly 0.15 kWh per month and about 1.83 kWh per year. 


In the United States, that works out to an energy cost of 40 cents or less per phone per year. That is almost too small an amount to measure. 

source: EnergySage 
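
For reference, the per-phone arithmetic is a few lines; the $0.13 per kWh price is an assumed average, and the weekly figure is the one cited above:

# Per-phone smartphone charging energy and cost, from the weekly figure above.
kwh_per_week = 0.035     # charging once a day, roughly 0.035 kWh per week
price_per_kwh = 0.13     # assumed average U.S. electricity price, $/kWh

kwh_per_year = kwh_per_week * 52
kwh_per_month = kwh_per_year / 12
cost_per_year = kwh_per_year * price_per_kwh

print(f"Per phone: ~{kwh_per_month:.2f} kWh/month, ~{kwh_per_year:.2f} kWh/year, "
      f"~${cost_per_year:.2f} per year")
# -> roughly 0.15 kWh/month, 1.82 kWh/year, about 24 cents per year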


Scaled across the whole United States installed base, at an average electricity price of $0.13 per kWh, smartphone charging translates to approximately 1.25 billion kWh, or 1.25 TWh, of electricity consumed annually.


The implication is that even routine AI inference operations on smartphones are not going to be a meaningful source of energy consumption. 


For example, assume 250 million smartphone users in the United States and an average annual charging cost of $0.65 per phone (a more generous estimate than the EnergySage figure, implying about 5 kWh per phone per year at $0.13 per kWh). Then 250 million users times $0.65 per year implies about $162.5 million in electricity costs annually. 


That 1.25 TWh is less than 0.1 percent of total United States electricity consumption in a year.
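
Put together, the national arithmetic fits on one screen; the roughly 4,000 TWh figure for total U.S. electricity consumption is an assumption added for scale, not a figure from the text:

# National aggregate of U.S. smartphone charging, from the figures cited above.
users = 250e6                # U.S. smartphone users
cost_per_phone_year = 0.65   # assumed annual charging cost per phone, $
price_per_kwh = 0.13         # assumed average electricity price, $/kWh
us_total_twh = 4000.0        # assumed total annual U.S. electricity consumption, TWh

total_cost = users * cost_per_phone_year      # ~$162.5 million per year
total_twh = total_cost / price_per_kwh / 1e9  # ~1.25 TWh per year
share_of_us = total_twh / us_total_twh

print(f"Charging cost ~${total_cost / 1e6:.1f}M/yr, energy ~{total_twh:.2f} TWh/yr, "
      f"{share_of_us:.3%} of U.S. electricity use")
# -> about $162.5M, about 1.25 TWh, roughly 0.03 percent of U.S. consumption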


Perhaps the point is that the AI inference operations we can run on smartphones (probably centered on personalization, photo processing and voice interfaces) are a wise choice from an energy standpoint.

