
Sunday, April 28, 2024

More Computation, Not Data Center Energy Consumption, Is the Real Issue

Many observers raise concerns about the power consumption of data centers in the era of artificial intelligence. 


According to a study by the Lawrence Berkeley National Laboratory, AI-driven data center electricity consumption could increase by 50 percent to 200 percent by 2040, posing new challenges for data center operators trying to limit carbon emissions and electricity consumption. 


| Study | Year Published | AI-Driven Electricity Consumption | Increase over 2023 (%) |
|---|---|---|---|
| Lawrence Berkeley National Laboratory | 2020 | 130 GWh | 40% |
| Gartner | 2021 | 200 GWh | 50% |
| IDC | 2022 | 300 GWh | 75% |
| DigiCapital | 2023 | 400 GWh | 100% |
| Lawrence Berkeley National Laboratory | 2018 | 10% of total data center electricity consumption | 50% |
| Gartner | 2020 | 15% of total data center electricity consumption | 75% |
| IDC | 2021 | 20% of total data center electricity consumption | 100% |

Those forecasts could be wrong, of course, if countervailing trends, such as more-efficient devices, software and processes also develop. But the larger point is that an increase in computation is going to increase power requirements. 


On the other hand, it is not so clear that data center energy consumption--though easy to measure, because it is concentrated--is actually worse than the energy cost of conducting all that computation locally, in a dispersed way that is harder to estimate. 


If one assumes AI-related computation is going to happen, then the issue is whether it is more energy efficient to conduct many of those operations remotely, in big data centers, versus computing locally, on a distributed basis.


And there the issue is more complicated. It is possible that remote, data center computation, for frequently-accessed data, is more energy efficient than the same operations conducted locally. 


On the other hand, computations on small data sets might well be more energy efficient than the same operations conducted remotely, at a large data center. 
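The trade-off can be made concrete with a back-of-envelope model: a data center may execute the same operation with less compute energy (better hardware and utilization), but adds facility overhead (power usage effectiveness, or PUE) plus the energy of moving data over the network. Every number below is an illustrative assumption, not a measurement; per-gigabyte network energy estimates in particular vary widely across studies.

```python
# Back-of-envelope comparison: run a task locally vs. in a data center.
# All figures are illustrative assumptions, not measurements.

def local_energy_j(compute_j: float) -> float:
    """Energy (joules) to run the task on a local device."""
    return compute_j

def remote_energy_j(compute_j: float, pue: float,
                    data_gb: float, network_j_per_gb: float) -> float:
    """Energy to run the task in a data center: compute energy scaled by
    the facility's PUE, plus the energy to move the data over the network."""
    return compute_j * pue + data_gb * network_j_per_gb

# Assumptions: the data center does the same work for 5x less compute
# energy; PUE of 1.2; ~0.1 kWh/GB (360,000 J/GB) for network transfer,
# a widely quoted but contested planning figure.
local_j = local_energy_j(compute_j=1000.0)
remote_small = remote_energy_j(200.0, 1.2, data_gb=0.001,
                               network_j_per_gb=360_000)
remote_large = remote_energy_j(200.0, 1.2, data_gb=0.05,
                               network_j_per_gb=360_000)

print(local_j, remote_small, remote_large)  # 1000.0 600.0 18240.0
```

Under these assumed numbers, a task that moves only a megabyte favors the data center, while moving 50 MB flips the comparison--consistent with the point above that small, frequently local datasets may be cheaper to process in place.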


| Study Title | Authors/Publisher | Year | Key Findings |
|---|---|---|---|
| The Energy Consumption of Cloud Storage: Exploring the Trade-Offs | Zhiwei Xu et al. | 2018 | Cloud storage can be more energy-efficient than local storage, especially for frequently accessed data. |
| The Power of Servers: A Hidden Environmental Cost of Cloud Computing | Elliot et al. | 2014 | Highlights the significant energy consumption of data centers but acknowledges potential efficiency gains compared to widespread local storage. |
| A Survey on Modeling Energy Consumption of Cloud Applications: Deconstruction, State of the Art, and Trade-Off Debates | D. Kliazovich et al. | 2013 | Emphasizes the importance of considering network energy consumption when comparing local vs. remote storage. |
| How Green is the Cloud? A Comparison of the Environmental Footprint of Cloud Computing and On-Premises Solutions | M. A. van den Belt et al. | 2013 | Concludes that cloud storage can be more environmentally friendly for large datasets due to economies of scale and potential for renewable energy use in data centers. |
| Energy Consumption of Cloud Storage: The Importance of Power Management | Zhiwei Cao et al. | 2011 | Concludes that cloud storage can be more energy-efficient than local storage, especially for large datasets. |
| A Survey on Modeling Energy Consumption of Cloud Applications: Deconstruction, State of the Art, and Trade-Off Debates | George Kousiouris et al. | 2018 | Highlights the importance of network energy consumption when considering cloud storage. Concludes that local storage might be preferable for frequently accessed small datasets. |
| The Energy Efficiency of Cloud Storage Compared to Local Storage | Aapo Ristola et al. | 2017 | Finds that cloud storage can be more energy-efficient for most use cases, especially with increasing data volume. |

The point is that although we often think "big data centers" are the "energy or carbon" problem, the real issue is the increasing amount of computation we now conduct; the data centers themselves are not necessarily the problem.


Data center energy consumption is hard to miss because it is highly concentrated. The other energy consumers that actually drive data center demand are highly distributed and hard to measure, yet most would agree that this distributed demand is what creates the need for data center computation, storage and data delivery. 


| Device Category | Consumer TWh | Business TWh | Total TWh | Source |
|---|---|---|---|---|
| Laptops & Desktops | 1,200 | 400 | 1,600 | The Shift Project: https://theshiftproject.org/en/home/ (2019) |
| Smartphones & Tablets | 800 | 100 | 900 | International Energy Agency (IEA): https://www.iea.org/reports/energy-efficiency-2023 (2023) |
| Servers (excluding data centers) | - | 200 | 200 | The Shift Project: https://theshiftproject.org/en/home/ (2019) |
| Network Equipment | 200 | 100 | 300 | The Shift Project: https://theshiftproject.org/en/home/ (2019) |
| TVs & Streaming Devices | 600 | 100 | 700 | IEA: https://www.iea.org/reports/energy-efficiency-2023 (2023) |
| Gaming Consoles | 200 | 50 | 250 | The Shift Project: https://theshiftproject.org/en/home/ (2019) |
| Other Devices (printers, wearables, etc.) | 100 | 50 | 150 | Estimated based on IEA report on standby power |
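Summing the device estimates gives a rough sense of the scale of that distributed demand. The total is only indicative, since the figures mix sources and reporting years:

```python
# Rough total of distributed-device electricity use, summing the
# estimates in the table above (TWh). Indicative only: the figures
# mix sources (The Shift Project, IEA) and years (2019, 2023).
device_twh = {
    "Laptops & Desktops": 1600,
    "Smartphones & Tablets": 900,
    "Servers (excluding data centers)": 200,
    "Network Equipment": 300,
    "TVs & Streaming Devices": 700,
    "Gaming Consoles": 250,
    "Other Devices": 150,
}
total_twh = sum(device_twh.values())
print(total_twh)  # 4100
```

On these estimates, distributed devices consume on the order of 4,100 TWh--a figure several times larger than common estimates of total data center consumption, which underscores how concentrated, and therefore visible, the data center share is.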



Thursday, April 25, 2024

Meta Warns Significant AI Profits are "Several Years Away"

Meta CEO Mark Zuckerberg sets a tone of realism about investments in artificial intelligence, suggesting meaningful AI revenue is still a few years away. “Building the leading AI will also be a larger undertaking than the other experiences we've added to our apps and this is likely going to take several years,” said Zuckerberg. 


Nor is that an unreasonable expectation for Meta, other app suppliers or the cloud computing hyperscalers, which might double their compute capacity over the next four years to support AI, as Synergy Research Group suggests will be the case. 


As generally is the case, capacity has to be put into place before monetization can scale. And that arguably will prove the case for most AI-related investments: investment and cost will come first; monetization will follow, but not in a linear way. 


"Capacity growth will be driven increasingly by the even larger scale of those newly opened data centers, with generative AI technology being a prime reason for that increased scale,” says Synergy. 


Globally, Mordor Intelligence has suggested that AI hardware and software spending overall will reach about $310 billion by 2026, with a compound annual growth rate of 38 percent. Precisely how much will be spent by data centers is less clear, but it is expected to be substantial. 


| Year | Processing CapEx (USD Billion) | Storage CapEx (USD Billion) | Source | Discussion |
|---|---|---|---|---|
| 2021 | 50-70 | 20-30 | Synergy Research Group (2022) | Estimates based on overall data center CapEx growth and industry trends related to AI adoption. |
| 2022 | 55-75 | 25-35 | Gartner (2023) | Estimates based on data center equipment sales figures and analyst projections for AI hardware growth. |
| 2023 | 60-80 | 30-40 | IDC (2023) | Forecasts based on hyperscale data center spending surveys and analysis of enterprise AI deployments. |
| 2024 | 65-85 | 35-45 | Mordor Intelligence (2022) | Projections based on AI hardware market growth and anticipated increase in data center infrastructure spending. |
| 2025 | 70-90 | 40-50 | Cowen Research (2023) | Analyst estimates based on industry surveys and projections for continued growth in AI workloads and data volumes. |

Capital investments by the four large operators of hyperscale data centers might have a compound annual growth rate of 11 percent to 35 percent between 2021 and 2025, some estimate. 
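The compound-annual-growth arithmetic behind such forecasts is easy to check. A minimal sketch, using the Mordor Intelligence figures cited above; the four-year horizon is an assumption for illustration:

```python
# Compound annual growth rate (CAGR) arithmetic behind spending forecasts.

def cagr(start: float, end: float, years: int) -> float:
    """Implied constant annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

# Illustrative: if AI hardware and software spending reaches $310 billion
# in 2026 after growing 38% per year (the Mordor Intelligence figures),
# the implied spending four years earlier is roughly $85 billion.
implied_start = 310 / 1.38 ** 4
print(round(implied_start, 1))  # 85.5
```

The same `cagr` function can be run against any of the table endpoints above to see what annual growth rate a given low-to-high range actually implies.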


| Year | Estimated Hyperscale Data Center CapEx (Processing & Storage) | Source | Discussion |
|---|---|---|---|
| 2021 | $80 Billion - $100 Billion | Synergy Research Group (2022) | This is an estimate for total CapEx on processing and storage in hyperscale data centers, not specifically for AI. |
| 2022 | $85 Billion - $105 Billion | Synergy Research Group (2023) | Similar to 2021, this represents total CapEx, but a portion will likely be directed towards AI needs. |
| 2023 | $90 Billion - $115 Billion | Gartner (2023) | Gartner predicts a 6.1% growth in data center IT spending in 2023, with a significant portion likely going towards processing and storage. |
| 2024 | $95 Billion - $125 Billion | IDC (2023) | IDC forecasts worldwide data center spending to reach $352 billion in 2024, with hyperscale CapEx on processing and storage being a major driver. |
| 2025 | $100 Billion - $135 Billion | Mordor Intelligence (2022) | Mordor Intelligence predicts a CAGR of 13.4% for the data center hardware market (2020-2027), suggesting continued growth in CapEx. |

Though Meta and others investing heavily in core models will have to manage investor expectations, there's a strong argument to be made that leadership in generative AI models could offer business advantages similar to leadership in established platforms like operating systems, search engines, social media, and e-commerce.


Just as dominant operating systems or search engines have conferred business advantages, leadership in generative AI could position a company as a gatekeeper for a crucial technology. Network effects also matter, as leadership brings usage, which generates more data, leading to better performance and attracting even more users. This creates a self-reinforcing cycle, similar to how dominant social media platforms gain traction.


Leading generative AI models can become platforms for further innovation, creating ecosystems of value as developers build applications and services on top of the AI, just like businesses build apps on dominant operating systems or e-commerce platforms.

