Thursday, August 10, 2023

AI Will Drive Data Center Capabilities: How Much Seems the Only Real Issue

Though artificial intelligence and generative AI training and inference are widely expected to drive data center requirements for processing power, storage and energy consumption, edge computing seems unlikely to get a similar boost, principally because AI training and most inference operations are not latency-dependent. 


The value of edge computing lies in latency reduction and bandwidth avoidance. And while it still makes economic sense to store frequently requested content at the edge (content delivery use cases), AI operations are unlikely to be so routinized that edge caching adds much value. 


Operations requiring access to large language models will likely still happen at larger data centers, because they need processing and storage resources at scale. Think of the processing required to train AI models for self-driving cars, fraud detection and other applications that analyze massive datasets.


To be sure, support of self-driving cars also involves stringent latency requirements. The problem is simply that the requirements for high-performance computing and access to data stores matter more for performance, so processing is likely to be located “onboard.” Again, the key observation is the split between on-device and remote data center functions: edge might not play much of a role. 


The debate will likely be over the value of regional data centers, which some might consider “edge,” but which others will say serve a traditional large data center function. 


Operations that can be conducted on a device likewise will not benefit much, if at all, from edge computing. Think of real-time language translation, facial recognition and other applications that require quick responses.


And Digital Bridge CEO Marc Ganzi believes AI eventually will mean a new or additional market about the size of the entire public cloud computing market. 


If public cloud now represents about 13 gigawatts of capacity, AI might eventually require 38 gigawatts, says Ganzi. 


The whole global data center installed base might represent something on the order of 700 gigawatts, according to IDC, while other estimates, from the Uptime Institute, put capacity on the order of 180 GW. 


According to a report by Synergy Research Group, the global public cloud computing industry now represents 66.8 gigawatts (GW) of capacity. 
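Taken together, those figures can be compared with a bit of back-of-envelope arithmetic. The sketch below uses only the numbers quoted above; all are third-party estimates, and the ratios are illustrative, not forecasts:

```python
# Back-of-envelope comparison of the capacity estimates cited above.
# All figures are estimates quoted in the text, not measured values.

public_cloud_gw = 13            # Ganzi's figure for current public cloud capacity
ai_projected_gw = 38            # Ganzi's eventual AI capacity projection
installed_base_idc_gw = 700     # IDC estimate of the global data center base
installed_base_uptime_gw = 180  # Uptime Institute estimate of the same base

# AI demand relative to today's public cloud, on Ganzi's numbers
ratio_vs_cloud = ai_projected_gw / public_cloud_gw

# AI demand as a share of the installed base, under each estimate
share_idc = ai_projected_gw / installed_base_idc_gw
share_uptime = ai_projected_gw / installed_base_uptime_gw

print(f"AI vs. public cloud: {ratio_vs_cloud:.1f}x")        # about 2.9x
print(f"Share of installed base (IDC): {share_idc:.0%}")    # about 5%
print(f"Share of installed base (Uptime): {share_uptime:.0%}")  # about 21%
```

On Ganzi's numbers, AI would be roughly a tripling of today's public cloud footprint, but only somewhere between about 5 percent and 21 percent of the total installed base, depending on which installed-base estimate one believes.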


So AI should have a significant impact on both cloud computing capacity and data center requirements over time.


