Sunday, April 3, 2022

How Metaverse Drives Data Center, Connectivity Investments

Just as video content distribution has shaped global demand for intercontinental data transport and high-speed connections between major data centers, the metaverse will shape data center and connectivity network requirements. And the key words are “more” and “less”: more capacity and compute, less latency.


“Making the metaverse a reality will require significant advancements in network latency, symmetrical bandwidth and overall speed of networks,” says Dan Rabinovitsj, Meta VP for connectivity. 


The metaverse “will require innovations in fields like hybrid local and remote real-time rendering, video compression, edge computing, and cross-layer visibility, as well as spectrum advocacy, work on metaverse readiness of future connectivity and cellular standards, network optimizations, improved latency between devices and within radio access networks (RANs), and more,” he says. 


Already, experts predict metaverse environments will require more data centers, more edge computing, more distributed computing, more colocation, more content distribution mechanisms, more power consumption and more cooling.


Eventually, fully developed metaverses will require advances in chip technology as well. Beyond all that, blockchain will probably be necessary to support highly decentralized value exchanges. And it is impossible to separate metaverse platforms and experiences from use of artificial intelligence, for business or consumer uses. 


source: iCapital Network 


If metaverses are built on persistent and immersive computing and tightly integrated software stacks, new platforms will be necessary. New developments in chip manufacturing also will be needed. 


For connectivity providers, especially internet service providers, far lower latency will be key. Today’s latency-sensitive applications such as video calling and cloud-based games have to meet round-trip latencies of 75 ms to 150 ms. Complex multiplayer games might require 30 ms latency. 


“[With] a head-mounted mixed reality display, where graphics will have to be rendered on screen in response to where someone is focusing their eyes, things will need to move an order of magnitude faster: from single to low double-digit ms,” says Rabinovitsj. 


Image rendering will require edge computing. “We envision a future where remote rendering over edge cloud, or some form of hybrid between local and remote rendering, plays a greater role,” he adds. “Enabling remote rendering will require both fixed and mobile networks to be rearchitected to create compute resources at a continuum of distances to end users.”
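One reason edge computing matters is simple physics: a round-trip latency budget puts a hard upper bound on how far a rendering server can sit from the user, because signals in fiber travel at only about two-thirds the speed of light. The sketch below is illustrative, not from the article; it assumes roughly 200,000 km/s propagation in fiber and ignores queuing, serialization and processing delays, which consume much of a real budget.

```python
# Illustrative sketch: how a round-trip latency budget bounds the
# distance between a user and a rendering server.
# Assumption: light propagates in fiber at ~200,000 km/s (about 2/3 c).
# Real budgets are tighter, since routing, queuing and processing
# delays also eat into the round-trip time.

SPEED_IN_FIBER_KM_PER_MS = 200.0  # ~200,000 km/s, expressed per millisecond

def max_server_distance_km(rtt_budget_ms: float) -> float:
    """Upper bound on one-way fiber distance for a given round-trip budget."""
    one_way_ms = rtt_budget_ms / 2
    return one_way_ms * SPEED_IN_FIBER_KM_PER_MS

for budget in (100, 30, 10):
    print(f"{budget:>3} ms RTT budget -> server within ~{max_server_distance_km(budget):,.0f} km")
```

At a 100 ms budget a server can, in principle, sit thousands of kilometers away; at a 10 ms budget it must be within roughly 1,000 km even before real-world overheads, which is why rendering compute has to move toward the network edge.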


Bandwidth requirements also could increase by orders of magnitude. Viewing 720p video on a standard smartphone screen might work with just 1.3 Mbps to 1.6 Mbps of downlink throughput. 


But a head-mounted display sitting just centimeters from the eyes, required to display images at retina-grade resolution, will need many orders of magnitude more bandwidth, he notes. 
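The gap can be sketched with back-of-envelope arithmetic. All figures below are assumptions for illustration, not from the article: 720p at 30 fps with aggressive codec compression lands near the 1.3–1.6 Mbps range cited above, while a hypothetical 8K-per-eye, 90 fps display (using lighter compression, since low-latency rendering leaves less time for encoding) lands orders of magnitude higher.

```python
# Back-of-envelope sketch with assumed figures (resolutions, frame
# rates and compression ratios are illustrative, not from the article).

def stream_mbps(width: int, height: int, fps: int,
                bits_per_pixel: int, compression_ratio: float) -> float:
    """Compressed throughput in Mbps for a raw video stream."""
    raw_bps = width * height * fps * bits_per_pixel
    return raw_bps / compression_ratio / 1e6

# 720p phone video: 30 fps, 24-bit color, aggressive ~500:1 compression
phone = stream_mbps(1280, 720, 30, 24, 500)

# Hypothetical retina-grade HMD: 8K per eye, two eyes, 90 fps,
# lighter ~100:1 compression (low-latency codecs compress less)
hmd = 2 * stream_mbps(7680, 4320, 90, 24, 100)

print(f"720p phone video:  ~{phone:.1f} Mbps")
print(f"retina-grade HMD:  ~{hmd:,.0f} Mbps ({hmd / phone:,.0f}x more)")
```

Under these assumptions the headset needs on the order of a gigabit per second, roughly a thousand times the 720p stream, which is the kind of gap driving talk of rearchitected access networks.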


To be sure, most of what makes up metaverse experiences rests on things that happen further up the stack, above computing and communications. 


source: Constellation Research


But we already can see how metaverse support will require changes in computing architecture and network capabilities.

