Thursday, October 26, 2023

6G Should "Enable," Not "Create" New Apps and Use Cases

After our experiences with 3G, 4G and now 5G, perhaps we ought to be more circumspect about the positively amazing new experiences that will actually develop once we get to 6G.


Already, observers offer examples of new applications and services that could be enabled by 6G:


  • Real-time holographic video conferencing

  • Augmented reality experiences

  • Self-driving cars

  • Remote surgery

  • Mobile broadband in rural areas

  • IoT connectivity in dense urban environments


None of that should startle anyone: the same applications were touted as uses 5G could support, and they might yet emerge.


And more to the point, despite the expected improvements in latency performance and bandwidth, perhaps we should be cautious about claiming too much for how artificial intelligence or virtual reality will be embedded into the core network.


No doubt AI will be used to support the core network and its processes. But that’s different from possible efforts to embed AI or AR or VR as customer-facing features of the networks, as some might propose. 


Beyond making the network operate as efficiently as possible, with the best latency performance and bandwidth support we can reasonably develop in the next generation of networks, we might remain skeptical of claims for network features that go further than making the network as liquid, as dynamic and as flexible as possible.


An energy-efficient network built on an on-demand, virtualized architecture, with low latency and no restrictions on bandwidth, is a reasonable goal.


Beyond that, what we probably still need is a permissionless development environment, where app software does not have to assume much other than the existence of low-latency, high-bandwidth connectivity.


In other words, perhaps all we want is a network that is as open as possible, as virtualized as possible, as flexible and dynamic as possible, capable of supporting any conceivable application but without embedding any of that inside the core network. 


But some will try to create capabilities that are embedded into the core network, no doubt. That’s one way of attempting to profit from apps using the network.


