Sunday, March 4, 2018

In the Next Eras of Communications, All Access will be Wireless (Untethered)

Network architectures and communication protocols are not the primary concern of most people who work in and around the telecommunications industry, but architectures always reflect the dominant business models in each era.

One way of stating the principle is that networks have always been optimized for the lead application and revenue driver of each era. Broadcast TV and radio, cable TV, satellite video and telecom networks were all cases of "form follows function": each network was designed to optimally support one key app.

We can argue that all of this began to change when next-generation networks were envisioned as "all digital," allowing one platform to support any media type. That is generally, if not entirely, true.

Cabled networks are designed to connect places or locations; mobile networks are designed to connect people or devices.

But we are likely headed for another evolution of network architectures, precisely to support new potential business models. Broadly, all networks these days are computing networks.

So if computing application requirements change (along with revenue drivers), then architectures will evolve to meet that demand. The mainframe, minicomputer, personal computing, cloud computing and mobile computing eras each changed networks.

If the next evolution of demand for computing--and networks to support computing--centers on sensors, not people, and if important categories of such computing must feature ultra-low latency, then centralized cloud computing will not work.

A new network and computing architecture will have to be created “at the edge” to support local processing. As always, “what computing” has to be supported changes “where” computing happens.


High-frequency trading, virtual reality gaming, augmented reality gaming, autonomous vehicles, 4K video and remote medical apps are among the conceivable apps that require such low latency that centralized cloud computing will not work.

Ultra-low latency, even with ultra-low-latency access networks, might require edge computing as well, as the time to reach remote computing resources is simply too long.
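The physics behind that claim can be sketched with back-of-the-envelope arithmetic. The figures below are illustrative assumptions, not measurements: light in fiber travels at roughly 200,000 km/s, and a 5 ms latency budget is a commonly cited target for demanding 5G use cases. Propagation delay alone, before any queuing or processing, can blow the budget when the data center is far away.

```python
# Back-of-the-envelope propagation delay over fiber.
# Assumption: signal speed in fiber ~200,000 km/s (about 2/3 of c),
# i.e. roughly 200 km per millisecond. Ignores queuing, serialization
# and processing time, which only make matters worse.

FIBER_KM_PER_MS = 200.0

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds over fiber."""
    return 2 * distance_km / FIBER_KM_PER_MS

# A centralized cloud data center 1,500 km away:
print(round_trip_ms(1500))  # 15.0 ms -- already above a 5 ms budget

# An edge site 50 km away:
print(round_trip_ms(50))    # 0.5 ms -- leaves headroom for processing
```

In other words, no amount of access-network improvement fixes distance; only moving the computing closer does.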

That “ultra-low latency” and “gigabit” access speeds are primary characteristics of coming 5G networks can obscure the larger implications. All networking is moving to “ultra-low latency” and gigabit speeds.

The importance of 5G is not just that it is the next generation of mobile communications. The larger point is that 5G is part of the re-architecting of networks in general toward ultra-low-latency, high-performance support, where core network resources are accessed by untethered means.

That will be true for consumer or enterprise access. The whole point of access networks will be to support ubiquitous access to core network resources over “long wires and short wireless connections.”

In that sense, "5G" is shorthand. It represents not only a particular mobile solution for untethered access, but also generic shorthand for all ultra-low-latency untethered access networks.

That is imprecise, of course. But one stumbles over longer phrases such as next-generation networks, which in turn require explanation. The term “5G” (although a specific mobile solution) necessarily highlights the coming change in network architectures and business models.

For the first time, some important revenue-generating applications will require such low latency that computing resources (cloud data centers) must be moved to the edge of the network.

Traditional web apps, voice and messaging still will be able to use centralized cloud data centers. But many of the hoped-for new apps will have to move to the edge of networks. And no matter what the setting (consumer or business; indoors or outdoors; home, small business or large campus), access will be on an untethered basis.

Mobile, Wi-Fi or faster protocols such as WiGig will be used. But all will be untethered (no wires).

