Wednesday, February 26, 2020

Nokia Network Slicing Available Summer of 2020

Nokia end-to-end network slicing functionality for 4G and 5G New Radio networks will be available in the summer of 2020. Nokia says it is the first supplier to offer this capability. 

The slicing capability can be deployed via a software upgrade to existing LTE and 5G non-standalone (NSA) networks and subsequently 5G standalone (SA) networks. 

At least in principle, network slicing could create a new type of wholesale or managed network capability, potentially allowing end user customers to control core networks as though they were their own managed networks.

Think of it as “network as a service.” Granted, the nomenclature is difficult, since connectivity products have always been services. 

There are some important new business issues. What parameters can the slice customer actually control? Aside from the key performance indicators related to quality of service expectations, what degree of control will a slice customer have over the slice parameters?

Does the slice customer put in a change request to the slice provider? Can the slice user make changes directly? And if so, to which parameters? In other words, how much control will slice customers have over on-the-fly changes to their private networks?
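As a purely illustrative sketch, here is one way a slice profile exposed to a customer might be modeled, with some parameters tunable by the customer on the fly and others requiring a change request to the provider. The field names and the split between the two sets are hypothetical, invented for this example rather than drawn from Nokia's product or any standard.

from dataclasses import dataclass

# Hypothetical slice profile; field names are illustrative only, not taken
# from any vendor or standards body.
@dataclass
class SliceProfile:
    slice_id: str
    max_downlink_mbps: float     # throughput ceiling committed by the provider
    max_uplink_mbps: float
    target_latency_ms: float     # quality-of-service KPI
    isolation: str = "logical"   # e.g. "logical" or "dedicated"
    priority: int = 5            # relative scheduling weight, 1 (low) to 10 (high)

# One possible split of control: parameters the slice customer may change
# directly versus those that go back to the slice provider.
CUSTOMER_TUNABLE = {"priority"}
PROVIDER_ONLY = {"max_downlink_mbps", "max_uplink_mbps", "target_latency_ms", "isolation"}

def request_change(profile, parameter, value):
    """Apply a change directly if the customer controls that parameter;
    otherwise note that a change request to the provider is required."""
    if parameter in CUSTOMER_TUNABLE:
        setattr(profile, parameter, value)
        return f"{parameter} updated to {value} on the fly"
    if parameter in PROVIDER_ONLY:
        return f"{parameter} requires a change request to the slice provider"
    return f"unknown parameter: {parameter}"

slice_a = SliceProfile("enterprise-video", 500.0, 100.0, 20.0)
print(request_change(slice_a, "priority", 8))
print(request_change(slice_a, "target_latency_ms", 10.0))

The open commercial question is precisely which parameters end up in the customer-tunable set.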


The Nokia network slicing solution provides sliced mobile broadband connectivity from device to radio, transport, core, all the way to applications in private and public networks and the cloud. It enables new mobile end-to-end services with logical connections, security, quality and traffic management, with seamless service continuity across 4G and 5G networks. 

Private wireless slicing also is supported. Nokia is already trialing live 4G/5G slicing use cases with customers, powered by a unique Software Defined Network (SDN) radio slice controller as well as a transport slice controller. 



The trial includes a Nokia cloud packet core slice orchestrator to support network deployment automation as well as an SD-WAN software solution providing a managed 4G/5G network slice to private and public cloud services. Nokia assurance systems are used to verify per slice key performance indicators.  

But new questions will have to be asked and answered. VPN users do not actually have any control over the network, only the use of a private virtual tunnel through a network. A network slice, in principle, also adds quality of service and functionality guarantees. 

What must be worked out in practice are the degrees of end user programmability of such slices.

Can Fixed Networks Actually be Characterized Like Mobile Networks?

As nearly as I can tell, Huawei was the first entity to refer to fixed network infrastructure in five eras, in terms of use cases, not physical media, analog or digital modes, modulation technology or some other categorization that mimics the evolution of mobile networks.

The difference is that while one can attribute certain lead apps or use cases to each mobility generation, there is a physical basis for the “G” nomenclature that is not present on the fixed networks. Each mobile generation was a discrete platform and network with distinct technological foundations. 

One can note the physical distinctions between voice switch generations, access media or logical architecture, and come up with some generations. Since the dawn of the internet protocol era, the physical network also has been separated logically from the applications that use networks and essentially abstracted. 

Some might characterize the fixed network eras using various optical platform developments, including the shift from BPON to GPON to 10G PON to NG-PON2. You can decide whether this is useful or not. 

But European standards group ETSI has formed a new group which aims to specify the “fifth generation of Fixed Network” (ETSI ISG F5G). 

At a practical level, the effort seems to address three main issues: full-fiber connections, enhanced fixed broadband and guaranteed reliable experience. But most of the effort seems to focus on how applications drive the need for network performance, arguably something all the other standards groups already essentially are working to ensure. 

“The ETSI ISG F5G aims at studying the fixed-network evolution required to match and further enhance the benefits that 5G has brought to mobile networks and communications,” ETSI says. 

Some of us might argue this is largely a marketing exercise, similar to the phrase “from fiber-to-home to fiber-to-everywhere.”

Tuesday, February 25, 2020

Encouraging Pacific Islanders



No, nothing directly to do with the internet, communications, mobility or capacity. Just a nice music video reminding Pacific Islanders, wherever they are, that the census is coming. 

Enterprises Say They Need Help with 5G Use Cases

It might not be clear which entities in the 5G value chain will be supplying expertise on 5G apps and use cases, but a study by Accenture suggests there is a substantial market for such advice. About 72 percent of respondents to an Accenture survey indicated “they need help to imagine the future possibilities for connected solutions with 5G.”

As you might guess, “software and services companies” and “cloud businesses” are viewed as the sorts of firms most likely to provide that help. 

Also, the percentage of businesses expecting to develop 5G applications in-house has dropped over the last year, from 23 percent in the prior-year survey to 14 percent this year, Accenture says. 

As always, though connectivity suppliers are among the most likely sources of help, there is concern about their industry domain knowledge. 


The survey included responses from more than 2,600 business and technology decision makers across 12 industry sectors in Europe, North America and Asia-Pacific. 


Sunday, February 23, 2020

Telcos are Buying AI Functionality, if Not AI Directly

Artificial intelligence is being adopted by communications networks in many subtle ways, even if AI is not a product but a capability.

Businesses and consumers do not buy “AI” any more than they buy calories in a direct sense. Instead, AI is a capability of some other product a service provider, enterprise or consumer purchases. 

That applies in telecom as much as anywhere else. Some note that telecom networks can use AI for network optimization, preventive maintenance, virtual assistants and robotic process automation. AI also plays a role in self-optimizing networks (SONs), software-defined networks and network function virtualization, which are basic network principles in the 5G era. 

IDC has argued that 64 percent of network operators are investing in AI systems to improve their infrastructure, for example. 

Some popular AI use cases in telecom include:
  • ZeroStack’s ZBrain Cloud Management, which analyzes private cloud telemetry storage and use for improved capacity planning, upgrades and general management
  • Aria Networks, an AI-based network optimization solution that counts a growing number of Tier 1 telecom companies as customers
  • Sedona Systems’ NetFusion, which optimizes the routing of traffic and speed delivery of 5G-enabled services like AR/VR
  • Nokia's AVA platform, a machine learning-based, cloud-based network management solution for capacity planning and for predicting service degradations on cell sites up to seven days in advance

AI functions almost always are used for pattern recognition, to understand typical trends or behaviors. In a consumer context, that often means monitoring customer financial transactions to spot anomalies in an account’s spending data that could represent potentially fraudulent behavior. Automated financial advisor services use AI to provide recommendations. 
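As a minimal sketch of that pattern-recognition idea, a system can flag a transaction whose amount departs sharply from an account's historical spending using a simple statistical test. The figures and the three-standard-deviation threshold below are invented for illustration; production fraud systems use far richer features and models.

import statistics

def is_anomalous(history, new_amount, z_threshold=3.0):
    """Flag a transaction that lies more than z_threshold standard
    deviations from the account's historical mean spending."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return new_amount != mean
    return abs(new_amount - mean) / stdev > z_threshold

history = [42.0, 35.5, 51.2, 40.0, 38.7, 47.3]  # illustrative past spending
print(is_anomalous(history, 44.0))    # False: consistent with past behavior
print(is_anomalous(history, 900.0))   # True: far outside the usual pattern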

In a manufacturing or energy industry use case, supply chain optimization, automated detection of defects during production and energy forecasting are use cases based on pattern recognition. 

Prediction, such as forecasting energy consumption, is another common use for AI. Classification and image recognition are other use cases, as when law enforcement agencies use facial recognition. Health and life science users might use AI to process data from past case notes, biomedical imaging and health monitors for predictive diagnostics.

Consumer speech to text is a frequent AI use case as well. E-commerce engines often use AI in a cognitive search mode to generate personalized recommendations to online shoppers. 

A related use case is natural language interaction, where a user can ask a software application for a report on sales revenue predictions without having to run the reports manually, or natural language generation, where a user might hear summaries of everything that has been analyzed from a large document collection. 

Communications networks might use AI to route traffic, optimize server loads or predict future capacity demand. Also, any industry relying on call centers for customer interaction and support uses AI to support chatbots. 
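To make the capacity-prediction use case concrete, here is a deliberately simple sketch: a least-squares trend line extrapolated a few periods ahead. The traffic figures are invented, and operators in practice use far more sophisticated models that account for seasonality and traffic mix.

def linear_trend_forecast(series, periods_ahead):
    """Fit y = a + b*t by ordinary least squares and extrapolate."""
    n = len(series)
    x_mean = (n - 1) / 2
    y_mean = sum(series) / n
    b = sum((t - x_mean) * (y - y_mean) for t, y in enumerate(series)) \
        / sum((t - x_mean) ** 2 for t in range(n))
    a = y_mean - b * x_mean
    return [a + b * (n + k) for k in range(periods_ahead)]

traffic = [110, 118, 125, 131, 140, 146]   # illustrative monthly Gbps on a link
print(linear_trend_forecast(traffic, 3))   # projection for the next three months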

Retailers use AI for personalized shopping experiences and customized recommendations.

How Long Might it Take for Private 4G and 5G to Demonstrate Value?

As tantalizing as projected productivity advances using 5G and edge computing might be, it is worthwhile to recall that past eras of automation and computerization often have failed to move the needle when it comes to measurable productivity gains. 

Nokia points out that the era of applying information and communications technology to manufacturing managed to produce productivity gains of less than one percent. The implication is that advanced 4G and 5G private networks will produce greater gains, across a range of use cases including video surveillance and analytics, immersive experiences, smart stadiums, eHealth, machine remote control, cloud robotics and process automation. 


It has been hard to quantify the benefits of computerization, digitalization or applied communications and information technology for many decades. The productivity paradox suggests that information technology or communications investments do not always immediately translate into effective productivity results. 


This productivity paradox was apparent for much of the 1980s and 1990s, when one might have struggled to identify clear evidence of productivity gains from a rather massive investment in information technology.

Some would say the uncertainty covers a wider span of time, dating back to the 1970s and including even the “Internet” years from 2000 to the present.

The point is that it has in the past taken as long as 15 years for technology investments to produce measurable gains.

Computing power in the U.S. economy increased by more than two orders of magnitude between 1970 and 1990, for example, yet productivity, especially in the service sector, stagnated.

And though it seems counter-intuitive, even the Internet has not clearly affected economy-wide productivity. Some might argue that is because we are not measuring properly. It is hard to assign a value to activities that have no incremental cost, such as listening to a streamed song instead of buying a compact disc. It might also be argued that benefits accrue, but only over longer periods of time.

The fourth industrial revolution could well take quite some time to produce measurable benefits as well. Of course, we might be wrong about the productivity paradox. The fourth industrial revolution might achieve something we have not seen before. Still, history suggests we would do well to temper expectations. 

Even broadband impact on productivity is a matter of debate, for example. So enterprise managers might want to be patient. It might well take 15 years before the quantifiable payoff from the “fourth industrial revolution” is obvious. 

As with so much other technology investment, we are likely to be initially disappointed.

Can We Build IoT Devices Without Batteries? And What Would That Mean?

Can IoT, Wi-Fi, low power wide area or maybe someday mobile IoT devices operate with such low battery draw that battery life literally can reach a decade? And how can that be done? Perhaps even more futuristically, can IoT devices be designed to operate without any power sources of their own?

The issue is that internet of things devices might often have to work at remote locations, over wide areas that are inaccessible or only inconveniently accessible.

In other cases it is not so much sensor site accessibility as the sheer number of devices that must be supported, which makes the chore of replacing batteries onerous and expensive.

Are batteryless IoT devices--able to operate and communicate without any internal batteries--possible? 

To state the issue as starkly as possible, can IoT networks and other devices literally extract power they need from “thin air,” without having any power sources of their own? For example, can a sensor on a leaf, on a farm, be operated without its own power supply?


Or could remote sensors actually communicate using unmanned aerial vehicles, over wide areas, without having their own power sources? 


To state the matter in a way that is more likely to happen as a commercial reality, can IoT, Wi-Fi, low power wide area or maybe someday mobile IoT devices operate with such low battery draw that battery life literally can reach a decade?

And what sorts of important and useful applications could be created if, in fact, low-cost and long-lived devices could be scattered about, needing a battery change only once a decade, if that?

Electrical engineers and researchers have insisted for some years that this is possible. Using ambient backscatter, existing wireless signals can be turned into both a source of power and a communication medium. 

Backscattering enables two battery-free devices to communicate by reflecting existing wireless signals. Backscatter communication is orders of magnitude more power-efficient than traditional radio communication. Further, since it leverages the ambient RF signals that are already around us, it does not require the dedicated power infrastructure that RFID does.

A backscatter system known as Hitchhike was shown in 2016, for example, by a Stanford University research team led by Sachin Katti, an associate professor of electrical engineering and of computer science, and Pengyu Zhang, then a postdoctoral researcher in Katti’s lab.

In but the latest move, electrical engineers at the University of California San Diego have developed a new ultra-low-power Wi-Fi radio, integrated in a small chip for internet of things devices, that uses backscattering.


No bigger than a grain of rice, the device consumes only 28 microwatts of power, 5,000 times lower than standard Wi-Fi radios. At the same time, the chip can transmit data as far as 21 meters at a rate of 2 Mbps, enough for decent-quality video.

The invention is based on a technique called backscattering. The transmitter does not generate its own signal, but takes incoming signals from nearby devices (like a smartphone) or a Wi-Fi access point, modifies the signals and encodes its own data onto them, and then reflects the new signals onto a different Wi-Fi channel to another device or access point. This approach requires much less energy and gives electronics manufacturers much more flexibility.
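A toy simulation makes the reflect-or-absorb idea concrete. In the sketch below, which is a simplification invented for illustration rather than a model of the UC San Diego design, a tag encodes each bit by either reflecting an ambient carrier strongly (a 1) or absorbing most of it (a 0), and the receiver recovers the bits by thresholding the reflected power. The reflection coefficients, noise level and threshold are all arbitrary.

import math
import random

def ambient_carrier(num_samples, amplitude=1.0, freq=0.05):
    """Stand-in for an ambient RF signal, such as a nearby Wi-Fi transmission."""
    return [amplitude * math.sin(2 * math.pi * freq * n) for n in range(num_samples)]

def backscatter_encode(carrier, bits, samples_per_bit):
    """The tag generates no carrier of its own: it switches its antenna to
    reflect the ambient signal for a 1 and to absorb it for a 0."""
    out = []
    for i, bit in enumerate(bits):
        gain = 0.8 if bit else 0.05   # reflection coefficient per antenna state
        chunk = carrier[i * samples_per_bit:(i + 1) * samples_per_bit]
        out.extend(gain * s + random.gauss(0, 0.01) for s in chunk)
    return out

def backscatter_decode(received, samples_per_bit, threshold=0.1):
    """The receiver estimates reflected power per bit period and thresholds it."""
    bits = []
    for i in range(0, len(received), samples_per_bit):
        chunk = received[i:i + samples_per_bit]
        power = sum(s * s for s in chunk) / len(chunk)
        bits.append(1 if power > threshold else 0)
    return bits

bits = [1, 0, 1, 1, 0, 0, 1]
samples_per_bit = 40
carrier = ambient_carrier(len(bits) * samples_per_bit)
received = backscatter_encode(carrier, bits, samples_per_bit)
print(backscatter_decode(received, samples_per_bit) == bits)   # expected: True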



Where, and How Much, Might Generative AI Displace Search?

Some observers point out that generative artificial intelligence poses some risk for operators of search engines, as both search and GenAI s...