Sunday, February 23, 2020

Telcos are Buying AI Functionality, if Not AI Directly

Artificial intelligence is being adopted by communications networks in many subtle ways, even if AI is not a product but a capability.

Businesses and consumers do not buy “AI” any more than they buy calories in a direct sense. Instead, AI is a capability of some other product that a service provider, enterprise or consumer purchases.

That applies in telecom as much as anywhere else. Some note that telecom networks can use AI for network optimization, preventive maintenance, virtual assistants and robotic process automation. AI also plays a role in self-optimizing networks (SONs), software-defined networks and network functions virtualization, which are basic network principles in the 5G era.

IDC has reported that 64 percent of network operators are investing in AI systems to improve their infrastructure, for example.

Some popular AI use cases in telecom include:
  • ZeroStack’s ZBrain Cloud Management, which analyzes private cloud telemetry storage and use for improved capacity planning, upgrades and general management
  • Aria Networks, an AI-based network optimization solution that counts a growing number of Tier 1 telecom companies as customers
  • Sedona Systems’ NetFusion, which optimizes the routing of traffic and speeds delivery of 5G-enabled services like AR/VR
  • Nokia’s AVA platform, a cloud-based, machine learning-based network management solution used to better manage capacity planning and to predict service degradations on cell sites up to seven days in advance

AI functions almost always are used for pattern recognition, to understand typical trends or behaviors. In a consumer context that often means monitoring customer financial transactions to spot anomalies in an account’s spending data that could represent potentially fraudulent behavior. Automated financial advisor services use AI to provide recommendations.
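As a rough sketch of that kind of pattern recognition (with made-up transaction amounts and a deliberately crude outlier rule, not anything a bank actually deploys):

```python
# Toy illustration: flag card transactions that deviate sharply from an
# account's typical spending pattern. Real fraud systems use far richer
# features and models; this only shows the basic pattern-recognition idea.
from statistics import mean, stdev

history = [42.10, 18.75, 63.00, 25.40, 51.20, 37.90, 44.65, 29.99]  # hypothetical amounts, USD
new_transactions = [48.00, 950.00, 33.25]

mu, sigma = mean(history), stdev(history)

for amount in new_transactions:
    z = (amount - mu) / sigma            # distance from typical spend, in standard deviations
    status = "FLAG FOR REVIEW" if abs(z) > 3 else "ok"
    print(f"${amount:8.2f}  z = {z:5.1f}  {status}")
```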

In manufacturing and energy, supply chain optimization, automated detection of defects during production and energy forecasting all are applications of pattern recognition.

Prediction, such as forecasting energy consumption, is another common use for AI. Classification and image recognition are other use cases, as when law enforcement agencies use facial recognition. Health and life science users might use AI to process data from past case notes, biomedical imaging and health monitors for predictive diagnostics.

Consumer speech-to-text is a frequent AI use case as well. E-commerce engines often use AI in a cognitive search mode to generate personalized recommendations for online shoppers.

A related use case is natural language interaction, where a user might ask a software application to generate a report on sales revenue predictions without having to run the reports manually, or natural language generation, where a user might hear summaries of everything that has been analyzed from a large document collection.

Communications networks might use AI to route traffic, optimize server loads or predict future capacity demand. Also, any industry relying on call centers for customer interaction and support uses AI to power chatbots.
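As a sketch of the capacity-prediction case, assuming nothing more than a hypothetical series of monthly traffic peaks and a simple linear trend (real planning tools use far richer models):

```python
# Toy capacity forecast: fit a linear trend to monthly peak traffic and
# project it forward. The point is only that the prediction task amounts
# to learning a pattern in the history.

peaks = [310, 322, 335, 349, 361, 378, 390, 404]   # hypothetical monthly peaks, Gbps

n = len(peaks)
xs = list(range(n))
x_bar, y_bar = sum(xs) / n, sum(peaks) / n
slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, peaks))
         / sum((x - x_bar) ** 2 for x in xs))
intercept = y_bar - slope * x_bar

months_ahead = 6
forecast = intercept + slope * (n - 1 + months_ahead)
print(f"Projected peak demand in {months_ahead} months: ~{forecast:.0f} Gbps")
```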

Retailers use AI for personalized shopping experiences and customized recommendations.

How Long Might it Take for Private 4G and 5G to Demonstrate Value?

As tantalizing as projected productivity advances using 5G and edge computing might be, it is worthwhile to recall that past eras of automation and computerization often have failed to move the needle when it comes to measurable productivity gains.

Nokia points out that the era of applying information and communications technology to manufacturing managed to produce productivity gains of less than one percent. The implication is that advanced 4G and 5G private networks will produce greater gains, across a range of use cases including video surveillance and analytics, immersive experiences, smart stadiums, eHealth, machine remote control, cloud robotics and process automation. 


It has been hard to quantify the benefits of computerization, digitalization or applied communications and information technology for many decades. The productivity paradox suggests that information technology or communications investments do not always immediately translate into effective productivity results. 


This productivity paradox was apparent for much of the 1980s and 1990s, when one might have struggled to identify clear evidence of productivity gains from a rather massive investment in information technology.

Some would say the uncertainty covers a wider span of time, dating back to the 1970s and including even the “Internet” years from 2000 to the present.

The point is that it has in the past taken as long as 15 years for technology investments to produce measurable gains.

Computing power in the U.S. economy increased by more than two orders of magnitude between 1970 and 1990, for example, yet productivity, especially in the service sector, stagnated.

And though it seems counter-intuitive, even the Internet has not clearly affected economy-wide productivity. Some might argue that is because we are not measuring properly. It is hard to assign a value to activities that have no incremental cost, such as listening to a streamed song instead of buying a compact disc. It might also be argued that benefits accrue, but only over longer periods of time.

The fourth industrial revolution could well take quite some time to produce measurable benefits as well. Of course, we might be wrong about the productivity paradox. The fourth industrial revolution might achieve something we have not seen before. Still, history suggests we would do well to temper expectations. 

Even broadband impact on productivity is a matter of debate, for example. So enterprise managers might want to be patient. It might well take 15 years before the quantifiable payoff from the “fourth industrial revolution” is obvious. 

As with so much other technology investment, we are likely to be initially disappointed.

Can We Build IoT Devices Without Batteries? And What Would That Mean?

Can IoT, Wi-Fi, low power wide area or maybe someday mobile IoT devices operate with such low battery draw that battery life literally can reach a decade? And how can that be done? Perhaps even more futuristically, can IoT devices be designed to operate without any power sources of their own?

The issue is that internet of things devices might often have to work at remote locations, over wide areas that are inaccessible or at best inconvenient to reach.

In other cases it is not so much sensor site accessibility as the sheer number of devices that must be supported, which makes the chore of replacing batteries onerous and expensive.

Are batteryless IoT devices--able to operate and communicate without any internal batteries--possible? 

To state the issue as starkly as possible, can IoT networks and other devices literally extract power they need from “thin air,” without having any power sources of their own? For example, can a sensor on a leaf, on a farm, be operated without its own power supply?


Or could remote sensors actually communicate using unmanned aerial vehicles, over wide areas, without having their own power sources? 


To state the matter in a way that is more likely to happen as a commercial reality, can IoT, Wi-Fi, low power wide area or maybe someday mobile IoT devices operate with such low battery draw that battery life literally can reach a decade?

And what sorts of important and useful applications could be created if, in fact, low-cost and long-lived devices could be scattered about, needing a battery change only once a decade, if that?

Electrical engineers and researchers have insisted for some years that it can be done. Using ambient backscatter, devices can turn existing wireless signals into both a source of power and a communication medium.

Backscattering enables two battery-free devices to communicate by backscattering existing wireless signals. Backscatter communication is orders of magnitude more power-efficient than traditional radio communication. Further, since it leverages the ambient RF signals that are already around us, it does not require a dedicated power infrastructure as in RFID.
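One highly simplified way to picture the idea, as a toy simulation rather than anything resembling a real chip: the tag never generates a carrier of its own; it simply switches its antenna between reflecting and absorbing an ambient signal, and a receiver recovers the bits from that on-off reflection pattern.

```python
import numpy as np

# Conceptual sketch of backscatter signaling: the battery-free tag encodes bits
# by switching its antenna between "reflect" and "absorb" states, modulating an
# ambient carrier it did not generate. Purely illustrative, not a chip design.
rng = np.random.default_rng(0)

bits = [1, 0, 1, 1, 0, 0, 1, 0]              # data the tag wants to send
samples_per_bit = 100

t = np.arange(len(bits) * samples_per_bit)
carrier = np.cos(2 * np.pi * 0.05 * t)       # ambient RF signal already in the air

reflection = np.repeat(bits, samples_per_bit).astype(float)   # 1 = reflect, 0 = absorb
received = reflection * carrier + 0.1 * rng.standard_normal(t.size)

# Receiver: simple energy detection over each bit period recovers the data
recovered = [
    int(np.mean(np.abs(received[i * samples_per_bit:(i + 1) * samples_per_bit])) > 0.3)
    for i in range(len(bits))
]
print("sent:     ", bits)
print("recovered:", recovered)
```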

A backscatter system known as HitchHike was demonstrated in 2016, for example, by a Stanford University research team led by Sachin Katti, an associate professor of electrical engineering and of computer science, and Pengyu Zhang, then a postdoctoral researcher in Katti’s lab.

In one of the latest moves, electrical engineers at the University of California San Diego have developed an ultra-low-power Wi-Fi radio, integrated into a small chip for internet of things devices, that uses backscattering.


No bigger than a grain of rice, the device consumes only 28 microwatts of power, about 5,000 times less than a standard Wi-Fi radio. At the same time, the chip can transmit data as far as 21 meters at a rate of 2 Mbps, enough for decent-quality video.
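The power claim is easy to sanity-check; the “standard radio” figure below is simply what the stated 5,000-to-one ratio implies, not a measured value:

```python
# Back-of-the-envelope check on the reported power figures
chip_power_w = 28e-6        # 28 microwatts, as reported
ratio = 5_000               # "about 5,000 times less than a standard Wi-Fi radio"

implied_standard_radio_w = chip_power_w * ratio
print(f"Implied conventional Wi-Fi radio power: {implied_standard_radio_w * 1e3:.0f} mW")
# ~140 mW, a plausible order of magnitude for a conventional Wi-Fi transceiver
```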

The invention is based on a technique called backscattering. The transmitter does not generate its own signal; instead, it takes incoming signals from nearby devices (like a smartphone) or a Wi-Fi access point, modifies those signals to encode its own data onto them, and then reflects the new signals onto a different Wi-Fi channel to another device or access point. This approach requires much less energy and gives electronics manufacturers much more flexibility.



Friday, February 21, 2020

New Platforms Will Help, But Rural-Urban Divide Still Will Exist

Though some will criticize the changes, a new Federal Communications Commission look at where internet access was in 2018 suggests coverage and speeds available to underserved citizens are improving.

Such changes never will be fast enough to satisfy some, and it might never be possible to completely eliminate the urban-rural difference in number of suppliers, typical speeds or absolute speed. Nor is it possible to dismiss progress.

From December 2016 to December 2018, the number of U.S. residents without any options for at least 250/25 Mbps fixed terrestrial broadband service dropped by 74 percent, from 181.7 million to 47 million, the FCC notes.

The number of residents with no options for at least 25/3 Mbps fixed terrestrial broadband service fell by 30 percent, from 26.1 million to 18.3 million.  

The data also showed an increase in competition from December 2016 to December 2018, with the number of residents enjoying more than two options for 25/3 Mbps fixed terrestrial broadband service increasing by 52 percent, from 45.9 million to 69.8 million.

Moreover, the number of rural residents with two or more options for 25/3 Mbps fixed terrestrial broadband service increased by 52 percent, from 14.4 million to 22 million.  
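Those percentage changes follow directly from the raw counts, as a quick arithmetic check shows:

```python
# Quick check of the percentage changes implied by the FCC counts (millions)
def pct_change(before, after):
    return (after - before) / before * 100

print(f"{pct_change(181.7, 47):.1f}%")   # about -74%: residents with no 250/25 Mbps option
print(f"{pct_change(26.1, 18.3):.1f}%")  # about -30%: residents with no 25/3 Mbps option
print(f"{pct_change(45.9, 69.8):.1f}%")  # about +52%: residents with more than two 25/3 Mbps options
print(f"{pct_change(14.4, 22):.1f}%")    # about +52.8%: rural residents with two or more options
                                         # (cited as 52 percent; the inputs are rounded)
```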

New platforms are coming, though, including low earth orbit satellite constellations, fixed wireless, and much more unlicensed and shared spectrum. 

The new HAPS Alliance--including SoftBank’s HAPSMobile, Alphabet's Loon, AeroVironment, Airbus Defence and Space, Bharti Airtel, China Telecom, Deutsche Telekom, Ericsson, Nokia, SoftBank and Telefónica--aims to promote the use of high-altitude vehicles for internet access.

Considered in light of the emergence of low earth orbit satellite constellations, 5G fixed wireless and the use of growing amounts of unlicensed and shared spectrum, HAPS illustrates the growing potential of wireless access platforms for internet access, especially in areas where traditional platforms cannot generate a reasonable financial return.

Note the involvement of major app developers, mobile service providers, traditional telecom infrastructure providers and aerospace firms. As with the Telecom InfraProject, new and incumbent entities are working together to reduce the costs of communications infrastructure.  

Among the immediate activities is “global harmonization of HAPS spectrum, including the adoption, improvement and acceleration of global spectrum standardization for High Altitude IMT Base Stations within the International Telecommunications Union (ITU),” the group says. 

Some early indications are that--as always in the past--the usable bandwidth of HAPS platforms will be substantially less than that available on terrestrial networks. Some early work by the International Telecommunication Union suggests data rates per user will depend on the number of simultaneous users.


A single HAPS platform might have a total throughput of less than 5 Gbps, supporting 2,000 simultaneous users, or perhaps as much as 30 Gbps, supporting 12,800 simultaneous users.

That works out to about 2.38 Mbps per user when there are 2,000 simultaneous users. HAPS units transmitting over wider areas, where less total capacity is usable, might only be able to support a couple hundred kilobits per second when 12,800 simultaneous users are active.
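The per-user figures are simple division of shared platform capacity. The first line below reproduces the 2.38 Mbps figure; the second uses an assumed, much lower wide-area beam capacity (2.56 Gbps, purely for illustration) to show how a couple hundred kilobits per second per user could result:

```python
# Per-user throughput implied by shared HAPS platform capacity
def per_user_mbps(total_gbps, users):
    return total_gbps * 1_000 / users    # convert Gbps to Mbps, shared evenly

print(f"{per_user_mbps(4.76, 2_000):.2f} Mbps")   # ~2.38 Mbps, the figure cited above
print(f"{per_user_mbps(2.56, 12_800):.2f} Mbps")  # ~0.20 Mbps: an assumed, much lower
                                                  # total capacity over a wide-area beam
```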

Some early suggestions for dedicated HAPS bandwidth use the 6.4 GHz to 6.6 GHz band. In the millimeter ranges a few different bands might be used, including 28 GHz, 31 GHz, 47 GHz or 48 GHz. 

Some bandwidth always is preferable to no bandwidth, and that is the primary appeal of the emergence of untethered access platforms. In perhaps a minority of cases, untethered bandwidth might approach that of cabled network alternatives. That might occur most often when millimeter wave spectrum is used in a small cell deployment to boost capacity in a dense urban location.

Most of the time, untethered bandwidth supplied by some wireless platform will provide access where it is not otherwise possible, but also not at speeds that approach cabled network performance. 

One way of illustrating the dramatic impact of coming shared, unlicensed and millimeter wave spectrum is to compare those new sources with all existing mobile spectrum.



Thursday, February 20, 2020

"Rural Areas Underserved" is Evergreen, but Might Not Always be a Problem

“Rural areas underserved by broadband networks” is an evergreen story: it never goes away. No matter how much improvement there is in the coverage, speed of networks or cost, the continual network improvements in urban areas always will mean there is a gap between rural and urban networks. 


Consider this graph of user density in the United Kingdom, which shows that 90 percent of U.K. users live on just 40 percent of the land area, with 60 percent of people living on just 10 percent of the land surface. Since the per-location cost of a terrestrial cabled network is inversely related to density, networks will cost least where density is highest and most where density is lowest.


Population density is even more skewed in the U.S. market, where about 63 percent of people live on just 3.5 percent of the land area, according to the U.S. Census Bureau. Most of the access network cost problem lies in the last couple of percent of U.S. locations. 


The good news might be that the amount of bandwidth available in rural areas may soon be good enough to support a reasonable customer experience for virtually all apps, even if a gap with urban areas persists. The other issue is demand: even when it is available, many rural residents do not see the need for broadband or faster internet speeds.


A survey of 194 small telcos that are members of NTCA, the rural broadband association, is instructive. You might be surprised to learn that 23 percent of all connections made available by these rural service providers in 2018 supported speeds of at least 1,000 Mbps.


Another 34 percent of connections offered speeds from 100 Mbps to about 999 Mbps. In other words, 57 percent of available connections operated at 100 Mbps or faster.

As typically is the case, that does not mean most customers buy the fastest services. They do not. Actual purchases cluster in tiers with minimum speeds between 4 Mbps and 100 Mbps.

Wednesday, February 19, 2020

Maybe 3 Mobile Service Providers Really are Enough to Produce Consumer Welfare Gains

A study by GSMA Intelligence might be useful in terms of looking at firm sustainability and competition after the merger of T-Mobile US and Sprint. A key argument in the debate over the merger was its impact on competition and consumer welfare (prices being the measure).


On one hand, some proponents argued that the merger would allow T-Mobile to compete and invest on more equal terms with AT&T and Verizon.


Opponents argued that consumer welfare would essentially suffer, as the lessened competition would lead to less pricing pressure. It might be noted that virtually every equity analyst I have encountered believed the merger would lead to less pricing pressure in the U.S. mobile market. 


But the GSMA Intelligence data are mixed. Since 2013, average mobile internet prices per megabyte in European markets have been essentially the same in both four-provider and three-provider markets.


In other words, it is not clear that three-provider markets lead to higher prices, or that four-provider markets lead to lower prices. Also, it is not clear that competition, per se, has had the largest impact on falling prices. The GSMA study shows that per-megabyte prices were higher in the four-provider countries to begin with. In 2012, the three-provider markets had lower per-megabyte prices. 




The possibility therefore exists that something other than the number of competitors accounts for average prices. 


Likewise, average revenue per user fell by nearly the same amount between 2011 and 2018, whether there were three or four competitors.


Also, there seems to be very little difference in bandwidth increases between markets served by three or four contestants.


The GSMA Intelligence study suggests it is likely that profit margins will be higher for the three remaining national leaders, as profit margins in European “three-provider” markets have been consistently higher than in “four-provider” markets. That is what one would expect. 


Perhaps some of these same trends will be seen in the U.S. market, once the T-Mobile merger with Sprint has had time to be consolidated.

Huge Market Share Loss for Incumbents is the Actual Intent of Deregulation

Of all changes--large and small--that have occurred in the telecom business since deregulation and the emergence of the internet, none are perhaps more striking than the lost market share incumbents have experienced. And, to be sure, that is the expected outcome of telecom deregulation.

Looking only at internet access, and only at the U.S. market, consider that incumbent telcos have fractional market share in the fixed network segment of the business. For 20 years, incumbent telcos have steadily lost market share to cable companies. AT&T and Verizon, for example, have only about 23 percent share, according to Mobile Experts. 

According to Leichtman Research, cable companies have 67 percent of the installed base, telcos about 33 percent share. And, paradoxically perhaps, the business case for telcos to deploy additional fiber to the home gets more challenging, even as the cost of building FTTH arguably has fallen. 

The reason is that where FTTH builds by telcos once could rely on substantial revenue potential from a combination of voice, video entertainment and internet access, dwindling demand for fixed network voice and linear video entertainment means the revenue case increasingly is built on internet access share gains.

Fixed network voice lines, for example, fell by half between 2000 and 2020, even as households grew 24 percent and employee counts grew 22 percent, both of which represent increased demand for communication services, if not wired voice in particular. 

Demand for linear video services also is diminishing, though slowly. The bigger issue arguably is profit on such services, rather than gross revenue. Where video cost of goods once was on the order of 48 percent to 50 percent for the largest service providers, cost of goods in 2019 is probably closer to 60 percent of gross revenue. 

If all other direct costs (marketing, operations) are only 10 percent of revenue, that implies gross margins of up to 40 percent in the past, now perhaps 30 percent or lower. 
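Using the rough percentages above, the margin arithmetic looks like this (illustrative only, not reported financials):

```python
# Linear video gross margin, using the approximate cost percentages above
def gross_margin_pct(content_cost_pct, other_direct_cost_pct=10):
    return 100 - content_cost_pct - other_direct_cost_pct

print(gross_margin_pct(48))  # ~42%: content cost at the low end of the historical range
print(gross_margin_pct(50))  # ~40%: content cost at the high end of the historical range
print(gross_margin_pct(60))  # ~30%: content cost closer to 60% of revenue in 2019
```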

Small telcos and cable TV companies might have next to zero or negative margins on linear video in 2020.

In the mobile segment of the business, incumbents still rule the day. But the cable companies, Dish Network and other potential competitors are coming.

AI Physical Interfaces Not as Important as Virtual

Microsoft’s dedicated AI key on some keyboards--which opens up access to Microsoft’s Copilot--now is joined by Logitech’s Signature AI mouse...