Tuesday, December 17, 2019

Telecom Customer Satisfaction Might Not be as Bad as Some Think

Customer satisfaction with telecom services often measures lower than in other industries. But that does not necessarily mean customer satisfaction with telecom services is “poor.” To be sure, one net promoter score survey scored video subscriptions at “zero,” on average. That sounds terrible, but net promoter scores, a measure of customer satisfaction based on willingness to recommend a firm, are measured on a scale of minus 100 to positive 100.

So a score of zero is actually right at the edge of a “good” score. Many consider any score up to 30 to be in the “good” category. That is not to say connectivity services generally are considered “very good” by most consumers. Connectivity services typically do not have the highest industry scores. But a “good” score is not a bad thing.
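For readers unfamiliar with the metric, NPS is conventionally computed from 0-to-10 “how likely are you to recommend us?” responses: the percentage of promoters (scores of 9 or 10) minus the percentage of detractors (scores of 0 through 6). A minimal sketch (the function name and sample data are illustrative, not from any survey cited here):

```python
def net_promoter_score(ratings):
    """Compute NPS from 0-10 'would you recommend?' ratings.

    Promoters rate 9-10, detractors 0-6; NPS is the percentage of
    promoters minus the percentage of detractors, so it ranges
    from -100 to +100.
    """
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# A score of zero means promoters and detractors exactly balance:
sample = [10, 9, 3, 5, 8, 7]  # 2 promoters, 2 detractors, 2 passives
print(net_promoter_score(sample))  # -> 0.0
```

This is why a zero is not “terrible”: it means recommenders and critics cancel out, not that everyone is dissatisfied.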

Also, there is some evidence that connectivity suppliers naturally spend more effort on their larger--and higher revenue generating--customers. One survey of NPS by Analysys Mason shows precisely that trend. Larger firms--assumed to be larger accounts--are more satisfied than small businesses, for example. 

Though mobile service provider satisfaction tends to be higher than that of fixed services, the same pattern holds: bigger organizations report higher NPS. 

Such low scores do not appear unusual. A 2018 analysis actually produced negative average net promoter scores for video subscription services, while mobile service providers averaged a 22 score.


Monday, December 16, 2019

AI Performance Now Exceeds Moore's Law Rates

Prior to 2012, artificial intelligence improvement closely tracked Moore’s Law, with compute doubling every two years. Post-2012, compute has been doubling every 3.4 months, according to the Artificial Intelligence Index Report 2019.  

The amount of computation used in the largest AI training runs has doubled every 3.4 months since 2012 (net increase of 300,000 times). The y-axis of the chart shows the total amount of compute, in petaflop/s-days, used to train selected results. 

A petaflop/s-day (pf-day) consists of performing 10^15 neural net operations per second for one day, or a total of about 10^20 operations.
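As a back-of-envelope check (my own arithmetic, not figures from the report), the petaflop/s-day definition and the number of 3.4-month doublings implied by a 300,000-fold compute increase work out as follows:

```python
import math

# One petaflop/s-day: 10^15 operations per second, sustained for 24 hours.
ops_per_pf_day = 1e15 * 24 * 3600
print(f"{ops_per_pf_day:.2e}")  # -> 8.64e+19, i.e. roughly 10^20 operations

# Doublings implied by a net 300,000x increase in training compute,
# at one doubling every 3.4 months:
doublings = math.log2(300_000)
months = doublings * 3.4
print(f"{doublings:.1f} doublings, about {months / 12:.1f} years")
```

The roughly 18 doublings fit in a bit over five years, consistent with the post-2012 window the report describes.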



Early 5G is More Supplier Push Than End User Pull

We sometimes forget that multiple drivers can exist whenever a mobile operator deploys a next-generation network, and consumer "need" or "demand" is only one of those drivers. One value of 4G was that it lowered the cost per bit of mobile data. The same will be true of 5G.

One advantage of 4G small cells is the ability to supply more capacity at usage hotspots. 5G small cells are going to do the same. Again, the advantage only indirectly accrues to end users. The direct advantage is gained by the service provider.

Nor is it clear that 5G ultra-low latency actually directly leads to improved end user experience. Without additional investments in edge computing, app providers cannot actually change end-to-end experience, so end users cannot experience the changes.

Much early 5G adoption will be supplier push, not end user pull, in other words.

International Data Corporation projects the number of 5G connections to grow from roughly 10.0 million in 2019 to 1.01 billion in 2023, a compound annual growth rate of 217.2 percent over the 2019 to 2023 forecast period. By 2023, IDC expects 5G will represent 8.9 percent of all mobile device connections.
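IDC's compound annual growth rate can be sanity-checked from the endpoint figures (a rough reproduction; IDC's exact inputs may differ slightly, which would account for the small gap from the stated 217.2 percent):

```python
# CAGR from roughly 10.0 million 5G connections (2019) to 1.01 billion (2023):
start, end, years = 10.0e6, 1.01e9, 4
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # -> about 217%, matching the forecast's order of magnitude
```

The lesson is simply that triple-digit CAGRs are an artifact of a near-zero starting base, not evidence of unusual end-state scale.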

At this point, with the exception of 5G fixed wireless, which provides perhaps the clearest example of demand-pull, much of the 5G adoption will likely be demand-push. In the former case, customers drive adoption because they see some value in doing so. In the latter case, service providers convince consumers to buy 5G because it is in the service provider’s best interest to do so, irrespective of existing consumer demand. 

In large part, that is because many of the “5G” end user benefits actually hinge on simultaneous availability of other capabilities as well, ranging from edge computing to network slicing to artificial intelligence to internet of things use cases. 

It is one thing to tout the near-zero latency of the 5G access network. But that is not the same thing as end-to-end application latency. How much advantage 5G network speeds will provide also is somewhat conditional. 

On low-band networks there will be some improvement over 4G performance, but nothing like the order of magnitude gains from use of millimeter wave spectrum. But even any improvement in speeds must be evaluated against actual end user experience changes. In many, if not most cases, the additional speed, itself, will not lead to markedly-improved experience, except for big file downloads or some virtual reality or augmented reality use cases. 

IDC analysts sort of hint at this. Of the forces driving 5G adoption over the next several years, 5G will be adopted because “shifting data-intensive users and use cases to 5G will allow network operators to more efficiently manage network resources, improving performance and reliability as a result,” IDC says.

The argument is not that users benefit, but that service providers benefit. 

Internet of things provides another example. “The need to support millions of connected endpoints at the same time will become increasingly critical,” IDC says. “5G's densification advantage will be key for mobile network operators in providing reliable network performance.”

Again, mobile network operators gain, not end users directly. 

Low latency and higher speed likewise are said to potentially benefit enterprises. “Many of these use cases will come from businesses looking to leverage 5G's technological advantages in their edge computing, artificial intelligence, and cloud services initiatives,” IDC says.

Sunday, December 15, 2019

5G Probably Benefits Mobile Operators More than Consumers, at First

Some complain that 5G is not being introduced fast enough in the U.S. market. But phased 5G service coverage is not the problem some believe it to be. Building one new continent-sized network always takes years. Building four simultaneously is harder. That is true of all next-generation mobile networks. It has been a decade since people lived through such a change--and some never have--so we tend to forget that.

Still, compared to the introduction of 4G, phased deployment might not matter so much, for reasons of end user experience, service provider economics and ecosystem dynamics. We tend to forget that major device suppliers, such as Apple, historically lag the networks. 

The first Apple iPhone launched using 2G. A 3G iPhone was not launched until 2008, the year 4G launched in the U.S. market. A 4G iPhone was not available until 2013, about five years after 4G first appeared. It appears the iPhone 12, expected in 2020, might feature both 4G and 5G models.

The point is that the most-rapid-possible adoption is not always meaningful, and a slower rollout is not necessarily harmful.

That might be especially true in the 5G era, when many so-called 5G impacts actually are enabled by edge computing, internet of things, artificial intelligence or other correlated developments, not 5G itself. 

Are there network effects? Yes. Whenever an app, a process or a service or a product requires scale to provide value, it is said to be an example of network effects. Phone service, use of facsimile machines, social networks and online marketplaces of all sorts provide examples. 

Some point out that 5G is no different: 5G phones are most valuable when there is 5G service available, and 5G networks make 5G phone purchases more valuable. It’s a bit of a nuance, but what is not true is that 5G phones make mobile networks valuable, or that 5G networks make devices valuable. 

A 5G device can use a 4G network, and vice versa. So with the exception of any device features specifically related to 5G (access, mostly), all the other device value is obtainable, even when a user decides not to use--or cannot use--a 5G network. 

The point is that the speed of the 5G rollout might not matter to consumers so much. It will matter for mobile network operators, who can gain subscribers or lose them, based on the completeness and availability of their 5G offers. 

5G also will matter for mobile service providers with limited spectrum resources, as 5G will allow customers to shift usage to a new network featuring lots of bandwidth (millimeter wave, especially) and little contention, at the moment. As each user switches off 4G and on to 5G, experience for all the remaining 4G users improves, as there is less network loading. 

5G ubiquity also matters for developers of 5G use cases, apps and services, since scale really does matter there. And 5G device manufacturers also benefit from greater demand when customers generally can use their devices on the 5G networks.

There, the actual performance advantages arguably are less important than the customer knowing they can use a network they are paying for (in terms of a new device purchase and possibly a new service plan).

Faster networks are “better” than slower networks, generally speaking. But that is conditional. If the user cannot benefit from the additional speed, does faster speed really matter? 

It matters most for service providers, who have to keep increasing capacity, but must realistically expect to do so for roughly the same prices as at present, best case. Lower cost per bit, in other words, is the key value, but for service providers, not consumers, directly. 

Thursday, December 12, 2019

AT&T Expects to Reach 50% Internet Access Market Share in FTTH Areas

AT&T believes it will eventually get take rates for its fiber-to-home service, across 14.5 million households, “to a 50 percent mark over the three-year period” from activation, said Jeffery McElfresh, AT&T Communications CEO.

AT&T bases that forecast on past experience. “As you look at the fiber that we built out in the ground in 2016, at the three-year mark, we roughly approach about a 50 percent share gain in that territory,” said McElfresh.

Adoption levels at that level would be historically high for an incumbent telco, as Verizon has in the past gotten FiOS adoption in the 40-percent range after about three years. 

Take rates for FTTH services globally vary quite a lot, and may be an artifact of network coverage. U.S. FTTH accounts are low, but mostly because perhaps 66 percent of fixed network accounts are on cable TV hybrid fiber coax plant. Telcos collectively have only about 33 percent market share. 

South Korea seems an odd case: FTTH take rates seem to be only about 10 percent, for example, though network coverage is about 99 percent.

In Japan and New Zealand, take rates have reached the mid-40-percent range, and network coverage might be about 90 percent. But in France and the United Kingdom, FTTH adoption is in the low-20-percent range. 

That is why AT&T’s expectation that its FTTH adoption will reach 50 percent is important. It would reverse the traditional market share of a telco in the fixed network internet access market from 33 percent of customers to perhaps half.

Will Ruinous Competition Return?

It has been some time since many contestants in telecom had to worry about the effects of ruinous levels of competition, which were obvious and widespread in parts of the telecom market very early in the 21st century.

Sometimes markets endure what might be termed excessive or ruinous competition, where no company in a sector is profitable.

That arguably is the case for India, where industry revenues dropped seven percent last year, following years of such results, leading regulators to consider instituting minimum prices as a way of boosting profits.

Such situations are not new, as early developments in the railroad industry suggest. In fact, sometimes competitors might price in predatory fashion, deliberately selling below cost in an effort to drive other competitors out of business. That sort of behavior often is prohibited by law, and can trigger antitrust action. 

Even if technology has changed network costs and economics, allowing sustained competition between firms of equal size, the unanswered question for competitive markets has been the possible outcomes of ruinous levels of competition. 

Stable market structures often have market shares that are quite unequal, which prevents firms from launching ruinous pricing attacks. 

A ratio of 2:1 in market share between any two competitors seems to be the equilibrium point at which it is neither practical nor advantageous for either competitor to increase or decrease share. 

A market with three roughly equally-situated contestants means there always will be a temptation to launch disruptive attacks, especially if one of the three has such a strategy already. 

Some studies suggest a stable market of three firms features a market share pattern of approximately 4:2:1, where each contestant has double the market share of the following contestant. 

The hypothetical stable market structure is one where market shares are unequal enough, and the leader financially strong enough, to weather any disruptive attack by the number-two or number-three providers. That oligopolistic structure is stable, yet arguably provides competitive benefits.

In a classic oligopolistic market, one might expect to see an “ideal” (normative) structure something like:

Oligopoly market share of sales:
Number one: 41%
Number two: 31%
Number three: 16%

As a theoretical rule, one might argue, an oligopolistic market with three leading providers will tend to be stable when market shares follow a general pattern of 40 percent, 30 percent, 20 percent market shares held by three contestants.

Another unanswered question is the minimum possible competitive market structure, where consumer benefits still are obtained but firms also can sustain themselves. Regulators have grappled with the answer largely in terms of the minimum number of viable competitors in mobile markets, the widespread thinking being that only a single facilities-based fixed network operator is possible in many countries. 

In a minority of countries, it has seemed possible for at least two fixed network suppliers to operate at scale, on a sustainable basis. 

The point is that sustainable competition in the facilities-based parts of the telecom business is likely to take an oligopolistic shape over the long term. That is likely the best outcome, providing sustainable competition and consumer benefits without ruinous levels of competition.

4K and 5G Face One Similar Problem

4K and 5G face one similar problem: performance advantages do not always enable better experience that end users and customers can clearly perceive. It is not a new problem.

“Speeds and feeds,” originally a measurement of machine tool performance, long has been used in the computing industry as well, touting technical features and performance of processors or networks.

Marketing based on speeds and feeds fell out of favor, however, in part because every supplier was using the same systems and chips, negating the value of such claims. Also, at some point, the rate of improvement slowed, and it also became harder to show how the better performance was reflected in actual experience. 

We are likely to see something similar where it comes to the ability of apps, devices or networks to support very-high resolution video such as 4K. Likewise, much-faster mobile and fixed networks face the same problem: the technological advances do not lead to experience advantages. 

4K video on small screens has been characterized as offering visual and experience differences somewhere between indistinguishable and non-existent. The reason is the visual acuity of the human eye. Beyond some point, at some distance from any screen, the eye cannot resolve the greater granularity of picture elements. In other words, you cannot see the difference. 

Even for younger adults (20s and 30s) with better eyesight than older people, the difference between 2K resolution and 4K on a phone is barely perceptible, if perceivable at all, one study found.

On huge screens, relatively close to where an observer is located, the greater resolution does make a difference. Conversely, on small screens or beyond a certain distance, the eye cannot distinguish between 4K and 1080 HDTV. 

Also, battery life and processor overhead are reasons--aside from visual clarity--why 4K on a smartphone might arguably be worse than 1080p resolution. If 4K requires more energy, and right now it does, then battery consumption rate is a negative.

Granted, it is possible, perhaps even likely, that 4K will prove an advantage for virtual reality or augmented reality applications. Eyes are very close to screens on VR headsets. That likely will be true for 360-degree VR video as well.

But in most other cases, smartphones with 4K displays will not yield an advantage humans can see. 

Something like that also will happen with 5G. People sometimes tout the advantage of 5G for video streaming. But streaming services such as Netflix require, by some estimates, only about 5 Mbps to 8 Mbps.

True, Netflix recommends speeds of 25 Mbps for 4K content, so in some cases, 5G might well provide a better experience than 4G. But Amazon Prime says 15 Mbps for 4K content is sufficient. 

And if viewers really cannot tell the difference between 1080 resolution and 4K, then 8 Mbps is quite sufficient for viewing streamed content at high-definition quality. In fact, usage allowances are far more important than bandwidth, for most purposes. 
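The usage-allowance point is easy to quantify: at streaming bitrates, hours of viewing translate directly into gigabytes consumed. A quick sketch (my own arithmetic, using decimal gigabytes as ISPs typically meter them):

```python
def gb_per_hour(mbps):
    """Gigabytes consumed per hour of continuous streaming at a given Mbps.

    Converts megabits per second to bits per hour, then to bytes,
    then to decimal gigabytes (10^9 bytes).
    """
    return mbps * 1e6 * 3600 / 8 / 1e9

print(gb_per_hour(8))   # HD stream: 3.6 GB per hour
print(gb_per_hour(25))  # 4K stream: 11.25 GB per hour
```

At 4K rates, a few hours of nightly viewing consumes roughly a terabyte a month, which is why the allowance, not the access speed, is the binding constraint for most households.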


Some internet service providers also point out that a connection running at 25 Mbps downstream, and 12.5 Mbps upstream, outperforms a connection offering 100 Mbps downstream and 10 Mbps upstream. 

The larger point is that some technological innovations, including 4K video and 5G networks, might not have as much impact on user experience as one might suppose, although some future use cases might well be different.

One View on Why Video Resolution Beyond 4K is Useless

Wednesday, December 11, 2019

2% of U.S. Households Buy Gigabit Internet Access

The overall percentage of U.S. fixed network internet access subscribers buying gigabit-speed service increased 25 percent, to 2.5 percent in the third quarter of 2019. In 2018 about two percent of U.S. households bought gigabit internet access service, according to Openvault. 

Other estimates peg gigabit take rates at about six percent. 

About 51 percent of U.S. fixed network internet access customers now buy service at 100 Mbps or higher. 

Some 35 percent buy service rated at 100 Mbps to 150 Mbps. About 27 percent buy service running between 50 Mbps and 75 Mbps.

The percentage of U.S. homes able to buy gigabit service is at least 80 percent, as that is the percentage cable TV alone reaches, according to the NCTA. 

Average U.S. Fixed Network Internet Consumption Now 275 GB Per Month

In the third quarter of 2019 the average household--including both customers on unlimited and fixed usage plans--consumed about 275 gigabytes each month, up about 21 percent year over year from the third quarter of 2018. 

The weighted average data usage includes subscribers on both flat rate billing (FRB) and usage-based billing (UBB). Not surprisingly, customers on unlimited flat-rate accounts consumed more data than customers on usage-based plans. 

There are obvious implications for internet service providers, namely the usage growth rate of about 20 percent a year. That implies a doubling of consumption in less than four years. 
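The doubling claim follows from standard growth-rate arithmetic (my own calculation, not Openvault's):

```python
import math

def doubling_time(annual_growth):
    """Years for a quantity to double at a steady annual growth rate."""
    return math.log(2) / math.log(1 + annual_growth)

print(f"{doubling_time(0.21):.1f} years")  # at 21% per year -> 3.6 years
print(f"{doubling_time(0.20):.1f} years")  # at 20% per year -> 3.8 years
```

Either way, consumption doubles in under four years, which is the planning horizon access providers must build against.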

That usage profile also suggests the usage allowances suppliers of fixed wireless services also must match. 

In comparison, U.S. mobile users might consume between 4 GB per month and 6 GB per month on the mobile network. 

Tuesday, December 10, 2019

Telcos Will "Eat Their Own Dogfood" with Edge Computing to Support Their Virtualized Networks

Whether edge computing as a service becomes a significant revenue stream for connectivity providers remains to be seen. The recent announcement of Amazon Web Service Wavelengths restricts the telco role to supplier of rack space, for example.

But telcos will use significant amounts of edge computing to support their own virtualized networks. 

Telecom edge computing refers to computing performed by small data centers located as close to the customer as possible, owned and operated by a telco, and on telco-owned property. One definition might be computing no further than 30 miles from any end user location. 

Metro edge computing might occur more than 30 miles, and up to 100 miles, from any end user location. Those two categories, plus computing happening on a user device, on an enterprise campus or inside a building, are the main types of edge computing.

Some have estimated edge computing revenues of perhaps US$21 billion in 2020, up more than 100 percent from 2019; the market is poised to grow more than 50 percent in 2021 as well, Deloitte reports.



Irrespective of any efforts to host or otherwise supply edge computing as a service, telcos and mobile operators are among the biggest current users of edge computing to support their own internal operations. 

“The largest demand for edge computing currently comes from communication network operators as they virtualize their wireline and wireless network infrastructure and accelerate their network upgrades, including 5G,” say the authors of the State of the Edge report.

“Most edge investments today are for virtualizing wireline infrastructure, including SD-WAN equipment, core network routing and switching equipment and data gateways,” the report states. 

“Network function virtualization (NFV) and software defined networking (SDN) solutions that underpin next-generation technologies like 5G are being implemented on edge platforms that CNOs are deploying across their networks,” the report says.

In other words, 5G and similar upgrades to the wireline networks will require edge computing for network virtualization and automation, as well as to enable new services. 

This will drive investment in edge data centers to support their own operations. 

The global power footprint of edge computing equipment for CNOs is forecast to increase from 231 megawatts to 7,383 megawatts between 2019 and 2028.
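Those endpoints imply a steep compound growth rate (my own arithmetic, not a figure from the forecast):

```python
# Implied compound annual growth of the CNO edge power footprint, 2019-2028:
start_mw, end_mw, years = 231, 7383, 9
growth = (end_mw / start_mw) ** (1 / years) - 1
print(f"{growth:.0%}")  # -> roughly 47% per year
```

That is a roughly 32-fold expansion over nine years, sustained at close to 50 percent annual growth.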

In 2019, communications service provider deployments will represent 22 percent of the global edge footprint, declining to perhaps 10 percent of total investment as other third party uses proliferate. 

In 2019, 94 percent of the footprint will be in central office sites, with the remaining six percent in access and aggregation sites. Between 2019 and 2028 the aggregation edge footprint for CNOs is forecast to increase from five to 38 percent of the total footprint. 


On the Use and Misuse of Principles, Theorems and Concepts

When financial commentators compile lists of "potential black swans," they misunderstand the concept. As explained by Nassim Taleb ...