Thursday, December 12, 2019

Will Ruinous Competition Return?

It has been some time since many contestants in telecom had to worry about the effects of ruinous levels of competition, which were obvious and widespread in parts of the telecom market very early in the 21st century.

Sometimes markets endure what might be termed excessive or ruinous competition, where no company in a sector is profitable.

That arguably is the case for India, where industry revenues dropped seven percent last year, following years of such results, leading regulators to consider instituting minimum prices as a way of boosting profits. 

Such situations are not new, as early developments in the railroad industry suggest. In fact, sometimes competitors might price in predatory fashion, deliberately selling below cost in an effort to drive other competitors out of business. That sort of behavior often is prohibited by law, and can trigger antitrust action. 

Even if technology has changed network costs and economics, allowing sustained competition between firms of equal size, the unanswered question for competitive markets has been the possible outcomes of ruinous levels of competition. 

Stable market structures often have market shares that are quite unequal, which prevents firms from launching ruinous pricing attacks. 

A ratio of 2:1 in market share between any two competitors seems to be the equilibrium point at which it is neither practical nor advantageous for either competitor to increase or decrease share. 

A market with three roughly equally situated contestants means there always will be a temptation to launch disruptive attacks, especially if one of the three already has such a strategy. 

Some studies suggest a stable market of three firms features a market share pattern of approximately 4:2:1, where each contestant has double the market share of the following contestant. 

The hypothetical stable market structure is one where market shares are unequal enough, and the leader financially strong enough, to weather any disruptive attack by the number two or number three providers. That oligopolistic structure is stable, yet arguably provides competitive benefits. 

In a classic oligopolistic market, one might expect to see an “ideal” (normative) structure something like:

Oligopoly Market Share of Sales
Number one: 41%
Number two: 31%
Number three: 16%

As a theoretical rule, one might argue, an oligopolistic market with three leading providers will tend to be stable when market shares follow a general pattern of 40 percent, 30 percent and 20 percent held by the three contestants.

Another unanswered question is the minimum possible competitive market structure, where consumer benefits still are obtained but firms also can sustain themselves. Regulators have grappled with the answer largely in terms of the minimum number of viable competitors in mobile markets, the widespread thinking being that only a single facilities-based fixed network operator is possible in many countries. 

In a minority of countries, it has seemed possible for at least two fixed network suppliers to operate at scale, on a sustainable basis. 

The point is that sustainable competition in the facilities-based parts of the telecom business is likely to take an oligopolistic shape over the long term, and that is likely the best outcome, providing sustainable competition and consumer benefits without ruinous levels of competition.

4K and 5G Face One Similar Problem

4K and 5G face one similar problem: performance advantages do not always translate into better experiences that end users and customers can clearly perceive. It is not a new problem. 

“Speeds and feeds,” originally a measure of machine tool performance, long has been used in the computing industry as well, to tout technical features and performance of processors or networks. 

Marketing based on speeds and feeds fell out of favor, however, in part because every supplier was using the same systems and chips, negating the value of such claims. Also, at some point, the rate of improvement slowed, and it also became harder to show how the better performance was reflected in actual experience. 

We are likely to see something similar when it comes to the ability of apps, devices or networks to support very high-resolution video such as 4K. Likewise, much-faster mobile and fixed networks face the same problem: the technological advances do not necessarily lead to experience advantages. 

4K video on small screens has been characterized as offering visual and experience differences somewhere between indistinguishable and non-existent. The reason is the visual acuity of the human eye. Beyond some point, at some distance from any screen, the eye cannot resolve the greater granularity of picture elements. In other words, you cannot see the difference. 

Even for younger adults (in their 20s and 30s) with better eyesight than older people, the difference between 2K resolution and 4K on a phone is barely perceptible, if perceivable at all, one study found. 

On huge screens, relatively close to where an observer is located, the greater resolution does make a difference. Conversely, on small screens or beyond a certain distance, the eye cannot distinguish between 4K and 1080 HDTV. 
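To make the acuity argument concrete, here is a minimal sketch, assuming the common rule of thumb that a normal eye resolves detail down to roughly one arcminute, and assuming a hypothetical 6-inch, 16:9 phone screen. It estimates the viewing distance beyond which adjacent pixels blur together at each resolution; the screen size and the one-arcminute figure are illustrative assumptions, not measurements from the studies cited above.

```python
import math

ARCMINUTE_RAD = math.radians(1 / 60)  # ~1 arcminute: rough limit of normal visual acuity (assumption)

def min_distance_to_blend(diagonal_in, aspect=(16, 9), horizontal_px=3840):
    """Distance (inches) beyond which adjacent pixels can no longer be resolved."""
    w_ratio, h_ratio = aspect
    width_in = diagonal_in * w_ratio / math.hypot(w_ratio, h_ratio)
    pixel_pitch_in = width_in / horizontal_px
    return pixel_pitch_in / math.tan(ARCMINUTE_RAD)

# Hypothetical 6-inch, 16:9 smartphone screen
for label, px in [("4K (3840 px wide)", 3840), ("1080p (1920 px wide)", 1920)]:
    d = min_distance_to_blend(6.0, horizontal_px=px)
    print(f"{label}: pixels blend together beyond about {d:.1f} inches")
```

On those assumptions, even 1080p pixels blend together at roughly nine inches, so a 4K phone panel held at a typical viewing distance offers no resolvable extra detail.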

Battery life and processor overhead are additional reasons, aside from visual clarity, why 4K on a smartphone might arguably be worse than 1080p resolution. If 4K requires more energy, and right now it does, then the battery consumption rate is a negative.

Granted, it is possible, perhaps even likely, that 4K will prove an advantage for virtual reality or augmented reality applications, since eyes are very close to the screens in VR headsets. That likely will be true for 360-degree VR video as well.

But in most other cases, smartphones with 4K displays will not yield an advantage humans can see. 

Something like that also will happen with 5G. People sometimes tout the advantage of 5G for video streaming. But streaming services such as Netflix require, by some estimates, only about 5 Mbps to 8 Mbps. 

True, Netflix recommends speeds of 25 Mbps for 4K content, so in some cases, 5G might well provide a better experience than 4G. But Amazon Prime says 15 Mbps for 4K content is sufficient. 

And if viewers really cannot tell the difference between 1080 resolution and 4K, then 8 Mbps is quite sufficient for viewing streamed content at high-definition quality. In fact, usage allowances are far more important than bandwidth, for most purposes. 
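A quick back-of-the-envelope calculation illustrates why usage allowances tend to matter more than headline speed. The sketch below uses the streaming bitrates cited above and a hypothetical 1 TB (1,000 GB) monthly allowance; the allowance figure is an assumption for illustration only.

```python
# Back-of-the-envelope comparison of streaming bitrates against a usage allowance.
bitrates_mbps = {"HD (low estimate)": 5, "HD (high estimate)": 8,
                 "4K (Amazon Prime)": 15, "4K (Netflix)": 25}
allowance_gb = 1000  # hypothetical 1 TB monthly cap (assumption)

for label, mbps in bitrates_mbps.items():
    gb_per_hour = mbps * 3600 / 8 / 1000  # megabits per second -> gigabytes per hour
    hours = allowance_gb / gb_per_hour
    print(f"{label}: {gb_per_hour:.2f} GB/hour, about {hours:.0f} viewing hours within {allowance_gb} GB")
```

Even at Netflix’s 25 Mbps recommendation for 4K, a stream consumes only about 11 GB per hour, so for heavy viewers the binding constraint is the monthly allowance, not the connection speed.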


Some internet service providers also point out that a connection running at 25 Mbps downstream and 12.5 Mbps upstream can outperform a connection offering 100 Mbps downstream and 10 Mbps upstream for applications that depend on upstream capacity. 

The larger point is that some technological innovations, including 4K video and 5G networks, might not have as much impact on user experience as one might suppose, although some future use cases might well be different.

One View on Why Video Resolution Beyond 4K is Useless

Wednesday, December 11, 2019

2% of U.S. Households Buy Gigabit Internet Access

The overall percentage of U.S. fixed network internet access subscribers buying gigabit-speed service increased 25 percent, to 2.5 percent in the third quarter of 2019. In 2018 about two percent of U.S. households bought gigabit internet access service, according to Openvault. 

Other estimates peg gigabit take rates at about six percent. 

About 51 percent of U.S. fixed network internet access customers now buy service at 100 Mbps or higher. 

Some 35 percent buy service rated at 100 Mbps to 150 Mbps. About 27 percent buy service running between 50 Mbps and 75 Mbps. 

The percentage of U.S. homes able to buy gigabit service is at least 80 percent, as that is the percentage cable TV alone reaches, according to the NCTA. 

Average U.S. Fixed Network Internet Consumption Now 275 GB Per Month

In the third quarter of 2019 the average household--including both customers on unlimited and fixed usage plans--consumed about 275 gigabytes each month, up about 21 percent year over year from the third quarter of 2018. 

The weighted average data usage includes subscribers on both flat rate billing (FRB) and usage-based billing (UBB). Not surprisingly, customers on unlimited flat-rate accounts consumed more data than customers on usage-based plans. 

There are obvious implications for internet service providers, namely the usage growth rate of about 20 percent a year. That implies a doubling of consumption in less than four years. 
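The doubling claim follows from simple compound growth arithmetic, sketched below using the roughly 21 percent annual growth rate and the 275 GB monthly average reported above.

```python
import math

monthly_gb = 275       # average household consumption cited above
annual_growth = 0.21   # roughly 21% year-over-year growth

doubling_years = math.log(2) / math.log(1 + annual_growth)
print(f"Doubling time: about {doubling_years:.1f} years")                                    # ~3.6 years
print(f"Implied usage in four years: {monthly_gb * (1 + annual_growth) ** 4:.0f} GB per month")
```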

That usage profile also suggests the usage allowances that suppliers of fixed wireless services must match. 

In comparison, U.S. mobile users might consume between 4 GB per month and 6 GB per month on the mobile network. 

Tuesday, December 10, 2019

Telcos Will "Eat Their Own Dogfood" with Edge Computing to Support Their Virtualized Networks

Whether edge computing as a service becomes a significant revenue stream for connectivity providers remains to be seen. The recent announcement of Amazon Web Services (AWS) Wavelength, for example, restricts the telco role to supplier of rack space.

But telcos will use significant amounts of edge computing to support their own virtualized networks. 

Telecom edge computing refers to computing performed by small data centers located as close to the customer as possible, owned and operated by a telco, and on telco-owned property. One definition might be computing no further than 30 miles from any end user location. 

Metro edge computing might occur more than 30 miles, and up to 100 miles, from any end user location. Beyond those two, the other types of edge computing include computing happening on a user device, on an enterprise campus or inside a building. 
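As a rough illustration of that distance-based taxonomy, a minimal sketch follows. The 30-mile and 100-mile thresholds come from the definitions above, while the function name and tier labels are hypothetical.

```python
def classify_edge_tier(miles_from_user: float, on_premises: bool = False) -> str:
    """Rough distance-based edge taxonomy, using the thresholds described above."""
    if on_premises:
        return "device / on-premises edge"   # user device, campus or in-building computing
    if miles_from_user <= 30:
        return "telecom edge"                # telco-owned sites within ~30 miles of the user
    if miles_from_user <= 100:
        return "metro edge"                  # roughly 30 to 100 miles from the user
    return "regional or centralized data center"  # beyond the edge definitions used here

print(classify_edge_tier(12))    # telecom edge
print(classify_edge_tier(75))    # metro edge
print(classify_edge_tier(400))   # regional or centralized data center
```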

Some have estimated edge computing revenues of perhaps US$21 billion in 2020, up more than 100 percent from 2019, with the market poised to grow more than 50 percent in 2021 as well, Deloitte reports. 



Irrespective of any efforts to host or otherwise supply edge computing as a service, telcos and mobile operators are among the biggest current users of edge computing to support their own internal operations. 

“The largest demand for edge computing currently comes from communication network operators as they virtualize their wireline and wireless network infrastructure and accelerate their network upgrades, including 5G,” say the authors of the State of the Edge report. 

“Most edge investments today are for virtualizing wireline infrastructure, including SD-WAN equipment, core network routing and switching equipment and data gateways,” the report states. 

“Network function virtualization (NFV) and software defined networking (SDN) solutions that underpin next generation technologies like 5G and are being implemented on edge platforms that CNOs are deploying across their networks,” the report says. 

In other words, 5G and similar upgrades to the wireline networks will require edge computing for network virtualization and automation, as well as to enable new services. 

This will drive investment in edge data centers to support their own operations. 

The global power footprint of the edge computing equipment for CNOs is forecast to increase from 231 megawatts to 7,383 megawatts between 2019 and 2028. 
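That forecast implies a very steep compound annual growth rate, which can be checked with a couple of lines of arithmetic using the figures above.

```python
start_mw, end_mw, years = 231, 7383, 2028 - 2019   # power footprint figures cited above

cagr = (end_mw / start_mw) ** (1 / years) - 1
print(f"Implied compound annual growth rate: {cagr:.0%}")   # roughly 47 percent per year
```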

In 2019, communications service provider deployments will represent 22 percent of the global edge footprint, declining to perhaps 10 percent of total investment as other third party uses proliferate. 

In 2019, 94 percent of the footprint will be in central office sites, with the remaining six percent in access and aggregation sites. Between 2019 and 2028, the aggregation edge footprint for CNOs is forecast to increase from five percent to 38 percent of the total footprint. 


Powerline Electromagnetic Effect is No More Demonstrable Now than 40 Years Ago

In addition to concern about any possible causal link between mobile phones and cancer, some see dangers elsewhere, including electromagnetic fields around power lines. At least one older study suggested there is a connection between childhood leukemia and living near power lines.  

A 2018 paper in Nature reviewed dozens of studies over decades and determined that kids who lived within 50 meters (165 feet) of a 200 kilovolt or higher power line had a slightly elevated risk of contracting leukemia. 

But the researchers concluded that the magnetic fields could not be the culprit, because their intensity was not high enough to explain the findings.

Power line magnetic fields top out at about 2.5 microteslas when you’re directly underneath, whereas the earth’s magnetic field, to which we’re all exposed all the time, varies from 25 to 65 microteslas, 10 to 26 times higher. In other words, it is impossible to separate the effect of living near power lines from generalized background effects. 

“Even those researchers who have found a correlation between high-voltage power lines and childhood leukemia are dubious that what they’re measuring has to do with power lines at all.” In 2005, one researcher suggested that living near power lines is also correlated with something else that really does increase a child’s leukemia risk.

 “Reasons for the increased risk, found in this and many other studies, remain to be elucidated,” wrote the researchers.

Some researchers have speculated that the “something else” is the set of hygiene, nutrition, general quality of life and chemical exposure conditions that exist in communities through which high-voltage power lines are allowed to cross.

How Important is Net Promoter Score in Telecom?

The net promoter score is considered useful as a predictor of potential revenue growth, the theory being that customers willing to recommend a firm are loyal, and therefore, repeat buyers. So the higher the net promoter score, the better positioned a firm is supposed to be, in terms of ability to generate a profit.

Bain and Company fellow Fred Reichheld, inventor of the net promoter score, an index of customer willingness to refer a product to others, once famously argued that loyal customers were more profitable. 
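For readers unfamiliar with the mechanics, the net promoter score is computed from answers to a single 0-to-10 “how likely are you to recommend us?” question: the percentage of promoters (scores of 9 or 10) minus the percentage of detractors (scores of 0 through 6). A minimal sketch, using hypothetical survey responses:

```python
def net_promoter_score(ratings):
    """NPS from 0-10 recommendation ratings: percent promoters (9-10) minus percent detractors (0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Hypothetical survey responses
sample = [10, 9, 9, 8, 7, 7, 6, 5, 3, 10]
print(f"NPS: {net_promoter_score(sample):.0f}")   # 40% promoters - 30% detractors = 10
```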

The argument is that loyal customers generate increasing profits each year they stay with a company, in part because they buy more, and because they impose fewer operating costs. They know how to use a company’s products, have figured out why they use a product and therefore are less likely to have questions about billing and other elements of the product experience. 

They also arguably make more referrals to others, which is what the NPS attempts to measure. In many cases, loyal customers might also be willing to pay a premium rather than switch. 

“In financial services, for example, a five percent increase in customer retention produces more than a 25 percent increase in profit,” Reichheld argued. 

But some question its relevance and predictive power, as popular as NPS is in many firms. “Two 2007 studies analyzing thousands of customer interviews said NPS doesn’t correlate with revenue or predict customer behavior any better than other survey-based metrics,” two reporters for the Wall Street Journal report. “A 2015 study examining data on 80,000 customers from hundreds of brands said the score doesn’t explain the way people allocate their money.”

Of all the criticisms, lack of predictive capability might be the most significant, since that is what the NPS purports to do: predict repeat buying behavior. 

“The science behind NPS is bad,” says Timothy Keiningham, a marketing professor at St. John’s University in New York, and one of the co-authors of the three studies. “When people change their net promoter score, that has almost no relationship to how they divide their spending,” he said. 

Others might argue that social media has changed the way consumers “refer” others to companies and products. Some question the methodology. 

As valuable as the “loyalty drives profits” argument might be, it is reasonable to question how well the NPS, or any other metric purporting to demonstrate the causal effect of loyalty or satisfaction on repeat buying, actually can predict such behavior. 

For similar reasons, it might be fair to question relevance in some industries that habitually score at the very bottom of U.S. industries on NPS, such as the internet service provider business or the cable TV business. Where NPS scores can range from -100 to 100, cable TV and ISP service ranks in negative numbers, 2019 U.S. NPS scores show. 


One issue with the NPS is that some argue customer satisfaction is what is measured, not loyalty. The difference is subtle, but possibly important. 

Surveys have shown that even satisfied customers will switch brands. The point of loyalty is that customers show resistance to switching behavior. And some point out that only “complete satisfaction” is highly correlated with loyalty (repeat buying behavior). Merely “satisfied” customers arguably are as fickle as unhappy customers. 


Those rankings are congruent with satisfaction surveys published by the ACSI, which show internet access and cable TV at the bottom of all industries ranked, virtually year after year. 



Some might argue that the NPS or other measures of satisfaction are more important in highly-competitive industries, while of little use in monopolized businesses. This 1995 chart shows how little customer satisfaction mattered in the telephone business, then a monopoly. Whether very satisfied or completely dissatisfied, buying behavior was not affected. There were no choices. 

These days, as the telecom business is significantly competitive, we can argue about the importance of customer satisfaction, to a degree. Probably nobody would claim customer satisfaction does not matter as a contributor to customer loyalty (repeat buying). But neither is it completely clear how important satisfaction actually is. 

Nor is it possible to divorce the importance of customer targets from the broader satisfaction measures. Any firm has to match its offers with the right audience, not just the right features and value proposition.

Targeting the wrong customers will generally fail, with high rates of churn and customer dissatisfaction. The oft-cited example is chasing price-sensitive customers who will quickly churn off once the discounts end. 

Customer satisfaction is not the same as customer loyalty, in other words. But it might still be argued that net promoter scores do matter within an industry, as a way of measuring performance against a firm’s competitors. In other words, it might well matter if Verizon’s service gets a higher NPS than Comcast’s. 

Still, little research seems to have been done on circumstances when NPS actually is misleading or irrelevant. Industries that are declining might be an instance where even lower NPS or higher NPS scores do not matter much, as revenues are shrinking inexorably. At the margin, slower rates of decline are better than faster rates of decline, so higher NPS might have some value. 

Still, if demand is declining, ultimately even a high NPS does not matter. The market is shrinking, so high recommendations will not fundamentally change revenue prospects.

Monday, December 9, 2019

FCC Says 5G Just as Safe as Other Gs

Are 5G or other mobile phones safe to use? The scientific evidence so far suggests the clear answer is “yes.” And after a new review in light of 5G network launches, the U.S. Federal Communications Commission has found no reason to modify its existing guidelines. 

“As noted by the FDA, there is no evidence to support that adverse health effects in humans are caused by exposures at, under, or even in some cases above, the current RF limits,” says the FCC. “Indeed, no scientific evidence establishes a causal link between wireless device use and cancer or other illnesses.”

The FDA also maintains that “[t]he scientific evidence does not show a danger to any users of cell phones from RF exposure, including children and teenagers,” the FCC says. 

The World Health Organization (WHO) states that “[f]rom all evidence accumulated so far, no adverse short- or long-term health effects have been shown to occur from the RF signals produced by base stations.”

The FDA maintains that “[t]he weight of scientific evidence has not linked cell phones with any health problems” and that “the current safety limits for cell phones are acceptable for protecting the public health.”

Upon review of the record, the FCC says, “we find no appropriate basis for and thus decline to initiate a rulemaking to reevaluate the existing RF exposure limits.”

In a recent proceeding examining radio frequency emissions standards, the FCC found a “lack of data in the record to support modifying our existing exposure limits.”

“Specifically, no expert health agency expressed concern about the Commission’s RF exposure limits. Rather, agencies’ public statements continue to support the current limits,” the FCC says. 

“Our existing exposure limits were adopted following recommendations from the U.S. Environmental Protection Agency (EPA), the Food and Drug Administration (FDA), and other federal health and safety agencies,” the FCC says.

“While research on the health effects of RF energy continues, no evidence has moved our sister health and safety agencies to issue substantive policy recommendations for strengthening RF exposure regulation,” the FCC says. 

Indeed, says the FCC, the standards some argue the FCC should adopt are millions to billions of times more restrictive than current FCC limits. The practical result is that “no device could reliably transmit any usable level of energy by today’s technological standards while meeting those limits.” In other words, there would be no cell phone use whatsoever, by anybody. 

“There is no scientific evidence in the record that such restrictive limits would produce any tangible benefit to human health, or provide any improvement over current protections against established risks,” the FCC says. 

Is Private Equity "Good" for the Housing Market?

Even many who support allowing market forces to work might question whether private equity involvement in the U.S. housing market “has bee...