Thursday, December 12, 2019

AT&T Expects to Reach 50% Internet Access Market Share in FTTH Areas

AT&T believes it will eventually get take rates for its fiber-to-the-home service, across 14.5 million households, “to a 50 percent mark over the three-year period” from activation, said Jeffery McElfresh, AT&T Communications CEO.

AT&T bases that forecast on past experience. “As you look at the fiber that we built out in the ground in 2016, at the three-year mark, we roughly approach about a 50 percent share gain in that territory,” said McElfresh.

Adoption at that level would be historically high for an incumbent telco: Verizon, for example, has in the past gotten FiOS adoption into the 40-percent range after about three years.

Take rates for FTTH services vary quite a lot globally, and may be partly an artifact of network coverage. U.S. FTTH adoption is low, but mostly because perhaps 66 percent of fixed network accounts are on cable TV hybrid fiber coax plant; telcos collectively have only about 33 percent market share.

South Korea seems an odd case: FTTH take rates there seem to be only about 10 percent, though network coverage is about 99 percent.

In Japan and New Zealand, take rates have reached the mid-40-percent range, and network coverage might be about 90 percent. But in France and the United Kingdom, FTTH adoption is in the low-20-percent range. 

That is why AT&T’s expectation that its FTTH adoption will reach 50 percent is important. It would reverse the traditional market share of a telco in the fixed network internet access market from 33 percent of customers to perhaps half.
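As a back-of-the-envelope check (a minimal sketch in Python, using only the figures cited above), the forecast implies something on the order of 7.25 million fiber accounts at the three-year mark:

```python
locations_millions = 14.5  # FTTH locations AT&T expects to pass
take_rate = 0.50           # forecast take rate at the three-year mark

accounts = locations_millions * take_rate
print(f"Implied accounts: ~{accounts:.2f} million")  # -> ~7.25 million
```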

Will Ruinous Competition Return?

It has been some time since many contestants in telecom had to worry about the effects of ruinous levels of competition, which were obvious and widespread in parts of the telecom market very early in the 21st century.

Sometimes markets endure what might be termed excessive or ruinous competition, where no company in a sector is profitable.

That arguably is the case for India, where industry revenues dropped seven percent last year, following years of such results, leading regulators to consider instituting minimum prices as a way of boosting profits.

Such situations are not new, as early developments in the railroad industry suggest. In fact, sometimes competitors might price in predatory fashion, deliberately selling below cost in an effort to drive other competitors out of business. That sort of behavior often is prohibited by law, and can trigger antitrust action. 

Even if technology has changed network costs and economics, allowing sustained competition between firms of equal size, the unanswered question for competitive markets has been the possible outcomes of ruinous levels of competition. 

Stable market structures often have market shares that are quite unequal, which prevents firms from launching ruinous pricing attacks. 

A ratio of 2:1 in market share between any two competitors seems to be the equilibrium point at which it is neither practical nor advantageous for either competitor to increase or decrease share. 

A market with three roughly equally situated contestants means there always will be a temptation to launch disruptive attacks, especially if one of the three already is pursuing such a strategy.

Some studies suggest a stable market of three firms features a market share pattern of approximately 4:2:1, where each contestant has double the market share of the following contestant. 

The hypothetical stable market structure is one where market shares are unequal enough, and the leader financially strong enough, to weather any disruptive attack by the number-two or number-three providers. That oligopolistic structure is stable, yet arguably provides competitive benefits.

In a classic oligopolistic market, one might expect to see an “ideal” (normative) structure something like:

Oligopoly Market Share of Sales
Number one: 41%
Number two: 31%
Number three: 16%

As a theoretical rule, one might argue, an oligopolistic market with three leading providers will tend to be stable when shares follow a general pattern of 40 percent, 30 percent and 20 percent held by the three contestants.
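For what it is worth, the strict 4:2:1 ratio normalizes to roughly 57/29/14, while the rounder 40/30/20 rule of thumb implicitly leaves about 10 percent for fringe players. A minimal Python sketch of that arithmetic:

```python
# Normalize the strict 4:2:1 stability pattern into market shares
ratio = [4, 2, 1]
shares = [r / sum(ratio) for r in ratio]
print([f"{s:.0%}" for s in shares])  # -> ['57%', '29%', '14%']

# The rounder 40/30/20 rule of thumb leaves room for fringe players
rule_of_thumb = [0.40, 0.30, 0.20]
print(f"Fringe share: {1 - sum(rule_of_thumb):.0%}")  # -> 10%
```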

Another unanswered question is the minimum possible competitive market structure, where consumer benefits still are obtained but firms also can sustain themselves. Regulators have grappled with the answer largely in terms of the minimum number of viable competitors in mobile markets; the widespread thinking is that only a single facilities-based fixed network operator is possible in many countries.

In a minority of countries, it has seemed possible for at least two fixed network suppliers to operate at scale, on a sustainable basis. 

The point is that competition in the facilities-based parts of the telecom business is likely to take an oligopolistic shape over the long term, and that is likely the best outcome: sustainable competition and consumer benefits, without ruinous levels of competition.

4K and 5G Face One Similar Problem

4K and 5G face one similar problem: performance advantages do not always translate into experience improvements that end users and customers can clearly perceive. It is not a new problem.

“Speeds and feeds,” originally a measure of machine tool performance, long has been used in the computing industry as well, to tout technical features and performance of processors or networks.

Marketing based on speeds and feeds fell out of favor, however, in part because every supplier was using the same systems and chips, negating the value of such claims. Also, at some point, the rate of improvement slowed, and it also became harder to show how the better performance was reflected in actual experience. 

We are likely to see something similar when it comes to the ability of apps, devices or networks to support very-high-resolution video such as 4K. Likewise, much-faster mobile and fixed networks face the same problem: the technological advances do not necessarily lead to experience advantages.

4K video on small screens has been characterized as offering visual and experience differences somewhere between indistinguishable and non-existent. The reason is the visual acuity of the human eye. Beyond some point, at some distance from any screen, the eye cannot resolve the greater granularity of picture elements. In other words, you cannot see the difference. 

Even for younger adults (20s and 30s) with better eyesight than older people, the difference between 2K resolution and 4K on a phone is barely perceptible, if perceptible at all, one study found.

On huge screens, relatively close to where an observer is located, the greater resolution does make a difference. Conversely, on small screens or beyond a certain distance, the eye cannot distinguish between 4K and 1080 HDTV. 
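The underlying geometry is easy to sketch. Assuming visual acuity of about one arcminute (a common rule of thumb, not a figure from the studies cited here), one can estimate the distance beyond which individual pixels blend together; past that distance, extra resolution is invisible. A minimal Python sketch, using a hypothetical 6-inch phone screen:

```python
import math

ARCMINUTE = math.radians(1 / 60)  # ~1 arcminute: rule-of-thumb limit of human visual acuity

def blend_distance_inches(diagonal_in, width_px, height_px):
    """Distance beyond which the eye can no longer resolve individual pixels."""
    ppi = math.hypot(width_px, height_px) / diagonal_in  # pixels per inch
    pixel_pitch = 1 / ppi                                # inches per pixel
    return pixel_pitch / ARCMINUTE                       # small-angle approximation

for label, w, h in [("1080p", 1920, 1080), ("4K", 3840, 2160)]:
    print(f"{label}: pixels blend beyond ~{blend_distance_inches(6.0, w, h):.1f} inches")
# -> 1080p blends beyond ~9.4 inches, 4K beyond ~4.7 inches
```

On that arithmetic, at a typical viewing distance of a foot or more, a 1080p phone screen already out-resolves the eye, which is the whole point.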

Also, battery life and processor overhead are reasons--aside from visual clarity--why 4K on a smartphone might arguably be worse than 1080p resolution. If 4K requires more energy, and right now it does, then battery consumption rate is a negative.

Granted, it is possible, perhaps even likely, that 4K will prove an advantage for virtual reality or augmented reality applications, since eyes are very close to screens on VR headsets. That likely will be true for 360-degree VR video as well.

But in most other cases, smartphones with 4K displays will not yield an advantage humans can see. 

Something like that also will happen with 5G. People sometimes tout the advantage of 5G for video streaming. But streaming services such as Netflix require, by some estimates, only about 5 Mbps to 8 Mbps.

True, Netflix recommends speeds of 25 Mbps for 4K content, so in some cases, 5G might well provide a better experience than 4G. But Amazon Prime says 15 Mbps for 4K content is sufficient. 

And if viewers really cannot tell the difference between 1080 resolution and 4K, then 8 Mbps is quite sufficient for viewing streamed content at high-definition quality. In fact, usage allowances are far more important than bandwidth, for most purposes. 
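A minimal sketch of why allowances dominate (the 1-TB monthly cap is an illustrative assumption, not a figure from the text):

```python
def gb_per_hour(mbps):
    """Convert a streaming bitrate in Mbps to data consumed per hour, in GB."""
    return mbps * 3600 / 8 / 1000  # megabits/second -> gigabytes/hour

CAP_GB = 1000  # illustrative 1-TB monthly usage allowance
for label, mbps in [("HD, ~8 Mbps", 8), ("4K, Amazon's ~15 Mbps", 15), ("4K, Netflix's ~25 Mbps", 25)]:
    gb = gb_per_hour(mbps)
    print(f"{label}: ~{gb:.1f} GB/hour, ~{CAP_GB / gb:.0f} viewing hours per month")
# -> ~3.6 GB/hour (278 hours), ~6.8 GB/hour (148 hours), ~11.2 GB/hour (89 hours)
```

Even at Netflix's recommended 4K bitrate, in other words, the binding constraint for a heavy-viewing household is the monthly allowance, not the access speed.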


Some internet service providers also point out that a connection running at 25 Mbps downstream and 12.5 Mbps upstream can outperform a connection offering 100 Mbps downstream but only 10 Mbps upstream, since upstream capacity matters for interactive and two-way applications.

The larger point is that some technological innovations, including 4K video and 5G networks, might not have as much impact on user experience as one might suppose, although some future use cases might well be different.

One View on Why Video Resolution Beyond 4K is Useless

Wednesday, December 11, 2019

2% of U.S. Households Buy Gigabit Internet Access

The overall percentage of U.S. fixed network internet access subscribers buying gigabit-speed service increased 25 percent year over year, reaching 2.5 percent in the third quarter of 2019; in 2018 about two percent of U.S. households bought gigabit internet access service, according to OpenVault.

Other estimates peg gigabit take rates at about six percent. 

About 51 percent of U.S. fixed network internet access customers now buy service at 100 Mbps or higher. 

Some 35 percent buy service rated at 100 Mbps to 150 Mbps. About 27 percent buy service running between 50 Mbps and 75 Mbps.

The percentage of U.S. homes able to buy gigabit service is at least 80 percent, as that is the share of homes cable TV networks alone reach with gigabit service, according to the NCTA.

Average U.S. Fixed Network Internet Consumption Now 275 GB Per Month

In the third quarter of 2019 the average household--including both customers on unlimited and fixed usage plans--consumed about 275 gigabytes each month, up about 21 percent year over year from the third quarter of 2018. 

The weighted average data usage includes subscribers on both flat rate billing (FRB) and usage-based billing (UBB). Not surprisingly, customers on unlimited flat-rate accounts consumed more data than customers on usage-based plans. 

There are obvious implications for internet service providers, namely the usage growth rate of about 20 percent a year. That implies a doubling of consumption in less than four years. 
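The doubling claim is just compound-growth arithmetic; a quick Python check:

```python
import math

annual_growth = 0.21  # OpenVault's reported year-over-year usage growth
doubling_years = math.log(2) / math.log(1 + annual_growth)
print(f"Doubling time: ~{doubling_years:.1f} years")  # -> ~3.6 years (~3.8 years at 20%)
```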

That usage profile also suggests the usage allowances suppliers of fixed wireless services must match.

In comparison, U.S. mobile users might consume between 4 GB per month and 6 GB per month on the mobile network. 

Tuesday, December 10, 2019

Telcos Will "Eat Their Own Dogfood" with Edge Computing to Support Their Virtualized Networks

Whether edge computing as a service becomes a significant revenue stream for connectivity providers remains to be seen. The recent announcement of AWS Wavelength, for example, restricts the telco role to supplier of rack space.

But telcos will use significant amounts of edge computing to support their own virtualized networks. 

Telecom edge computing refers to computing performed by small data centers located as close to the customer as possible, owned and operated by a telco, and on telco-owned property. One definition might be computing no further than 30 miles from any end user location. 

Metro edge computing might occur more than 30 miles, and up to 100 miles, from any end user location. Beyond those two categories, computing happening on a user device, on an enterprise campus or inside a building makes up the other types of edge computing.
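As an illustration only (the mileage cutoffs are the rough definitions above, not an industry standard, and the device/on-premises threshold is a hypothetical), the tiers amount to a simple distance-based classification:

```python
def edge_tier(miles_from_user: float) -> str:
    """Classify a compute site by distance from the end user, per the rough definitions above."""
    if miles_from_user <= 0.1:                # illustrative cutoff for on-site computing
        return "device / on-premises edge"    # user device, campus or in-building
    if miles_from_user <= 30:
        return "telecom edge"                 # telco-owned small data centers
    if miles_from_user <= 100:
        return "metro edge"
    return "regional or centralized data center"

print(edge_tier(12))  # -> telecom edge
print(edge_tier(65))  # -> metro edge
```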

Some have estimated edge computing revenues of perhaps US$21 billion in 2020, up more than 100 percent from 2019, and the market is poised to grow more than 50 percent in 2021 as well, Deloitte reports.



Irrespective of any efforts to host or otherwise supply edge computing as a service, telcos and mobile operators are among the biggest current users of edge computing to support their own internal operations. 

“The largest demand for edge computing currently comes from communication network operators as they virtualize their wireline and wireless network infrastructure and accelerate their network upgrades, including 5G,” say the authors of the State of the Edge report.

“Most edge investments today are for virtualizing wireline infrastructure, including SD-WAN equipment, core network routing and switching equipment and data gateways,” the report states. 

“Network function virtualization (NFV) and software defined networking (SDN) solutions that underpin next generation technologies like 5G are being implemented on edge platforms that CNOs are deploying across their networks,” the report says.

In other words, 5G and similar upgrades to the wireline networks will require edge computing for network virtualization and automation, as well as to enable new services. 

This will drive investment in edge data centers to support their own operations. 

The global power footprint of the edge computing equipment for CNOs is forecast to increase from 231 megawatts to 7,383 megawatts between 2019 and 2028.
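That works out to compound growth of nearly 50 percent a year; a quick check in Python:

```python
mw_2019, mw_2028 = 231, 7383  # forecast CNO edge power footprint, in megawatts
years = 2028 - 2019
cagr = (mw_2028 / mw_2019) ** (1 / years) - 1
print(f"Implied growth: ~{cagr:.0%} per year")  # -> ~47% compound annual growth
```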

In 2019, communications service provider deployments will represent 22 percent of the global edge footprint, declining to perhaps 10 percent of total investment as other third-party uses proliferate.

In 2019, 94 percent of the footprint will be in central office sites, with the remaining six percent in access and aggregation sites. Between 2019 and 2028, the aggregation edge footprint for CNOs is forecast to increase from five percent to 38 percent of the total footprint.


Will AI Fuel a Huge "Services into Products" Shift?

As content streaming has disrupted music, is disrupting video and television, so might AI potentially disrupt industry leaders ranging from ...