Friday, November 3, 2017

Why Overbuilding is So Rare

Both for-profit and non-profit entities have long looked at whether it is feasible to compete against cable TV operators and telcos for internet access, video and voice services using their own facilities, with the objective of providing higher-quality service (faster internet access, for example) at lower cost.

Generally speaking, suppliers have found more success when they do not “overbuild” telco and cable TV networks across an entire metro footprint, but instead pick niches (apartments and high-rise living units; particular neighborhoods and suburbs).

That is the same “cherrypicking” strategy used by business service specialists, who only operate in business districts and office parks.

A study of potential lost market share if municipal internet access networks were built in Seattle and Fort Collins, Colo., assumes Comcast, the leading internet service provider in those cities, would lose 20 percent to 30 percent of its current customer base. That is a reasonable assumption, given other experience in the U.S. market.

Assuming 20 percent loss of share and $50 average revenue per account, Comcast might lose $1.38 million a month, or $16.6 million in annual revenue, in Seattle, or $373,500 a month ($4.5 million annual) in Fort Collins.

Seattle has about 340,479 housing units, of which 327,188 are occupied at any particular time. At an average network cost of about $700 per location passed (Seattle is highly urban, so costs to build a new cabled network are likely higher than that), the core network might cost $238 million.

Connecting an actual customer might cost $300 per location, if internet access is the only service offered.

At a 20 percent take rate of occupied units, the new provider might have 65,437 customers. That represents activation costs of about $19.6 million.

Assume the new provider sells internet access at the same average of $50 per account (so there actually is zero savings for each customer), but that delivered bandwidth is a gigabit per second. Note that annual revenue of $16.6 million corresponds to winning only the roughly 27,600 accounts Comcast loses; at the full 65,437 accounts, annual revenue would be about $39.3 million.

Assume the new operator has a 40-percent gross margin (revenue minus direct costs), implying annual net cash of about $6.6 million on the lower revenue figure, or about $15.7 million on the higher (before taxes, interest, depreciation and amortization).

And, at a five percent cost of capital, interest payments on network construction ($238 million) alone are about $12 million a year.
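A minimal back-of-the-envelope sketch of the Seattle case, in Python, using the figures above. The 20-year straight-line asset life, and charging interest on activation capital as well as construction, are my assumptions, not the study's:

```python
# Municipal overbuild model for Seattle, using the figures cited above.
HOUSING_UNITS = 340_479      # total housing units (locations passed)
OCCUPIED_UNITS = 327_188     # occupied at any particular time
COST_PER_PASSING = 700       # $ per location passed (likely low for Seattle)
COST_PER_ACTIVATION = 300    # $ per connected customer, internet-only
TAKE_RATE = 0.20             # share of occupied units that subscribe
ARPU_MONTHLY = 50            # $ per account per month
GROSS_MARGIN = 0.40          # revenue minus direct costs
COST_OF_CAPITAL = 0.05       # annual interest rate
ASSET_LIFE_YEARS = 20        # assumed straight-line depreciation life

network_capex = HOUSING_UNITS * COST_PER_PASSING     # ~$238 million
customers = int(OCCUPIED_UNITS * TAKE_RATE)          # 65,437 accounts
activation_capex = customers * COST_PER_ACTIVATION   # ~$19.6 million
total_capex = network_capex + activation_capex       # ~$258 million

annual_revenue = customers * ARPU_MONTHLY * 12       # ~$39.3 million (best case)
gross_cash = annual_revenue * GROSS_MARGIN           # ~$15.7 million
interest = total_capex * COST_OF_CAPITAL             # ~$12.9 million
depreciation = total_capex / ASSET_LIFE_YEARS        # ~$12.9 million

print(f"gross cash ${gross_cash/1e6:.1f}M vs interest + depreciation "
      f"${(interest + depreciation)/1e6:.1f}M")
# Even at the full 20 percent take rate, gross cash falls far short of
# interest plus depreciation, before any taxes or operating overhead.
```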

Without getting any more detailed, the business case simply does not work: even on the more generous revenue assumption, gross cash flow cannot cover interest plus depreciation.

It is fine to criticize ISPs for not delivering higher-quality services at more affordable prices. But the costs of building internet access infrastructure, using any cabled method, plus reasonable market share expectations, explain why facilities-based competition is so difficult.

Ubiquitous third suppliers might be an unreasonable expectation. Limited builds in some neighborhoods, suburbs or high-rise living units might have sustainable economics. But advocates likely hope in vain for the emergence of full citywide competitors.

Thursday, November 2, 2017

Mobile Networks Might Carry Half of IoT Traffic by 2025

Mobile networks will carry half of IoT data traffic by 2025, Machina Research predicts. Of the use cases of highest interest to mobile service providers, connected car is perhaps the most important, as it inherently requires a mobile access solution.

And though mobility might not be a fundamental requirement for many other use cases, one might argue the mobile network will prove to be the easiest solution, even where other connectivity options exist.

source: AT&T

Wednesday, November 1, 2017

AI, Edge Computing, 5G, Big Data Analytics, IoT Are All Parts of a Single Transformation

Artificial intelligence, edge computing, cloud computing, gigabit mobile networks (5G and others), internet of things and big data all are key trends across many industries.

What we tend to miss, amid so much change, is that all of it is arriving at once. It is better to view the cluster of innovations as the big change, rather than the disruption each separate trend represents.

For it is the cluster of technologies that is so unusual. In the past, it has been easier to model the impact of a single innovation (personal computer, mobile phone, internet). It will be much harder in the coming era, since so many fundamentally disruptive technologies are emerging at the same time.

Big data now requires cloud computing. Big data will get bigger as internet of things sensors are widely deployed. So only artificial intelligence can sift through all the data to discern useful patterns.

And some of that data will have to be analyzed fast enough that edge computing is necessary. But 5G and other connectivity solutions will be needed to acquire all the data.
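As a concrete, entirely hypothetical illustration of that division of labor, a workload-placement rule might look like the sketch below; the latency figures are illustrative assumptions, not measurements:

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    source: str
    payload: bytes
    latency_budget_ms: float  # how quickly a decision must be made

# Hypothetical round-trip times to a local edge node vs. a regional cloud
EDGE_RTT_MS = 5
CLOUD_RTT_MS = 80

def place_workload(reading: SensorReading) -> str:
    """Push analysis toward the device or edge when the round trip would
    blow the latency budget; otherwise centralize, where compute is
    cheaper and bigger."""
    if reading.latency_budget_ms < EDGE_RTT_MS:
        return "on-device"  # even the edge node is too far away
    if reading.latency_budget_ms < CLOUD_RTT_MS:
        return "edge"
    return "cloud"

print(place_workload(SensorReading("collision-avoidance", b"...", 20)))   # edge
print(place_workload(SensorReading("fleet-telemetry", b"...", 60_000)))   # cloud
```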

It is nearly impossible for a human to model all the possible interactions with enough detail to make the output useful. From a mobile operator’s point of view, it might be logical to put 5G at the center.

Other industries are going to put AI, or cloud, or IoT or big data at the center. No matter. The point is that the cluster of technologies is what really matters, not any single one of the technology trends.

In this unusual situation, impact will not be measured by market share statistics. The percentage of workloads, the location of workload processing, the number of sensors and connections, and the use of analytics and machine learning systems will fail to tell the full story.




Researchers have been predicting for several decades that computing is going to be pervasive. What we are seeing is the realization of that prediction. Some say "fourth industrial revolution" is coming. Some of us might simply say the era of pervasive computing, as predicted, is arriving.

It is not simply "production" that is going to be affected, affecting industries and economics. Human consumption, lifestyles and behaviors are going to change, as well. This is big, very big.

Sometimes Market Share Conceals More than it Reveals

As useful as market share analysis might be, it fails to capture the underlying market dynamics when a disruption is underway. Consider that, after nearly two decades, online commerce claims only about seven percent of total retail commerce volume.

Most of us, asked to evaluate the potential impact of a substitute technology platform that has gotten only seven percent share after nearly two decades, would likely say that technology is not a major disruptor of the legacy platform.

But we would be quite wrong. If history is a useful guide, we are about three share points away from a decisive change in the adoption rate--and market share--of the new platform.

The reason for that assertion is that, in the past, transformative technologies and successful consumer electronics innovations hit an inflection point at 10 percent adoption, no matter how incremental the prior moves had seemed.
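One way to motivate that claim is the logistic diffusion model often used for technology adoption. This is a sketch with an illustrative growth rate, not a model fitted to any particular product:

```python
import math

# Logistic (S-curve) diffusion: share(t) = 1 / (1 + exp(-k * (t - t0))).
# k = 0.5 per year is an illustrative growth rate, not fitted to any product.
def years_between(s1: float, s2: float, k: float = 0.5) -> float:
    """Years a logistic diffusion takes to move from share s1 to share s2."""
    t = lambda s: -math.log(1 / s - 1) / k
    return t(s2) - t(s1)

print(f"{years_between(0.01, 0.07):.1f}")  # ~4.0 years: 1% -> 7%, the slow grind
print(f"{years_between(0.07, 0.10):.1f}")  # ~0.8 years: 7% -> 10%
print(f"{years_between(0.10, 0.50):.1f}")  # ~4.4 years: 10% -> 50%
# Above 10 percent, the next 40 share points arrive about as fast
# as the first seven did.
```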

This chart by Asymco shows adoption rates of popular consumer products after 10 percent adoption was reached, no matter how long the gestation.


As you probably would expect, products introduced into developed ecosystems tend to be adopted faster, while products that require further development of an ecosystem can take longer to reach 10-percent adoption rates. Automobiles required a huge infrastructure of roads and gas stations.

The telephone required network economics, as the value of having a phone line was fairly low, for most users, when few other people had them. Likewise, supplying electricity required power plants and transmission lines, plus local power distribution networks.


And though it is a lesser point, online commerce, as in some other markets, represents virtually 100 percent of the net growth.

Now consider a situation where multiple disruptive technologies are developing simultaneously, and where the net value which can be produced is an interaction between and among those technologies.

Big data now requires cloud computing. Big data will get bigger as internet of things sensors are widely deployed. So only artificial intelligence can sift through all the data to discern useful patterns.

And some of that data will have to be analyzed fast enough that edge computing is necessary. But 5G and other connectivity solutions will be needed to acquire all the data.

It is nearly impossible for a human to model all the possible interactions with enough detail to make the output useful. From a mobile operator’s point of view, it might be logical to put 5G at the center.

Other industries are going to put AI, or cloud, or IoT or big data at the center. No matter. The point is that the cluster of technologies is what really matters, not any single one of the technology trends.

In the past, it has been easier to model the impact of a single innovation (personal computer, mobile phone, internet). It will be much harder in the coming era, since so many fundamentally disruptive technologies are emerging at the same time.



Tuesday, October 31, 2017

FCC to Authorize Another 1,700 MHz of New Millimeter Wave Spectrum

You can credit Moore’s Law for allowing commercial use of millimeter wave communications in the mass markets, bringing an order of magnitude (10 times) to two orders of magnitude (100 times) more usable mobile and fixed wireless capacity to the U.S. market.

The simple reason is that, in the past, it has been too expensive and difficult to use millimeter wave spectrum for access networks. But cheap processing and memory now mean we can process signals affordably enough to use millimeter wave frequencies in the access network.

But there are many other implications. So much new spectrum is coming, and so much unlicensed spectrum will be made available, that the economics of the access business will change.

The huge trove of new capacity, dwarfing all present mobile and Wi-Fi spectrum, will reshape the economics of the access business, allowing new competitors and business models to arise, revaluing spectrum licenses, enabling fixed wireless to compete with fixed networks and positioning mobile networks as full product substitutes for fixed networks for the first time.

At its November 2017 meeting, the Federal Communications Commission will vote on an order that would make an additional 1,700 MHz of terrestrial wireless spectrum available for use, adding to the 11 GHz of spectrum the FCC earlier had made available for flexible terrestrial wireless purposes, largely expected to support 5G use cases.

The additional 1,700 MHz of high-band spectrum will be made available in the 24 GHz and 47 GHz bands.

As part of its Spectrum Frontiers initiative, the FCC already had started work to release new spectrum in the 28 GHz, 37 GHz, 39 GHz and 64 GHz to 71 GHz bands.

Though 3.85 GHz of that 11 GHz would be made available on a licensed basis, 7 GHz would be available to use on an unlicensed basis.

Spectrum in the 28 GHz, 37 GHz and 39 GHz bands (3.85 GHz total) represents more than four times the amount of flexible-use spectrum the FCC has licensed to date. In the 37 GHz and 39 GHz bands, licenses would be issued in 200-MHz blocks, with a total of 2.4 GHz available.

In the 28 GHz band, two 425 MHz spectrum blocks will be available, on a nationwide basis.

The 7 GHz of new unlicensed spectrum, combined with the existing high-band unlicensed spectrum (57-64 GHz), doubles the amount of high-band unlicensed spectrum to 14 GHz of contiguous unlicensed spectrum (57-71 GHz).

That 14 GHz band will be 15 times as much as all unlicensed Wi-Fi spectrum in the lower bands.

Also planned: the 37 GHz to 37.6 GHz band would make 600 MHz of spectrum available for dynamic shared access among different commercial users, and between commercial and federal users.
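For readers keeping score, the band-by-band figures in this post do reconcile. A quick tally in Python, using only numbers cited above (the decomposition of the 3.85 GHz total is inferred from those figures):

```python
# Licensed and shared flexible-use spectrum from Spectrum Frontiers, in MHz
licensed_28ghz = 2 * 425       # two nationwide 425-MHz blocks at 28 GHz
licensed_37_39 = 2_400         # 200-MHz-block licenses across 37/39 GHz
shared_37_37_6 = 600           # dynamic shared access in 37-37.6 GHz
print(licensed_28ghz + licensed_37_39 + shared_37_37_6)  # 3850 MHz = 3.85 GHz

# High-band unlicensed spectrum, in GHz
existing_unlicensed = 64 - 57  # the 57-64 GHz band, already unlicensed
new_unlicensed = 71 - 64       # the 64-71 GHz extension
print(existing_unlicensed + new_unlicensed)  # 14 GHz contiguous (57-71 GHz)
```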

Monday, October 30, 2017

37% of Comcast Customers Buy 3 or 4 Products

One set of numbers from the Comcast third quarter 2017 results is instructive: Comcast details the number of customers taking single, dual-product and three-product or four-product packages.

About 30 percent of consumer customers buy just one product, a third buy two products, while 37 percent purchase three or four products.

That matters for revenue, as a buyer of multiple products produces more revenue than an account purchasing just a single product, all other things being equal (such as the unit price of each type of product).

More significantly, purchases of bundles determine the viability of the business case itself.

Stranded assets are big problems in highly-competitive fixed network markets. Assume a typical local market where the local telco and the local cable operator are equally skilled, but where cable has an advantage in the internet access and sometimes video product segments.

Assume a typical market share structure has the cable operator getting about 60 percent market share, while the telco gets about 40 percent.

That means the telco has no revenue generated from 60 percent of its passings, while cable gets no revenue from 40 percent of consumer locations passed.

But multiple service purchases effectively boost revenue as much as having more accounts in service. In other words, selling more things to fewer customers can produce as much revenue as selling one product to nearly every location.

And that is why take rates for multiple-product bundles matter so much: business models that would fail in a single-product environment can work, even at significantly lower rates of customer adoption.
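To see why, consider revenue per home passed, which depends on the share of homes won multiplied by products sold per account, not on share alone. A sketch with purely illustrative numbers (not Comcast's):

```python
PRICE_PER_PRODUCT = 50  # $ per product per month, same price for simplicity

def monthly_revenue_per_passing(share: float, products_per_account: float) -> float:
    """Expected monthly revenue per location passed."""
    return share * products_per_account * PRICE_PER_PRODUCT

# A provider with 40 percent share selling 2.2 products per account...
print(monthly_revenue_per_passing(0.40, 2.2))  # $44 per passing
# ...nearly matches a single-product provider serving 90 percent of passings.
print(monthly_revenue_per_passing(0.90, 1.0))  # $45 per passing
```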

Comcast Account Segments (accounts in thousands)

Category                                        3Q 2016   3Q 2017   Net Adds 3Q 2016   Net Adds 3Q 2017
Residential Video Customers                      21,420    21,341          19               (134)
Business Services Video Customers                 1,007     1,049          13                  9
Total Video Customers                            22,428    22,390          32               (125)
Total High-Speed Internet Customers              24,316    25,519         330                214
Residential Voice Customers                      10,527    10,351         (24)              (119)
Business Services Voice Customers                 1,116     1,214          26                 25
Total Voice Customers                            11,643    11,565           2                (94)
Total Security and Automation Customers             815     1,079          78                 51
Residential Customer Relationships               26,312    26,957         175                 83
Business Services Customer Relationships          2,006     2,146          43                 31
Total Customer Relationships                     28,318    29,104         217                115
Single Product Residential Customers              7,722     8,055          51                125
Double Product Residential Customers              8,682     8,983          97                 38
Triple and Quad Product Residential Customers     9,908     9,919          26                (79)
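A quick check of the percentages cited above against the 3Q 2017 residential figures in the table (accounts in thousands):

```python
# Residential product-mix shares, 3Q 2017, from the table above
single, double, triple_quad = 8_055, 8_983, 9_919
relationships = 26_957  # total residential customer relationships

for name, n in [("single", single), ("double", double),
                ("triple/quad", triple_quad)]:
    print(f"{name}: {n / relationships:.0%}")
# single: 30%, double: 33%, triple/quad: 37% -- matching the post
```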

Directv-Dish Merger Fails

Directv’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...