Friday, December 2, 2016

Top U.S. Internet Access Headline Speeds Grew 10X in Last 5 Years

Over the nearly five-year period between March 2011 and September 2015, the maximum advertised internet access speeds of the most popular service tiers offered by U.S. internet access providers increased from 12-30 Mbps to 100-300 Mbps, a factor-of-10 increase in less than five years.

Two elements of that report stand out.

Some might object that such “maximum advertised” speeds are not matched by actual end user experience. The Federal Communications Commission report on the subject, however, finds that actual download speeds are 100 percent of advertised speeds, if not better.

But there is an important distinction: speeds improved the most on cable TV networks. In fact, some will rightly note that virtually the entire increase was driven by hybrid fiber coax networks operated by cable companies.

“While fiber based systems continue to have the highest weighted median speeds, cable based ISPs are driving the growth in new high speed service tiers,” the report states. In fact “the maximum advertised download speeds among the most popular service tiers offered by ISPs using cable technologies have increased from 12-30 Mbps in March 2011 to 100-300 Mbps in September 2015.”

“In contrast, the maximum advertised download speeds that were tested among the most popular service tiers offered by ISPs using DSL technology have, with some exceptions, changed little since 2011,” the FCC report notes.

Most popular advertised service tiers (download and upload speeds in Mbps; asterisks as in the original source)

Platform  | Company            | Download tiers             | Upload tiers
DSL       | AT&T DSL           | 1.5*, 3, 6                 | 0.384, 0.512
DSL       | AT&T IPBB          | 3, 6, 12, 18, 24, 45       | 0.384*, 0.512, 0.768, 1, 1.5, 3, 6
DSL       | CenturyLink        | 1.5, 3, 7*, 10, 12, 20, 40 | 0.512, 0.768, 0.896, 5
DSL       | Frontier DSL       | 1, 3, 6                    | 0.384, 0.768
DSL       | Verizon DSL        | (0.5-1)*, 1.3-3            | 0.384*, (0.384-0.768), 0.768*
DSL       | Windstream         | 3, 6, 12                   | 0.768
Cable     | Optimum            | 25, 50, 101                | 5, 25, 35
Cable     | Charter            | 60, 100                    | 4
Cable     | Comcast            | 25, 50, 75, 105, 150       | 0.768, 5, 10, 20
Cable     | Cox                | 15, 25, 50, 100            | 2, 5, 10
Cable     | Mediacom           | 15, 50, 100                | 1, 5, 10
Cable     | Time Warner Cable  | 15, 20, 30, 50, 100*, 300  | 1, 2, 5, 10*, 20
Fiber     | Frontier Fiber     | 25                         | 5, 10, 25*
Fiber     | Verizon Fiber      | 25, 50, 75                 | 25, 35, 50, 75
Satellite | Hughes             | 5, 10                      | 1
Satellite | ViaSat             | 12                         | 3



Median speeds also are important, as they reflect the sort of “typical speeds” used by consumers. The FCC report says that “the median download speed, averaged across all participating ISPs, has almost quadrupled during this period, from approximately 10 Mbps in March 2011, to approximately 41 Mbps in September 2015.”

The Attacker's Advantage

In any competitive market, attackers often have some advantages, compared to defenders. In some cases, attackers benefit from “pro-competitive” regulatory or fiscal policies that offer new entrants the ability to compete with lower costs. Attackers often do not have universal service obligations, and can cherry-pick their potential customer segments and geographies.

Those advantages often are magnified when the market is declining, rather than growing, but remains large in terms of gross revenue. Under such conditions, defenders often face huge stranded asset problems, while attackers can spot deploy or incrementally deploy capital as required.

Simply put, an attacker often can build a business model that allows it to grow revenues and customers even in a declining market, by taking market share. Legacy providers generally only can lose share, with a business model that gets worse with each lost account.

That, in a nutshell, illustrates why attacking service providers in the U.S. market, such as Comcast and Charter Communications, are better positioned, strategically, than telcos. They are trading market share with telcos in video entertainment, to be sure. But the rate of decline has been quite restrained, so far, and the video market is far smaller than the core telecom markets.

So cable faces marginal losses in a smaller segment--video--and big potential gains in the vastly-larger core communications market. And, in one important product segment--internet access--telcos have not found a way to remain competitive.

Roughly speaking, annual core telecommunications revenue is on the order of $330 billion; linear video is about $100 billion a year. Rough math: 30 percent of $330 billion is $99 billion; 30 percent of $100 billion is $30 billion. If telcos and cable companies trade market share at equal rates, cable gains more than three times the revenue that telcos do.
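The rough math above can be checked with a quick sketch (the 30 percent share trade is the text's hypothetical, not a forecast):

```python
# Back-of-the-envelope from the figures above (US$ billions per year).
core_telecom = 330   # annual core telecommunications revenue
linear_video = 100   # annual linear video revenue
share_shift = 0.30   # hypothetical equal share trade each way

cable_gain = share_shift * core_telecom  # share taken from telcos
telco_gain = share_shift * linear_video  # share taken from cable

print(cable_gain, telco_gain, cable_gain / telco_gain)
# cable gains roughly 3.3x what telcos gain at equal share shifts
```

The asymmetry comes entirely from the relative sizes of the two markets, not from any assumed difference in competitive skill.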

In the important internet access business, cable already dominates. Since about 2007, U.S. telcos have steadily lost market share in internet access services, with cable now getting all the net new additions.

That explains why telcos are so focused on discovering and creating big new revenue sources, and why some attackers are well positioned, compared to telco defenders.
source: Pyramid Research

Odds of Telco Failure are Growing

One graph shows the tension in fixed network business models, globally. By the end of 2017, telecom service provider revenue will potentially be lower than the amount of capital investment being made in those networks. Presumably, if one adds the operating and overhead costs, many service providers are basically “under water” in terms of their business models, which is to say, “failing.”

That is a primary reason why the fundamental strategic challenge for any “telco” or legacy access provider is to replace nearly all of the existing revenue sources with new sources. That would be a huge challenge for the best-placed, best-managed firms in any industry.

The odds of failure continue to grow.

source: The Economist

Are 600-MHz Auction Prices Indicative of Future Trends?

The U.S. Federal Communications Commission, as part of planning for 5G services, is opening up nearly 11 GHz of new spectrum for mobile and fixed wireless broadband, including 3.85 GHz of licensed spectrum and 7 GHz of unlicensed spectrum, and is exploring additional allocations as well.

In addition, there are reasonable expectations that spectrum owned by Sprint, T-Mobile US and Dish Network also will be available for acquisition (either by purchase of the firms or, in the case of Dish, a possible sale of airwaves).

That should lead potential bidders to adjust their expectations about the amounts they are willing to bid to acquire 600-MHz spectrum in the ongoing incentive auctions. Up to this point, through two rounds of bidding, bids have significantly lagged seller expectations. So it is not an idle question to ask whether the value of spectrum now is changing radically.

In other words, spectrum values have to change if supply increases so much, and if other methods are available to increase effective supply: newer network architectures (small cells), more-efficient radios and antennas, and continued reliance on unlicensed spectrum that carries no direct spectrum cost. In fact, those trends suggest a decline in spectrum valuations has been underway for a couple of years.

And that might be a thought process affecting spectrum value in other markets, from Egypt to India.

Consider just the expansion of supply in the U.S. market. The FCC already has announced it plans to release 11 gigahertz of new spectrum, including healthy amounts of unlicensed spectrum, and significant amounts of shared spectrum, in a couple of bands.

Licensed use in the 28 GHz, 37 GHz and 39 GHz bands makes available 3.85 GHz of licensed, flexible use spectrum, which is more than four times the amount of flexible use spectrum the FCC has licensed to date, for all mobile purposes.

Unlicensed use in the 64-71 GHz band makes available 7 GHz of unlicensed spectrum which, when combined with the existing high-band unlicensed spectrum (57-64 GHz), doubles the amount of high-band unlicensed spectrum to 14 GHz of contiguous unlicensed spectrum (57-71 GHz). That 14 GHz band will be 15 times as much as all unlicensed Wi-Fi spectrum in lower bands.

Shared access in the 37-37.6 GHz band makes available 600 MHz of spectrum for dynamic shared access between different commercial users, and commercial and federal users, extending shared spectrum access in the 3.5-GHz band.
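A quick tally reconciles those figures (the per-band licensed widths are assumptions inferred from the quoted 3.85 GHz total; only the 57-71 GHz edges appear in the text):

```python
# High-band spectrum described above (all widths in GHz).
licensed_ghz = {
    "28 GHz band": 0.85,  # assumed 27.5-28.35 GHz
    "37 GHz band": 1.60,  # assumed 37-38.6 GHz
    "39 GHz band": 1.40,  # assumed 38.6-40 GHz
}
unlicensed_new_ghz = 71 - 64       # new 64-71 GHz allocation
unlicensed_existing_ghz = 64 - 57  # existing 57-64 GHz band

total_licensed = sum(licensed_ghz.values())  # 3.85 GHz
total_unlicensed = unlicensed_new_ghz + unlicensed_existing_ghz  # 14 GHz contiguous

print(total_licensed, total_unlicensed)
```

The doubling claim follows directly: the new 7 GHz block exactly matches the width of the existing 57-64 GHz band it adjoins.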

Prices are based on supply and demand. If supply increases by an order of magnitude, and demand does not keep pace, wholesale and possibly retail prices will fall, as well.

Thursday, December 1, 2016

When Will Telecom Markets Stabilize? Not Soon.

Access markets tend, over time, towards relatively stable oligopolies. The issue is whether, in the internet era, with new platforms and contestants, that can change. Certainly, in the near term--in fact for rather longish periods of time--markets will become unstable when disruptive new technology and new competitors enter a market.

The longer term issue is whether markets--after a period of time--adjust to the altered realities and reform, in stable form, in the form of oligopolies.

In many ways, it is too early to say.

Since the 1980s, when former state-owned telecom networks were privatized; through the 1990s, when new competitors were allowed to enter both mobile and fixed markets; and then with the advent of the internet as a primary driver of business models and competition, accompanied by shifting end user demand, there has been virtually no respite from change.

Will there eventually be a period where disruptive change ceases to be such a prominent feature? Almost certainly. Still, for the foreseeable future (perhaps a decade or two), it seems highly unlikely that markets will reform in stable fashion.

There are many reasons for that state of affairs. We have not yet reached the conclusion of a massive change in access provider economics, where businesses driven by supplying voice services, messaging, internet access and video have either made a transition to substitute revenue drivers or consolidated to a point where available revenue and profit level are clearly sustainable for the remaining number of providers.

And few doubt that, for the most part, fewer providers (at least at the tier-one level) is the direction the markets are headed. In recent years, policymakers and regulators have debated whether a mobile market with three or four providers best delivers the benefits of competition while stimulating investment. Longer term, some might argue that only one or two providers actually can thrive in some markets. Some believe that, in some markets, nationalization will be the only viable option.

The larger observation is that we are quite some ways from knowing what stable access provider markets look like, in terms of market structure.

Verizon One Fiber Shows Shift of Possibility in Consumer Access Markets

Some are skeptical about Verizon’s One Fiber plan for Boston, said to be a test of the new economics of FiOS. Skeptics say Verizon really will use One Fiber to provide the transport and distribution network for coming small cell deployments. That much is likely correct. The potential disagreement is what happens once Verizon has made those investments.

Verizon suggests the fiber network, in place and generating value for the small cell network and mobile side of the business, will in turn create better economics for deploying additional fiber to consumer neighborhoods. Some see the announcement Verizon made in April 2016 that it was restarting its optical fiber deployment plan for Boston as a case of “bait and switch,” arguing that Verizon implied full fiber-to-the-home (FTTH) construction.

To be fair, Verizon execs have talked about building a multi-purpose network supporting any number of uses, from IoT to mobility to consumer access. Whatever various observers might have read into the One Fiber announcement, what Verizon plans is perhaps not a traditional FiOS build, but a distribution-network-first approach: fiber to towers and to small cell sites to support mobile and IoT communications, which also creates new economics for neighborhood deployment of FTTH.

In some ways, Verizon appears to have further adjusted its consumer access strategy using the “neighborhood” model pioneered by Google Fiber. Essentially, that new approach builds FTTH neighborhood by neighborhood, focusing by that means on improved business case outcomes in the early going.

The additional new thinking is that gigabit access (focusing on delivered bandwidth rather than access technology) is a capability that can be delivered without using fiber to the home. Cable companies already sell gigabit internet access over hybrid fiber coax. Coming 5G mobile networks will, at least at first, also offer opportunity for fixed wireless at gigabit speeds.

So the big mental shift is towards consumer-received speeds and latency, not the access platform as such. Verizon likely believes (and many others tend to agree) that 5G, used to support fixed wireless, can deliver gigabit speeds to consumers. The One Fiber plan creates the infrastructure to do so.

In that sense, One Fiber does represent new investment that speaks directly to gigabit internet access for consumers, even if it does not necessarily always require FTTH construction.

That is among the many potential strategy choices Verizon and others can contemplate, without choosing fiber to home platforms on a ubiquitous basis, in all cases.

To be sure, NG-PON2 (Next-Generation Passive Optical Network 2) might help, as it specifies aggregate throughput of 40 Gbps, corresponding to speeds of up to 10 Gbps for each subscriber. In a commercial sense, that might be “too much” bandwidth for today’s consumers and business models.
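For context, NG-PON2 reaches its 40 Gbps aggregate by stacking multiple 10 Gbps wavelengths (TWDM); the 32-way split below is an illustrative assumption, not part of the figures quoted above:

```python
# NG-PON2 capacity sketch: assumed 4 TWDM wavelengths at 10 Gbps each.
wavelengths = 4
rate_per_wavelength_gbps = 10.0
aggregate_gbps = wavelengths * rate_per_wavelength_gbps  # 40 Gbps aggregate

split_ratio = 32  # assumed number of subscribers sharing one wavelength
sustained_per_sub_gbps = rate_per_wavelength_gbps / split_ratio

print(aggregate_gbps, sustained_per_sub_gbps)
# each subscriber can still burst toward the full 10 Gbps wavelength rate
```

The commercial point stands either way: even the per-wavelength shared capacity exceeds what most consumer business models can monetize today.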

Even 1 Gbps is more bandwidth than most consumers or businesses can meaningfully use, beyond the simple observation that bandwidth-per-user is a real benefit.

One Fiber is the sort of shift in optical fiber deployment and consumer access business model that might escape proper evaluation if we remain fixated on access platform rather than consumer-delivered bandwidth. Bandwidth matters; how we deliver it matters less than it once did. After many decades of arguing over which fixed network access platform is “better,” or has better economics, we are moving into an era when platform choices can be quite varied, while still meeting the internet access business goals: selling more bandwidth, at a profit, to more subscribers.

"Tokens" are the New "FLOPS," "MIPS" or "Gbps"

Modern computing has some virtually-universal reference metrics. For Gemini 1.5 and other large language models, tokens are a basic measure...