Friday, December 2, 2016

Are 600-MHz Auction Prices Indicative of Future Trends?

The U.S. Federal Communications Commission, as part of its planning for 5G services, is opening up nearly 11 GHz of new spectrum for mobile and fixed wireless broadband, including 3.85 GHz of licensed spectrum and 7 GHz of unlicensed spectrum, and is exploring additional allocations as well.

In addition, there are reasonable expectations that spectrum owned by Sprint, T-Mobile US and Dish Networks also will be available for acquisition (either by purchase of the firms or, in the case of Dish, a possible sale of airwaves).

That should lead potential bidders to adjust their expectations about the amounts they are willing to bid to acquire 600-MHz spectrum in the ongoing incentive auction. Up to this point, through two rounds of bidding, bids have significantly lagged seller expectations. So it is not an idle question to ask whether the value of spectrum now is changing radically.

In other words, spectrum value has to change if supply increases so much, and if other methods are available to increase effective capacity: newer network architectures (small cells), more-efficient radios and antennas, and continued reliance on unlicensed spectrum that carries no direct spectrum cost. In fact, trends suggesting lower spectrum valuations have been underway for a couple of years.

And that might be a thought process affecting spectrum value in other markets, from Egypt to India.

Consider just the expansion of supply in the U.S. market. The FCC already has announced plans to release nearly 11 GHz of new spectrum, including healthy amounts of unlicensed spectrum and significant amounts of shared spectrum, across several bands.

Licensed use in the 28 GHz, 37 GHz and 39 GHz bands makes available 3.85 GHz of licensed, flexible-use spectrum, which is more than four times the amount of flexible-use spectrum the FCC has licensed to date for all mobile purposes.

Unlicensed use in the 64-71 GHz band makes available 7 GHz of unlicensed spectrum, which, when combined with the existing high-band unlicensed spectrum (57-64 GHz), doubles the amount of high-band unlicensed spectrum to 14 GHz of contiguous unlicensed spectrum (57-71 GHz). That 14 GHz band will be 15 times as much as all unlicensed Wi-Fi spectrum in lower bands.

Shared access in the 37-37.6 GHz band makes available 600 MHz of spectrum for dynamic shared access between different commercial users, and commercial and federal users, extending shared spectrum access in the 3.5-GHz band.
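Tallying the allocations just described gives a quick sanity check on the totals. Treat this as a minimal sketch: the band edges for the three licensed bands below are assumptions drawn from the FCC Spectrum Frontiers order, since the figures above cite only the totals.

# Tally of the Spectrum Frontiers allocations described above.
# Band edges for the licensed bands are assumptions; the post cites only totals.
licensed_bands_ghz = {
    "28 GHz": (27.5, 28.35),
    "37 GHz": (37.0, 38.6),
    "39 GHz": (38.6, 40.0),
}
new_unlicensed_ghz = (64.0, 71.0)       # new unlicensed band
existing_unlicensed_ghz = (57.0, 64.0)  # existing high-band unlicensed spectrum
shared_ghz = (37.0, 37.6)               # shared segment, carved out of the 37 GHz band

def width(band):
    low, high = band
    return high - low

licensed_total = sum(width(b) for b in licensed_bands_ghz.values())
unlicensed_contiguous = width(new_unlicensed_ghz) + width(existing_unlicensed_ghz)
print(f"licensed, flexible use: {licensed_total:.2f} GHz")                     # 3.85 GHz
print(f"new unlicensed: {width(new_unlicensed_ghz):.0f} GHz")                  # 7 GHz
print(f"contiguous unlicensed (57-71 GHz): {unlicensed_contiguous:.0f} GHz")   # 14 GHz
print(f"shared access: {width(shared_ghz) * 1000:.0f} MHz")                    # 600 MHz
# Licensed plus new unlicensed is 10.85 GHz, the "nearly 11 GHz" figure;
# the shared 600 MHz sits inside the 37 GHz band, so it is not added again.
print(f"total new spectrum: {licensed_total + width(new_unlicensed_ghz):.2f} GHz")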

Prices are based on supply and demand. If supply increases by an order of magnitude and demand does not keep pace, wholesale, and possibly retail, prices will fall.
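To make the direction of that price move concrete, here is a toy constant-elasticity demand model; the demand curve, the constant k and the elasticity value are illustrative assumptions, not a forecast.

# Toy model: constant-elasticity inverse demand, P = k * Q**(-1/e).
# k and elasticity are assumptions chosen only to show the direction of the move.
def clearing_price(quantity, k=100.0, elasticity=1.0):
    # Price at which demand absorbs the offered quantity.
    return k * quantity ** (-1.0 / elasticity)

print(clearing_price(1.0))   # baseline supply (normalized): price 100.0
print(clearing_price(10.0))  # supply up an order of magnitude: price 10.0

With unit elasticity, a tenfold increase in supply implies a tenfold drop in the clearing price; less elastic demand makes the drop steeper, more elastic demand makes it shallower.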

Thursday, December 1, 2016

When Will Telecom Markets Stabilize? Not Soon.

Access markets tend, over time, toward relatively stable oligopolies. The issue is whether, in the internet era, with new platforms and contestants, that can change. Certainly, in the near term--in fact, for rather long periods--markets become unstable when disruptive new technologies and new competitors enter a market.

The longer-term issue is whether markets--after a period of adjustment to the altered realities--re-form as stable oligopolies.

In many ways, it is too early to say.

Since the 1980s, when former state-owned telecom networks were privatized, and through the 1990s, when new competitors were allowed to enter both mobile and fixed markets, followed by the advent of the internet as a primary driver of business models and competition and by shifting end-user demand, there has been no respite from change.

Will there eventually be a period where disruptive change ceases to be such a prominent feature? Almost certainly. Still, for the foreseeable future (perhaps a decade or two), it seems highly unlikely that markets will reform in stable fashion.

There are many reasons for that state of affairs. We have not yet reached the conclusion of a massive change in access provider economics, in which businesses driven by supplying voice services, messaging, internet access and video either make a transition to substitute revenue drivers or consolidate to the point where available revenue and profit are clearly sustainable for the remaining providers.

And few doubt that fewer providers (at least at the tier-one level) is, for the most part, the direction the markets are headed. In recent years, policymakers and regulators have debated whether a mobile market with three or four providers best provides the benefits of competition while stimulating investment. Longer term, some might argue that only one or two providers actually can thrive in some markets. Some believe that, in some markets, nationalization will be the only viable option.

The larger observation is that we are quite some ways from knowing what stable access provider markets look like, in terms of market structure.

Verizon One Fiber Shows Shift of Possibility in Consumer Access Markets

Some are skeptical about Verizon’s One Fiber plan for Boston, said to be a test of the new economics of FiOS. Skeptics say Verizon really will use One Fiber to provide the transport and distribution network for coming small cell deployments. That much is likely correct. The potential disagreement is what happens once Verizon has made those investments.

Verizon suggests the fiber network, in place and generating value for the small cell network and mobile side of the business, will in turn create better economics for deploying additional fiber to consumer neighborhoods. Some see the announcement Verizon made in April 2016, that it was restarting its optical fiber deployment plan for Boston, as a case of “bait and switch,” arguing that Verizon implied full fiber-to-the-home construction.

To be fair, Verizon execs have talked about building a multi-purpose network supporting any number of uses, from IoT to mobility to consumer access. Whatever various observers might have read into the One Fiber announcement, what Verizon plans is perhaps not a traditional FiOS build but a distribution-network-first approach: fiber to towers and to small cell sites to support mobile and IoT communications, which also creates new economics for neighborhood deployment of FTTH.

In some ways, Verizon appears to have further adjusted its consumer access strategy using the “neighborhood” model pioneered by Google Fiber. Essentially, that new approach builds FTTH neighborhood by neighborhood, improving business-case outcomes in the early going.

The additional new thinking is that gigabit access (focusing on delivered bandwidth rather than access technology) will be a capability that can be delivered without using fiber to the home. Cable companies already sell gigabit internet access over hybrid fiber coax. Coming 5G mobile networks will, at least at first, also offer opportunity for fixed wireless at gigabit speeds.

So the big mental shift is towards consumer-received speeds and latency, not the access platform as such. Verizon likely believes (and many others tend to agree) that 5G, used to support fixed wireless, can deliver gigabit speeds to consumers. The One Fiber plan creates the infrastructure to do so.

In that sense, One Fiber does represent new investment that speaks directly to gigabit internet access for consumers, even if it does not necessarily always require FTTH construction.

That is among the many potential strategy choices Verizon and others can contemplate without choosing fiber-to-the-home platforms on a ubiquitous basis.

To be sure, NG-PON2 (Next-Generation Passive Optical Network 2) might help, as it specifies throughput of 40 Gbps, corresponding to up to 10 Gbps speeds for each subscriber. In a commercial sense, that might be “too much” bandwidth for today’s consumers and business models.
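As a rough sketch of that arithmetic (the wavelength count and split ratio below are illustrative assumptions; ITU-T G.989 permits other configurations):

# NG-PON2 (TWDM-PON) capacity arithmetic. Wavelength count and split
# ratio are illustrative assumptions; G.989 allows other configurations.
wavelengths = 4           # TWDM wavelengths on one fiber
per_wavelength_gbps = 10  # line rate per wavelength

print(f"aggregate capacity: {wavelengths * per_wavelength_gbps} Gbps")  # 40 Gbps

# A subscriber tuned to one wavelength can burst to the full line rate;
# the average share depends on how many users split that wavelength.
split_ratio = 32
print(f"peak per subscriber: {per_wavelength_gbps} Gbps")
print(f"average share at a 1:{split_ratio} split: "
      f"{per_wavelength_gbps / split_ratio * 1000:.0f} Mbps")  # ~312 Mbps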

Even 1 Gbps is more bandwidth than most consumers or businesses can meaningfully use, even granting that more bandwidth per user is a real benefit.

One Fiber is the sort of shift in optical fiber deployment and consumer access business model that might escape proper evaluation if we remain fixated on access platform rather than consumer-delivered bandwidth. Bandwidth matters; how we deliver it matters less than it once did. After many decades of arguing over which fixed network access platform is “better,” or has better economics, we are moving into an era when platform choices can be quite varied, while still meeting the internet access business goals: selling more bandwidth, at a profit, to more subscribers.

Wednesday, November 30, 2016

Altice to Become First Major U.S. Cable TV Operator to Abandon HFC in Favor of FTTH

In a major break with other leading U.S. cable TV providers, Altice USA, the fourth-largest U.S. cable TV company, announced plans to switch to a new fiber-to-the-home network. It appears ready to use proprietary technologies it has developed on its own, and it also appears to believe that “energy cost savings” will be substantial enough to allow construction of the FTTH network “within the existing capital budget.”

Any one of those actions--abandoning the hybrid fiber coax (HFC) platform, using its own proprietary technology, or building a brand-new network without boosting its capital budget--would be an unusual step. Taking all three at once is mind-boggling.

Of the three decisions, it is the clear break with HFC that stands out most starkly. The cable TV industry has insisted for decades that HFC is an extensible platform capable of supporting all future requirements. And the industry has argued for many decades that its platform was, in fact, superior to FTTH, in terms of its business model. In other words, HFC would allow cable to deliver all services, and better services, without the capital expense of starting over with FTTH.

Altice is breaking decisively with HFC, and will be the first major cable TV operator to abandon HFC in favor of FTTH. The strategic implications are enormous. If Altice winds up being correct, then perhaps HFC does not have the “legs” touted by its backers, even if 10 Gbps is on the cable industry’s HFC roadmap.

If Altice is correct--and DOCSIS and HFC really cannot support future bandwidth requirements--then there is at least a possibility that other cable TV operators will face unexpectedly high capital investment requirements they are not now modeling, as they would have to build wholly new networks, not simply upgrade edge and headend gear, as now is the case.

That would have implications for profit margins (lower), capital budgets (higher) and equity prices (probably lower).

Altice has plenty of experience with FTTH networks. Altice France is on track to reach 22 million fiber homes by the end of 2022, and Altice Portugal will reach the milestone of 5.3 million fiber homes passed by the end of 2020.

The five-year deployment will begin in 2017; the company expects to reach all of its Optimum footprint and most of its Suddenlink footprint within that timeframe.

Perhaps just as surprising, Altice expects to do so without a material change in its overall capital budget.

Some will be skeptical about one or more of the Altice claims. Some might argue Altice is right long term, but maybe wrong near term. Comcast, for example, uses HFC for all consumer locations, but spot-deploys an overlay fiber-to-the-home network for customers (business or consumer) who want to buy a symmetrical 2-Gbps internet access service.

In large part, that is a practical choice. Comcast does not immediately have a way to supply symmetrical 2-Gbps service over its HFC network, though it can supply asymmetrical 1-Gbps service over HFC.

The point--it will be argued--is that even if, at some point in the future, FTTH is necessary for consumer customers (no other major cable TV company has said this), HFC will continue to supply everything necessary for the foreseeable future.

The big danger of moving to FTTH, say HFC proponents, is over-investment that does not generate a reasonable financial return, for the intermediate future.

Nor are cable TV executives the only ones who believe there are many other ways of supplying bandwidth and internet access to consumers. AT&T, for example, seems to be a big believer in fixed wireless, and Verizon thinks fixed wireless will be the first commercial application for 5G networks in the United States.

Google and Facebook likewise are developing multiple new platforms using wireless access (balloons, unmanned aerial vehicles, fixed wireless, Wi-Fi, shared spectrum, possibly others).

If for no other reason than that Altice now will become the first major U.S. cable TV firm to abandon HFC--thereby calling into question the cable industry’s insistence that HFC essentially is future-proof--the move to FTTH is noteworthy.

Some skeptics undoubtedly will question the ability to build the new network without increasing capital budgets, the assumptions about operating cost savings, or the danger of using proprietary platforms.

Still, it is a history-making move.

India Mobile, Voice, Fixed Accounts Decline in August; Internet Access Grows

Though the trend is likely caused mostly by short-term developments, fixed network, mobile and voice accounts declined in India from July 2016 to August 2016, according to the Telecom Regulatory Authority of India. Many speculate that the currency retirement program now underway in India, retiring Rs 500 and Rs 1,000 notes, has many consumers focused on trading in their old currency rather than buying new mobile phones and service, at least for the moment.

That could change over the next several reporting periods, however, as Reliance Jio entered the mobile market in early September 2016, gaining 50 million new accounts. So the issue is what percentage of those accounts was taken from other suppliers (which would not affect overall net subscriber growth) and what percentage represents net new mobile accounts (people buying mobile service for the first time, not switching from another provider).

Internet access accounts (broadband) grew from 166.96 million at the end of July to 171.71 million at the end of August, a monthly growth rate of 2.84 percent.

Total mobile accounts in service declined from 1,034.23 million at the end of July to 1,028.88 million at the end of August, a monthly decline rate of 0.52 percent.

The fixed network subscriber base declined from 24.62 million at the end of July to 24.51 million at the end of August, a net decline of 0.44 percent.



source: Trak
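The month-over-month percentages quoted above follow directly from the TRAI totals; a quick check (figures in millions, as reported):

# Recompute the month-over-month rates from the TRAI totals cited above.
figures_millions = {
    "broadband": (166.96, 171.71),   # end of July, end of August
    "mobile": (1034.23, 1028.88),
    "fixed": (24.62, 24.51),
}

for service, (july, august) in figures_millions.items():
    change_pct = (august - july) / july * 100
    print(f"{service}: {change_pct:+.2f}% month over month")
# broadband: +2.84%, mobile: -0.52%, fixed: -0.45% (the 0.44 percent quoted
# above presumably reflects unrounded subscriber counts)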

Did AT&T Divestiture and Telecom Act of 1996 Both Fail?

“Unintended consequences” might be the most significant outcome of the last two major transformations of U.S. telecom law. One might argue that happened because, despite best efforts, U.S. telecommunications law was backward-looking, something analogous to generals planning to “fight the last war.”

In retrospect, the biggest issue was the framing of problems. The breakup of the AT&T system--a historical anomaly, as it turned out--was designed to “solve” the problem of high long distance prices. The Telecommunications Act of 1996 was intended to “solve” the problem of high prices for local telecommunications services.

The 1984 divestiture completely missed the coming role of mobile services. In fact, mobile arguably had more to do with falling long distance prices than did competition among fixed network service providers.

A good argument can be made that the last two big revamps of U.S. telecommunications law were similar in one striking way: they were based on false or incorrect assumptions, and “failed.”

The 1982 “consent decree” that broke up the AT&T system, for example, attempted to create a competitive new telecom market by splitting long distance service from local telephone service, the theory being that competition for long distance voice services would be enhanced by creating seven new firms controlling local service, plus a deregulated AT&T restricted to long distance services.

The U.S. Department of Justice concluded in 2007 that divestiture did not work as expected, and that similar outcomes (much lower prices and much higher usage) would have been produced by less-complicated measures.

In other words, the big “missed” assumption was the rise of mobility as the primary way consumers would choose to use voice services. Nor, in retrospect, did the consent decree anticipate that mobile service providers would adopt pricing policies that eliminated the difference between a “local” and a “long distance” call. That “death of distance” pricing essentially killed the long distance business as a distinct industry segment, revenue and profit source.


The 1996 Telecommunications Act, likewise, was supposed to introduce local telecom competition, in the same way the 1984 breakup of the Bell system was intended to spur competition in long distance voice.

The Act opened the “local” telecom business to competition, initially driven by mandatory wholesale policies that gave new competitors access to existing facilities.

When those policies failed to produce investment in new facilities, they were replaced by a reliance on “facilities-based competition.” That policy, in turn, led to the rise of cable TV providers as the ubiquitous telco alternative in most markets.

A bigger “failure” was the belief that policy to promote competition essentially meant measures to support more competition for “local voice” services. That meant allowing new contestants to deploy voice switches and gain wholesale use of local access facilities owned by the dominant local telco.

What was missed? The internet. Ironically, to the extent the Telecommunications Act of 1996 has succeeded, it is because of value created by the internet and its apps and services, not new competition for local voice services.

The point is that, however well intentioned, major efforts to revamp communications policy have succeeded (on a generous interpretation) “despite the new policies” as much as “because of the new policies.”

It is the two major unintended developments--mobility and the internet--that have led to higher value and lower prices for consumers, not the intended changes (breaking up AT&T, opening local telecom to competition). In the case of the divestiture, policymakers missed mobility; in the case of the Telecom Act, they missed the internet.

The point is that a huge dose of humility should be brought to the whole process of shaping policy to promote investment and competition in communications facilities and services. Our track record is not very good.

Tuesday, November 29, 2016

Will 2017 be the Year the Fixed Network Business Model Crashes?

Will 2017 be the year the global fixed network telecom business goes negative, or upside down, on a cash basis?

Yes, say researchers at the Economist Intelligence Unit. Their 2017 telecom forecast predicts that, by the end of 2017, the global fixed networks business will go negative. In other words, annual revenues will be less than the investment required to operate the business.

That necessarily will start--or accelerate--a huge process of rethinking the role, scale and scope of fixed networks. Over the long term, fixed networks cannot be operated at a permanent loss, much less justify continual investment in higher speeds and capabilities, as revenue drops.

That calls into question not only the future role and customers of the fixed network, but also the platform, marketing and operating costs required to sustain the business.

In simple terms, revenue no longer will cover fixed or variable costs in the business. That is a big, big deal.


