Sunday, May 27, 2018

Spectrum Supply and Demand are About to Go "Unstable"

It always is possible to get a robust debate about "whether enough spectrum is available." It might soon be possible to get a robust debate on whether spectrum prices will drop, based on increases in supply.

On one hand, demand keeps growing, so even if orders of magnitude of new supply are added, supply and demand should remain in equilibrium, where prices are stable and supply matches demand, as economists like to say. 

On the other hand, many of you might look at your own experience in the communications business and not agree that the business is in equilibrium, and that applies to the value and price of acquired spectrum, as well as expectations about value and price as markets evolve. 

For example, though it is hard to place a financial value on Wi-Fi access for mobile operators, businesses or consumer end users, the ability to offload huge amounts of mobile internet access demand to Wi-Fi has clear value for network operators, which do not have to invest as much in capacity as they otherwise would. 

In the U.S. mobile market, various participants also see multiple ways to increase the effective amount of capacity they can use. Buying new spectrum licenses is but one way to do so. In other cases a shift to small cells is seen as a reasonable alternative to acquiring new spectrum. That is basically Verizon's stated position.

In other cases, firms can buy companies that own licenses, as both Verizon and AT&T have done, and others might do. Dish Network has a trove of spectrum which must be put to commercial use or the licenses are lost. So many believe Dish ultimately will sell the licenses to a firm that can do so, fast. 

Mobile service provider mergers or acquisitions are another way to acquire additional spectrum.

But new techniques are coming, including the ability to aggregate unlicensed spectrum with licensed spectrum; access to shared spectrum that might cost less, or be accessible in unlicensed mode; plus huge increases in the amount of licensed and unlicensed spectrum available for mobile and other uses. 

So a good argument can be made that spectrum equilibrium is less likely. 

Recent spectrum auctions have diverged from expected sales values. In the past, mobile operators also have paid too much for spectrum.  

Recent U.S. spectrum auctions show mobile service providers being much more cautious about what they are willing to spend on buying spectrum licenses. The same trend was evident in recent spectrum auctions in India as well.  

In part, that is likely due to a perception that there are other ways of sourcing additional capacity, from aggregating unlicensed spectrum to use of smaller cells to shared spectrum or acquiring assets already awarded, but not yet in use. In some markets, spectrum trading also is a solution.  

But it also is possible that the perceived value of spectrum--still high--has to be matched against expectations about the amount of revenue incremental spectrum can generate. If operators believe a given amount of new spectrum will not drive the same amount of revenue as in the past, then their willingness to invest in spectrum, on a per-unit basis, will be lower.

Also, coming physical supply is disruptive, to say the least. All presently-licensed mobile spectrum, plus all Wi-Fi spectrum, plus new shared spectrum, amounts to about 2,600 MHz in the U.S. market. The actual mobile and Wi-Fi spectrum is closer to 800 MHz to 1,000 MHz.

But the Federal Communications Commission is releasing an order of magnitude more physical spectrum, much of it unlicensed. That new supply, plus possibly two orders of magnitude of virtual capacity increases, plus spectrum sharing, plus small cells, plus better radios, is bound to be disruptive.
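
As a rough back-of-the-envelope illustration of that shift, the sketch below simply restates the round numbers above; the multipliers reflect the "order of magnitude" language in this post and are assumptions, not measured figures.

```python
# Back-of-envelope sketch of the supply shift described above. All figures are
# the round numbers from this post; the multipliers ("an order of magnitude,"
# "possibly two orders") are assumptions, not measurements.

current_total_mhz = 2_600        # licensed mobile + Wi-Fi + new shared spectrum
current_mobile_wifi_mhz = 900    # midpoint of the 800 MHz to 1,000 MHz estimate
new_physical_mhz = 10 * current_total_mhz   # "an order of magnitude more" physical spectrum
virtual_capacity_multiplier = 100            # "possibly two orders" of effective-capacity gain

print(f"New physical supply vs. all current spectrum: {new_physical_mhz / current_total_mhz:.0f}x")
print(f"New physical supply vs. today's mobile and Wi-Fi spectrum: "
      f"{new_physical_mhz / current_mobile_wifi_mhz:.0f}x")
print(f"On top of that, effective capacity per MHz could rise by perhaps "
      f"{virtual_capacity_multiplier}x from sharing, small cells and better radios")
```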

Supply and demand is at work, in other words. And if supply increases by an order of magnitude or more, prices should fall, all other things being equal.

So how much will 5G change service provider spectrum valuation and asset models? Quite a lot. In fact, consultants at Deloitte say that “5G changes everything.” That might be a bit of hyperbole, but the point is that there is greater uncertainty, for several reasons.

For starters, it is an often-overlooked fact that the value of spectrum licenses is part of the equity value of any public mobile service provider company.

Spectrum licenses account for “an average 35 percent of the assets of US WSPs (wireless service providers), and close to 20 percent of WSPs elsewhere,” according to consultants at Deloitte.

But present valuations are carried at original purchase value, and therefore might not reflect actual market value in an era of growing spectrum demand and supply. At one level, the potential mismatch is easy to illustrate.

Assets for which an operator overpaid represent more book value than similar assets for which an operator paid less, even if the assets acquired at lower cost are equally, or more, useful. So an accounting “fiction” is at work.

Still, historically, rights to use mobile spectrum have been fundamental drivers of the ability to be in the business and earn revenue. But there are new questions in the 5G and coming eras, as the supply of spectrum (physical and virtual) is changing by orders of magnitude.

And how does one account for the value of being able to offload traffic to Wi-Fi? That avoided capital investment is worth something, but how much? And even if valuable, can it be reflected in an assessment of equity value?
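
One crude way to frame the question is to price the offloaded traffic at what it would have cost to carry on the mobile network. The sketch below uses entirely hypothetical inputs (traffic per subscriber, offload share, cost per gigabyte), chosen only to show the shape of the calculation.

```python
# Minimal sketch: valuing Wi-Fi offload as avoided mobile-network cost.
# Every input below is a hypothetical placeholder, not a figure from this post.

subscribers = 1_000_000
monthly_gb_per_sub = 8        # assumed average mobile data consumption
offload_share = 0.6           # assumed share of traffic carried over Wi-Fi
mobile_cost_per_gb = 0.50     # assumed fully loaded cost to carry one GB on the mobile network

offloaded_gb = subscribers * monthly_gb_per_sub * offload_share
avoided_cost = offloaded_gb * mobile_cost_per_gb
print(f"Offloaded traffic: {offloaded_gb:,.0f} GB per month")
print(f"Avoided mobile network cost: ${avoided_cost:,.0f} per month")
```

Whether such an avoided-cost figure can actually be reflected in equity value is, as noted, a separate question.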

Scarcity also matters. Historically, mobile spectrum has had value in two or more ways. It has been the necessary precondition for conducting business and satisfying demand. But it also has been a means of denying competitors access.

Licensed spectrum has been a driver of scarcity, and therefore equity value.

Deloitte argues spectrum presently is undervalued. On the other hand, one might argue that so much new spectrum is coming, and the ways to use unlicensed spectrum are multiplying so quickly, that old rules of thumb about value and pricing no longer work so predictably.

Cable operators, for example, clearly see lots of value in using their distributed public Wi-Fi nodes as infrastructure for their new mobile services. The “Wi-Fi first” access model reduces capex, wholesale capacity purchases, or both.

And though the correlation is not strict, since mobile operators can increase capacity in other ways, the amount of spectrum a mobile operator can deploy is linked to the amount of revenue it earns. But each contestant has other assets to deploy (capital, brand, scale), so the relationship is neither strictly linear nor simply causal.

In each market, some operators earn more revenue than others, for reasons including, but not limited to, the amount of spectrum they can deploy.

The point is that it is not clear whether spectrum presently is undervalued or not. The harder question is how to value such assets in the future, when the amount of supply--ignoring quality issues--is going to increase by an order of magnitude, and the effective capacity is going to increase by possibly two orders of magnitude.

Qualitative changes also will matter. Most internet of things apps will not require much bandwidth. And much bandwidth presently consumed across the backbone might in the future be cached and processed at the edge of the network. That will shift the bandwidth demand curve in significant ways.

On the other hand, if mobile networks are to challenge fixed networks as platforms for consumer internet access, then lots of cheap new bandwidth will be necessary, so mobile alternatives can offer comparable bandwidth and prices. Lower bandwidth costs are coming, in the mobile area, driven by platform improvements, more and more-efficient spectrum assets, use of small cells and shared, unlicensed and aggregated spectrum options.  

Mobile bandwidth traditionally has been an order of magnitude more expensive than fixed network bandwidth. To compete, then, mobile bandwidth has to become roughly as capacious and affordable as fixed network bandwidth.


Up to this point, mobile cost per gigabyte has been as much as an order of magnitude more costly than fixed network cost per gigabyte. That is going to change.
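
A hedged sketch of that gap, using placeholder costs chosen only to match the order-of-magnitude difference described here:

```python
# Illustrative cost-per-gigabyte comparison. The dollar figures are placeholders
# chosen only to reflect the order-of-magnitude gap described above.

fixed_cost_per_gb = 0.10   # hypothetical fixed-network cost per GB delivered
mobile_cost_per_gb = 1.00  # hypothetical mobile cost per GB (roughly 10x higher)

gap = mobile_cost_per_gb / fixed_cost_per_gb
reduction_needed = 1 - fixed_cost_per_gb / mobile_cost_per_gb
print(f"Mobile-to-fixed cost gap: {gap:.0f}x")
print(f"Mobile cost per GB would have to fall about {reduction_needed:.0%} to reach parity")
```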

Friday, May 25, 2018

No Demand for Gigabit?

Among the most-dangerous of statements is that something cannot be done, violates the laws of physics, costs too much, is too difficult to manage, or is not wanted by consumers or other users.

The reason such statements can be quite dangerous is that they are sometimes spectacularly wrong, in ways that dramatically affect whole industries.

I can remember being at a meeting at the headquarters of the National Cable Television Association, in the earlier days of high definition television discussions, where it was proposed that a full HDTV signal could be squeezed from about 45 Mbps of raw bandwidth to the 6-MHz channelization used by the North American television industry.

The room essentially exploded, as the attendees, mostly vice presidents of engineering from the largest cable TV and broadcast firms, disputed the sheer physics of the proposal. Later, the executive who suggested HDTV in 6 MHz was indeed possible talked with his firm’s engineering vice president about the science, to reaffirm that such a thing actually could be done. “Are you sure about this?” was the question, given the magnitude of opposition.

To make a longer story short, it did prove feasible to compress a full HDTV signal into just 6 MHz of bandwidth, making for a much-easier financial transition to full HDTV broadcasting, as well as an ability for cable TV operators to support the new format.
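
For readers who want to check the arithmetic, the sketch below compares the spectral efficiency a 6-MHz channel would need to carry the 45 Mbps figure against the roughly 19.4 Mbps payload that ATSC broadcasting eventually carried in that channel. The point is simply that compression, not raw modulation, is what made the claim work; the 19.4 Mbps figure is an approximation used for illustration.

```python
# Quick sanity check on "HDTV in 6 MHz." The 19.4 Mbps figure is the approximate
# payload rate ATSC 8-VSB broadcasting eventually carried in a 6 MHz channel,
# used here only to illustrate why compression made the idea workable.

channel_hz = 6e6               # North American TV channel width
raw_hdtv_bps = 45e6            # the ~45 Mbps figure cited above
compressed_hdtv_bps = 19.4e6   # approximate ATSC payload in 6 MHz

print(f"Efficiency needed for 45 Mbps in 6 MHz: {raw_hdtv_bps / channel_hz:.1f} bits/s/Hz")
print(f"Efficiency needed for ~19.4 Mbps in 6 MHz: {compressed_hdtv_bps / channel_hz:.1f} bits/s/Hz")
print(f"Implied compression ratio: {raw_hdtv_bps / compressed_hdtv_bps:.1f}:1")
```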

Similarly, when the U.S. cable TV industry began to ask for analog optical transmission systems capable of carrying 20 channels of standard definition video without complicated channel-by-channel coding and decoding, a distinguished engineer from Bell Laboratories privately assured me that such a thing was in fact not possible, and that people who claimed it was possible were simply wrong.

To make a longer story short, it did indeed prove possible to take a full complement of analog video signals (40 channels, as it turned out), convert the full set of broadband signals to analog optical format, and deliver them over distances useful for cable TV purposes.

On another occasion, the vice president of one of the world’s biggest suppliers of equipment said privately that “digital subscriber line does not work” as a platform for high speed Internet access, even at relatively low speeds. Ultimately, that also proved incorrect. Over time, DSL performance was not only proven to be commercially viable, but also delivered much-faster speeds, over longer distances, as experience was gained.

The point is that when a smart, experienced, thoroughly-knowledgeable executive says that something “cannot be done,” one has to translate. What the statement means is only that, at a given point in time, before the application of effort and ingenuity, a given entity has not been able to do something.

That does not actually mean something literally “cannot be done.” Quite often, formerly impossible things actually are made possible, after dedicated investigation and development.

That applies to consumer demand for internet access, as well. It might well have been true two decades ago, or a decade ago, that there was no appreciable consumer demand for gigabit internet access, if such services could even have been provided, at the retail prices those services would have commanded back then.

Consider that when Microsoft was founded in 1975, the computing power of an Apple iPad 2 would have cost, in constant dollars, somewhere between US$100 million and $10 billion.

But when gigabit internet access is priced at a relatively modest premium to the standard offers (gigabit for $100, standard 200 Mbps for possibly $50 to $70), then there is much more demand.
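
The per-megabit arithmetic, using the illustrative prices just cited, shows why demand shifts: the gigabit tier is actually the cheaper product per unit of speed.

```python
# Price-per-megabit comparison using the illustrative retail prices above.

gigabit_price, gigabit_mbps = 100, 1000
standard_low, standard_high, standard_mbps = 50, 70, 200

print(f"Gigabit tier: ${gigabit_price / gigabit_mbps:.2f} per Mbps")
print(f"Standard tier: ${standard_low / standard_mbps:.2f} to ${standard_high / standard_mbps:.2f} per Mbps")
```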

“There is no demand for gigabit internet access” is a conditional statement. At some point, there will be lots of demand, on both mobile and fixed networks. All that has to happen is that price changes.

Comcast, for example, has increased the highest offered internet access speed at nearly Moore's Law rates. So the unstated qualifiers “at this price, at this time,” have to be kept in mind.
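
To see what "nearly Moore's Law rates" implies, the sketch below assumes the top advertised speed doubles roughly every 18 months from an assumed 1 Gbps baseline; both the baseline and the doubling period are illustrative assumptions, not Comcast figures.

```python
# Sketch of speed growth at a Moore's-Law-like pace: doubling roughly every
# 18 months. Baseline speed and doubling period are illustrative assumptions.

base_speed_gbps = 1.0
doubling_period_years = 1.5

for years in (3, 6, 9):
    projected = base_speed_gbps * 2 ** (years / doubling_period_years)
    print(f"After {years} years: roughly {projected:.0f} Gbps top tier")
```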

There will be a time, and a price, that has many, if not most, consumers buying gigabit rate internet access.

Global WANs Now Driven by Hyperscale Data Center Locations

Technology Predictions are Often Spectacularly Wrong

Ernest Rutherford, the physicist who won a Nobel Prize in chemistry for his work on nuclear physics, once experimented with radio waves, but gave it up when told radio had no future.

The point is that even the best and brightest minds in technology often are very wrong about how a particular innovation will develop. So humility is not a bad attitude for any market researcher to adopt.

In 1943, Thomas Watson, IBM CEO, said “I think there is a world market for maybe five computers.”

Ken Olson, Digital Equipment Corp. president, in 1977 said “there is no reason for any individual to have a computer in his home.”

Western Union execs once argued the telephone “is inherently of no value to us.”

Thomas Edison said “fooling around with alternating current is just a waste of time.”

Others argued the automobile was a novelty or fad. Studio executive Daryl Zanuck once said “television won’t be able to hold on to any market it captures after the first six months.”

Even Marty Cooper, a mobile phone pioneer, once argued that “cellular phones will absolutely not replace local wire systems.”

Robert Metcalfe, a father of Ethernet, said “the internet will soon go spectacularly supernova and in 1996 catastrophically collapse.”

Steve Ballmer, Microsoft CEO, once said “there’s no chance that the iPhone is going to get any significant market share.”

The point is that the best and brightest among us, and certainly even competent market researchers, can be spectacularly wrong. That might be especially the case when subjects we do not routinely cover start to reshape industries, markets and possibilities, but such dangers always exist, even in markets we do claim to cover.

“You don’t know what you don’t know” is one explanation for forecasting error. No forecaster is able to incorporate largely unforeseeable events such as major recessions, huge changes in underlying core technologies, wars, revolutions and other exogenous events that directly affect markets of any sort.

Also, we tend to work with two-dimensional spreadsheets. Reality necessarily is more complex than that. So humility always is a good attitude to maintain when making predictions.

Why Software-Defined Networks are Destined to be More Important

There is a good reason why software-defined networks are getting more attention. Simply, all business and consumer networking use cases are becoming “cloud-based” interactions that are themselves largely virtualized.  

For enterprises and businesses, that tends to mean a shift towards virtual private networks supporting mobile, remote and distributed users who require access to cloud-based computing resources, no matter where they are working. That also implies a need for security and quality of service support provided by VPNs.


So a reasonable person might conclude that the shift in enterprise networking will be in the direction of software-defined networking (SD-WANs, VPNs, virtualized core networks, network slicing).

Call Center Experience the Weakest Part of Service Provider Performance

Unhappiness with call center interactions seems to be the key reason for low customer satisfaction scores with linear video subscription services (no direct question about price or value-price relationship is asked).

And though fixed-line voice service ranks higher, the same pattern of unhappiness with call center experience occurs there. Call center experience also is the lowest-ranked feature of mobile satisfaction tracked by ACSI.

I cannot remember a time in three decades when cable TV services got high satisfaction ratings. Nor, since the American Customer Satisfaction Index (ACSI) began tracking internet service providers, do I recall ever seeing high satisfaction ratings for internet access service, either.

In fact,  “subscription television and internet service providers rank last among all industries tracked by the ACSI,” a placement that has been consistent for several years.

It appears much of the problem lies with customer interaction and customer support, especially the use of call centers.

Mobile phone services, on the other hand, score much higher, though not as high as the devices themselves. Fixed network phone service, perhaps paradoxically, now scores higher satisfaction than subscription TV or ISP service. The likely explanation there is that all the unhappy customers have left.

The 2018 annual ACSI report on such services only confirms the trend.


Customer satisfaction with subscription television service fell 3.1 percent to an ACSI score of 62, an 11-year low, ACSI says.

New indices for video on demand and video streaming initially show higher scores, with video streaming services on par with mobile phones.  



Thursday, May 24, 2018

Moore's Law Really Does Matter

Moore’s Law and optical fiber matter when it comes to fixed network internet access speeds.

Back in the early 1980s, when I first got into the cable TV business, many rural systems were operating at less than 200 MHz of total analog bandwidth, the first big city franchises were about to be awarded, and the state of the art was systems promising to operate at 400 MHz to 450 MHz.

All that was before optical fiber and the hybrid fiber coax architecture, and before the need for reliable two-way communications or data services.

Because of Moore’s Law advances and optical fiber, HFC physical bandwidth now pushes between 750 MHz and a gigahertz, and internet access services now reach between hundreds of megabits per second and a gigabit per second, using DOCSIS 3.1.


It is possible, perhaps likely, that bandwidth will grow further beyond planned improvements to DOCSIS 3.1.
Indeed there is early speculation about what might be possible with next-generation DOCSIS that harnesses new spectrum ranges. Other proposed ways of increasing symmetrical bandwidth require all-fiber networks and full-duplex networking, where the same bandwidth is used for both upstream and downstream communications.

The point is that advances in computing power, with lower prices, plus optical fiber, make possible amounts of commercial bandwidth that would have been unthinkable back in the early 1980s.

Nobody Knows "How Many" Facilities-Based Telcos Can Exist in a Mature Market

In most fixed network telecom markets, the reality is that only a single facilities supplier is financially sustainable on a national basis, so competition usually takes the form of wholesale obligations. Mobile markets historically have featured at least two to four facilities-based competitors.


But as in the fixed network market, there are questions about sustainable numbers of contestants as the market matures. Over time, fewer competitors are generally expected.


The big issue for regulators is how few competitors are required to provide the benefits of competition, but on a sustainable basis. And that answer is not yet known.


“The idea that the U.S. mobile market has an equilibrium of four firms (nationally, at least) is an emotional and not a scientific conclusion,” said George Ford, Phoenix Center for Advanced Legal and Economic Policy Studies chief economist.


In other words, four national providers might not be sustainable. That view, Ford argues, is entirely consistent with the financial struggles of Sprint and T-Mobile US. Even Arcep, the French regulator, now hints that it might allow consolidation in the French mobile market that it long has resisted.


Still, the U.S. Department of Justice said in 2011 that the transition from four to three mobile providers in the U.S. would constitute an unacceptable reduction in the number of competitors.


That combination of AT&T and T-Mobile US would have raised market concentration scores on the Herfindahl-Hirschman Index (HHI) by more than 400 points, a level guaranteed to raise antitrust scrutiny.


Such a score is not an absolute barrier to any particular merger, but places a strong burden on the proponents to show why the merger is not anticompetitive.
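
For readers unfamiliar with the index, a minimal sketch: the HHI is the sum of squared market shares, so a four-to-three combination of two mid-sized carriers can easily add several hundred points. The shares below are hypothetical, not actual 2011 figures.

```python
# Minimal HHI sketch. Market shares below are hypothetical, chosen only to
# show why merging two of four national carriers can raise the index by
# well over 400 points.

def hhi(shares_percent):
    """Herfindahl-Hirschman Index: sum of squared market shares, in percent."""
    return sum(s ** 2 for s in shares_percent)

pre_merger = [34, 30, 18, 18]   # hypothetical four-carrier market
post_merger = [34, 30, 36]      # the two smallest carriers combine

print(f"Pre-merger HHI: {hhi(pre_merger)}")
print(f"Post-merger HHI: {hhi(post_merger)}")
print(f"Change: +{hhi(post_merger) - hhi(pre_merger)} points")
```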


Though “what is the market?” is not a big question for regulators who will look at the Sprint merger with T-Mobile US, there are going to be bigger questions for some of the other possible mergers, starting with AT&T and Time Warner.

Such foundational questions about the relevant market also are likely to be faced by regulators if they look, in the future, at market concentration in application markets led by the likes of Google, Facebook and Amazon.

Talk Talk Sales Show Value of Indirect Channel

There is a reason why many products are sold using indirect (partner) channels. This chart from Talk Talk in the United Kingdom illustrates the basic economics. Talk Talk sells 83 percent of its products wholesale, using indirect channels (channel partners).

The traditional reason for using indirect channels is that a supplier cannot afford to sell direct to customer segments targeted by the channel partners.

In fact, Talk Talk reports that earnings (cash flow) from the indirect channels used to sell to consumers are about the same as cash flow from the business customer segment which is sold using a direct sales force.

source: Talk Talk

Wednesday, May 23, 2018

Why 4K/8K TV is a Waste for Most People

For most consumers, 4K and 8K TVs are unlikely to provide an actual experience boost, despite the denser pixel count. The reason is that the human eye cannot distinguish 4K or 8K from 1080p unless a person sits uncomfortably close to the screen, or unless the screen is really huge. Simply put, 4K is a waste of money, as 8K will be, for most people.

Most people simply do not sit close enough to the screen to perceive the difference 4K or 8K can provide.  
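
A simplified visual-acuity model makes the point concrete. Assuming a viewer resolves detail down to about one arcminute, the sketch below estimates how close one would have to sit to a 65-inch screen (an illustrative size) to fully resolve each format.

```python
import math

# Simplified viewing-distance model: a viewer is assumed to resolve detail
# down to about one arcminute. The 65-inch screen is an illustrative choice.

ARCMINUTE_RAD = math.radians(1 / 60)

def max_viewing_distance_in(diagonal_in, horizontal_pixels, aspect=(16, 9)):
    """Distance within which individual pixels are still just resolvable."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)
    pixel_in = width_in / horizontal_pixels
    return pixel_in / math.tan(ARCMINUTE_RAD)

for label, pixels in (("1080p", 1920), ("4K", 3840), ("8K", 7680)):
    feet = max_viewing_distance_in(65, pixels) / 12
    print(f"65-inch {label}: full detail visible only within about {feet:.1f} feet")
```

On that model, the full benefit of 4K on a 65-inch screen shows up only within roughly four feet of the screen, and 8K within about two.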

It is obvious why consumer electronics companies want to sell you new TVs. TVs no longer break, and manufacturers need new reasons for you to buy a new screen and move the existing screen to a bedroom or elsewhere in a house.

Content developers have their own reasons for wanting higher resolution: it is part of the decades-long effort to create greater realism and experiential immersion.


The trend to bigger screens therefore makes sense. Either people have to move closer to their screens, or screens have to get much bigger. Bigger screens probably are the only realistic option.

But 4K and 8K really make sense for business, medical, industrial and other applications where a human operator actually is very close to a screen with very-rich detail.

Tuesday, May 22, 2018

Is Proposed Hillsboro, Ore. Municipal ISP Network Viable?

Building a $66 million, municipal ISP network would be "marginally" viable at a 28 percent "take" rate, a study by Colorado firm Uptown Services predicted in 2015.

That might be an optimistic expectation of market share for any well-run ISP operating a fiber-to-home network and competing against a telco and a cable TV operator where one or both of those competitors are vulnerable because they have not, or cannot, invest in their own networks. 

Much hinges on whether the Hillsboro network plans also to sell video service or voice. If not, actual take rates might be as low as 20 percent, and possibly lower.

Many municipal ISPs that report adoption rates (penetration, or the percent of homes passed that actually buy service) see those rates boosted by their sales of video and voice services. So the adoption rate is based on “units of service sold,” not the “number of homes buying service,” as the table below shows.

At least so far, where a municipal ISP offers only internet access, early adoption rates--even with highly competitive prices--have been in single digits.

Penetration: Units Sold or Homes Buying Service?

                                       Morristown   Chattanooga   Bristol   Cedar Falls   Longmont
Homes passed                               14,500       140,000    16,800        15,000      4,000
Subscribers (units of service sold)         5,600        70,000    12,700        13,000        500
Units sold (percent of homes passed)          39%           50%       76%           87%        13%
Services sold                                   3             3         5             3          2
Units per home (0.66 x services sold)           2             2         3             2          1
Homes served                                2,828        35,354     3,848         6,566        379
Penetration (percent of homes buying)         20%           25%       23%           44%         9%
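
The adjustment the table makes can be reproduced with a few lines of arithmetic: divide units sold by an assumed 0.66 take of the services offered to approximate the number of homes actually buying service. The sketch below uses the table's own figures and assumption.

```python
# Reproducing the table's adjustment from "units of service sold" to
# "homes buying service," using its assumption that a subscribing home
# takes about 0.66 of the services offered.

systems = {
    # name: (homes_passed, units_sold, services_offered)
    "Morristown":  (14_500,  5_600, 3),
    "Chattanooga": (140_000, 70_000, 3),
    "Bristol":     (16_800, 12_700, 5),
    "Cedar Falls": (15_000, 13_000, 3),
    "Longmont":    (4_000,     500, 2),
}

for name, (passed, units, services) in systems.items():
    homes_served = units / (services * 0.66)
    print(f"{name}: units-sold rate {units / passed:.0%}, "
          f"homes-buying penetration {homes_served / passed:.0%}")
```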

Some private ISPs would, and have, taken such chances. Numerous cities and towns seem to be considering the option, as well.

The consultants estimate the Hillsboro municipal ISP operation would reach cash positive operations in 13 or 14 years, using the $50 per month benchmark. That might be too optimistic. Higher prices seem to be part of the business model for other municipal broadband networks.  
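
A minimal capital-recovery sketch (not the consultants' cash-flow model) shows how stretched the timeline can be. The footprint size, operating cost per subscriber and other inputs below are hypothetical assumptions, not figures from the study; only the build cost, price and take rate come from the numbers cited above.

```python
# Very rough capital-recovery sketch for a municipal ISP. Inputs other than
# the $66 million build cost, $50 price and 28 percent take rate are
# hypothetical assumptions; financing costs, churn and build ramp are ignored.

capex = 66_000_000
homes_passed = 30_000          # assumed footprint size
take_rate = 0.28
monthly_price = 50
monthly_cost_per_sub = 30      # assumed operating cost per subscriber

subscribers = homes_passed * take_rate
annual_cash_flow = subscribers * (monthly_price - monthly_cost_per_sub) * 12
print(f"Subscribers: {subscribers:,.0f}")
print(f"Annual operating cash flow: ${annual_cash_flow:,.0f}")
print(f"Simple years to recover the build cost: {capex / annual_cash_flow:.0f}")
```

Under those purely illustrative assumptions, recovering the build cost takes decades, which helps explain why few private investors would take on the same project.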

But city officials have decided to build the municipal broadband network anyway. It will not be easy.

Municipal ISPs enjoy no advantages in capital investment and perhaps marginal advantages in the make-ready and pole attachment cost areas. Any hope for enough operating efficiencies to sell service at $50 a month would presumably have to come in marketing and operating cost areas comparable to best practices seen at some private ISPs (Sonic, Tucows).

If successful, such networks generally result in lower prices, to be sure. But the proposed Hillsboro network might be seen as a key test of whether such networks can compete in suburban markets.

Traditionally, the opportunity for municipal broadband has seemed more realistic in rural markets and for smaller towns. The Hillsboro network might be likened to the network Ting is building in Centennial, Colo., a reasonably prosperous suburb of Denver.

Hillsboro possibly will be an important test case of the business model. Few private investors would be able to wait more than a decade simply to reach cash flow positive status, to say nothing of earning enough money to earn a profit after two decades or so.

On the Use and Misuse of Principles, Theorems and Concepts

When financial commentators compile lists of "potential black swans," they misunderstand the concept. As explained by Nassim Taleb ...