Thursday, June 1, 2017

40 Years of Differences

In January 1978, when the first Pacific Telecommunications Council conference was held, the world was quite different.

  • Fewer than 7% of the world’s people had telephone service
  • Telecom was a monopoly and most firms were government owned
  • Nobody used a mobile phone
  • There was no Internet, no Ethernet, no browsers
  • 82 analog voice circuits connected Hawaii and Australia/New Zealand
  • Modems were acoustic and operated at 300 bps
  • Global telecom revenue and profit were driven by voice, especially long distance
  • “Billions” of people had never made a phone call
  • The business model was simple: build networks, earn a guaranteed return

Now, as the conference celebrates its 40th anniversary, we all live in a world where:

  • Usage has migrated from voice to data to video
  • Bandwidth routinely is measured in terabits per second
  • There are 7.9 billion mobile phone accounts, used by 4.8 billion people
  • Telecom is part of the internet and computing ecosystems
  • Most telecom markets are fiercely competitive
  • All legacy revenue streams are under pressure, and new revenue models must be created
  • Cloud computing, OTT, 5G, smart cities and internet of things are top of mind
  • The business model is anything but certain, and every legacy service is mature or soon to be mature

U.S. Ranks 10th for Mobile Internet Speed: Why That is Not a Problem

Less often than in the past, one hears it said that the United States has a broadband problem. Costs are said to be too high, speeds too low, choice inadequate. That is true, in some locations, to some extent.

At times over the past decade, it has been argued that, when it comes to fixed network internet access, the United States was “behind” in coverage, usage or speed.

The digital divide continues to be an issue in rural areas, but arguably is a more complicated issue these days, since some users prefer mobile-only access and some people say they do not use the internet simply because they do not wish to do so.

Some might also argue that the way people and nations use the internet also matters, not simply availability, price or speed.

International comparisons can be instructive, though sometimes not for the reasons one suspects. Consider voice adoption, where the best the United States ever ranked was about 15th, among nations of the world, for teledensity.

For the most part, nobody really seemed to think that ranking, rather than higher on the list, was a big problem, for several reasons. Coverage always is tougher for continents than for city states or small countries. Also, coverage always is easier for dense urban areas than rural areas. The United States, like some other countries (Canada, Australia, Russia), has vast areas of low population density where infrastructure is very costly.

On virtually any measure of service adoption (voice or fixed network broadband, for example), it will be difficult for a continent-sized market, with huge rural areas and lower density, to reach the very-highest ranks of coverage.

That remains the case for mobile internet coverage or mobile internet average speeds, where, according to Akamai, the United States ranks about 10th.

source: Akamai

Will Edge Computing and Low-Latency Services Allow ISPs to Move Up the Stack?

Most moves made by most tier-one telcos “up the stack” have not worked well, if at all, and that includes early moves into computing, data center operations, app stores, appliances and devices, over the top voice and messaging apps, or even OTT video services.

The jury still is out on moves into banking services, mobile advertising and content, but many telcos have fared rather well in the linear video subscription areas.

And though it is a statement of vision more than a practical reality at the moment, AT&T believes that, with a move to pervasive computing (which is one way to describe what “internet of things” is about), there is an inherent ability to embed higher-value operations into the network.

“The network itself moves from a connection to an experience that can include the compute,” said John Donovan, AT&T chief strategy officer.

In other words, even if data warehouses generally have proven to have modest strategic value for access providers (telcos and other access providers), that might well change as services and apps are created that rely on edge computing support.

As a horizontal business model, edge computing support could emerge as an area where telcos and other access providers actually have some advantages: dense networks, access to power, and real estate and network elements that could play a role in supplying edge compute services to third parties.

Consider other potential advantages. AT&T’s new AirGig platform, for example, offers the promise of affordable trunking anywhere above ground where there are power utility poles and transmission lines.

Even if that is not so crucial for urban areas, where access providers already have easements, pole attachment rights and access to power, AirGig might well play an important role in rural areas, where the economics of networking and bandwidth always have been tougher.

“For us it's a game changer on a cost basis because the components are small, simple and plastic, performance wise,” said Donovan.

In other words, in addition to the “connections” function, there is a logical role either at the applications layer or in the computing layer.

That is not to say the task will be easy. It will be hard. But it is possible, and could prove to be among the more-successful ways telcos can move up the value chain.


CBRS Needs Certainty, Firms Say

Telecom and all other firms generally hate uncertainty. So a call for keeping in place rules for Citizens Broadband Radio Service, and its approach to spectrum sharing, is important, industry suppliers say.

How to Move Up the Stack, How Not To

Telcos have been trying to “move up the stack” into application layer businesses for quite some time, with very mixed success. Computing firm NCR was acquired by AT&T in 1991, for example, in an effort to create a vertically-integrated computing capability. That effort failed, and NCR was eventually spun off.

That might be one key to how at least some tier-one telcos might look at their moves up the stack. Consider the different way Comcast has used its NBC Universal assets, and how AT&T must use its Time Warner assets.

Comcast did not try to make NBCUniversal (could not, for legal reasons) an “exclusive” or “vertically-integrated” asset available only to its owned cable TV systems. In other words, NBCUniversal was not about vertically integrating the content and making it proprietary to Comcast.

Instead, the value of NBCUniversal content is that it is sold to all other U.S. linear video subscription providers, even if some of that content is used in a proprietary way at the theme parks.

Likewise, AT&T will find its Time Warner content being sold (by law) to all other linear subscription providers, and eventually, in other ways, to over the top services. Likewise, AT&T would have little to no interest in restricting distribution of its studio content (movies) only through AT&T distribution assets. Instead, it would want continued distribution as widely as possible.

The point is that what has worked in the linear video space is not vertical integration, but rather broad sales to direct competitors, who serve customers that want the content.

In other words, instead of vertical integration that seeks uniqueness, content assets are broadly attractive to all suppliers in the linear video business, though also used in a vertical way--as an input--to support AT&T’s own linear video and OTT video operations.

In the internet of things area, a similar approach might be the right way to operate as well. Instead of acquiring or growing assets that are “captive” to AT&T, a better approach might be to create or acquire assets of broad value to customers and competitors.

The other approach--capturing the benefits internally and uniquely within AT&T--might not prove as successful, ultimately.

Most telco VoIP or OTT messaging efforts have failed. One commonality: those efforts were “branded” alternatives to other OTT or VoIP services. In other words, those were attempts to vertically integrate and restrict use of the services only to customers of telco access services.

The opposite approach is taken by wildly-successful consumer apps and appliances such as Google, Facebook, Amazon, Netflix or Apple. Those apps and devices work on all networks, and are not captive to any single access provider.

All that suggests the fruitfulness of seeking IoT assets that are valuable on any access network, not tied to specific features of a single provider’s access service.

source: Telco 2.0

Wednesday, May 31, 2017

Internet Trends Report: It's Now All About the Apps, Content, Games, Advertising

At a high level, the biggest takeaway from this year's Internet Trends presentation by Mary Meeker is how much she focuses on apps, games, content and advertising, not access. 

Access is not so much the issue anymore. And "access" is mostly a matter of smartphones. Meeker also spends quite a lot of time on two markets: India and China. 


Tuesday, May 30, 2017

Spectrum Abundance Might Change Everything

Business models in the telecom business have tended to change slowly. But everyone likely would agree that the pace of change now is much faster. And the biggest single assumption about telecom business models has been scarcity.

In the past, there was--by law--no competition allowed. That fundamental assumption underpinned the whole business model. But we no longer believe telecom is a "natural monopoly" (just one supplier is sustainable).

Most would agree it probably is an oligopoly (a few providers). But even that assumption is going to be challenged over the coming decade, as spectrum scarcity becomes relative or absolute spectrum abundance.

The global mobile business, like its predecessor fixed line networks business, was built on scarcity. Monopoly regulation created scarcity by policy decision, allowing only one firm to lawfully provide telecom services in a country. The whole business model therefore was shaped by the deliberate lack of competition.

But there are other forms of scarcity. Even in the competitive era, fixed networks are expensive, capital-intensive undertakings that necessarily limit the number of sustainable providers (some would still say the empirical evidence is that some markets can support only one facilities-based provider; two providers in some markets; and possibly three contestants in parts of national markets).

In mobile markets it presently seems possible to support more than one facilities-based provider in every market, though observers disagree about the number of indefinitely-sustainable contestants owning their own facilities.

But mobile network business models also have been built on scarcity of the policy sort, namely by the reliance on licensed rights to spectrum. As virtually anybody would acknowledge, traditionally, such spectrum has been valuable because it has been scarce.

The issue now is whether “scarcity” conditions will continue to define most business models. And that is open to question.

“Since the 1920s, regulators have assumed that new transmitters will interfere with other uses of the radio spectrum, leading to the ‘doctrine of spectrum scarcity,’” said IEEE Spectrum authors Gregory Staple and Kevin Werbach, over a decade ago.

What would change? Spectrum sharing and, to a lesser extent, new spectrum, it was believed. What is different now, more than a decade later, is the ability to use vast amounts of new spectrum in the millimeter wave region, something most would have thought either impossible or unlikely, in the past.

Even though industry executives and regulators “always” have considered spectrum a scarce resource, that is “not so.” Rights to use spectrum have been the scarce ingredient, a major assumption upon which the business model is built.

Also, the traditional reason for such licensing was to prevent signal interference. But “interference” is a function of device and transmitter performance, not simply the number of simultaneous users. Moore’s Law advances mean that signal processing capabilities are far more sophisticated than was possible in the analog or even earlier digital realms.
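The signal processing point can be made concrete with Shannon's channel capacity formula, C = B · log2(1 + SNR): theoretical capacity grows with both the bandwidth available and the signal-to-noise ratio the radio can sustain, which is exactly what better processing improves. A minimal sketch, with illustrative channel widths and SNR figures (not drawn from any particular deployment):

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon channel capacity: C = B * log2(1 + SNR)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative comparison: a 20 MHz LTE-style channel versus an
# 800 MHz millimeter-wave channel, both at the same 20 dB SNR.
snr = 10 ** (20 / 10)  # 20 dB expressed as a linear ratio (100)

lte_like = shannon_capacity_bps(20e6, snr)
mmwave = shannon_capacity_bps(800e6, snr)

print(f"20 MHz channel:  {lte_like / 1e6:.0f} Mbps theoretical ceiling")
print(f"800 MHz channel: {mmwave / 1e9:.1f} Gbps theoretical ceiling")
```

The ceilings scale linearly with bandwidth, which is why opening up wide millimeter-wave channels matters so much more than incremental gains in the crowded low bands.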

There are two huge implications. First, the spectrum portfolios of large cellular phone companies will certainly be devalued. Second, scarcity will not provide a “business moat” around suppliers, as it once did. New competitors will be able to enter the business, simply because the barrier of having “rights to use spectrum” is falling.


The initial signs are that coming spectrum abundance already is having an impact on spectrum prices, which are, in a business model sense, too high at the moment, given rational expectations about future capacity.


One example is the sheer amount of new spectrum that is coming in the millimeter bands, in the 5G era. All spectrum now available for mobile operators to use amounts to about 600 MHz to perhaps 800 MHz of licensed spectrum.

But orders of magnitude more spectrum will be allocated and used in the regions above 3 GHz, as 5G becomes a reality.
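To put the “orders of magnitude” claim in rough perspective, a back-of-the-envelope comparison helps. The high-band figure below is an illustrative assumption (roughly the scale of U.S. high-band proposals), not an authoritative allocation:

```python
# Rough, illustrative comparison of licensed mobile spectrum below 3 GHz
# with millimeter-wave spectrum proposed for 5G. Both figures are
# order-of-magnitude assumptions for the sake of the arithmetic.

legacy_licensed_mhz = 700      # the roughly 600-800 MHz total cited above
mmwave_example_mhz = 11_000    # e.g., ~11 GHz of assumed high-band supply

ratio = mmwave_example_mhz / legacy_licensed_mhz
print(f"Example millimeter-wave supply is about {ratio:.0f}x "
      f"the legacy licensed total")
```

Even under conservative assumptions, high-band allocations dwarf the entire legacy licensed inventory, which is the arithmetic behind the abundance argument.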

In substantial part, the ability to use millimeter wave frequencies explains why abundance is coming. But the ability to share existing spectrum in already-licensed bands, and additional license-exempt spectrum, plus much more sophisticated signal processing and radio technologies, plus use of smaller cells, all will contribute to growing conditions of abundance.

As scarcity was the foundation of every monopoly-era business model, and limited competition has been the reality of the competitive era, radical competition could be the reality of the coming 5G and post-5G eras.


Where business models were based on scarcity, they will be built on abundance, in the future.

AI Will Indeed Wreak Havoc in Some Industries

Creative workers are right to worry about the impact of artificial intelligence on jobs within the industry, just as creative workers were r...