Tuesday, May 30, 2017

Spectrum Abundance Might Change Everything

Business models in telecom have tended to change slowly. But most observers likely would agree that the pace of change now is much faster. And the single biggest assumption underpinning telecom business models has been scarcity.

In the past, there was--by law--no competition allowed. That fundamental assumption underpinned the whole business model. But we no longer believe telecom is a "natural monopoly" (a market in which just one supplier is sustainable).

Most would agree it probably is an oligopoly (a market with a few providers). But even that assumption is going to be challenged over the coming decade, as spectrum scarcity gives way to relative, or even absolute, spectrum abundance.

The global mobile business, like its predecessor fixed line networks business, was built on scarcity. Monopoly regulation created scarcity by policy decision, allowing only one firm to lawfully provide telecom services in a country. The whole business model therefore was shaped by the deliberate lack of competition.

But there are other forms of scarcity. Even in the competitive era, fixed networks are expensive, capital-intensive undertakings that necessarily limit the number of sustainable providers. Some would still say the empirical evidence is that some markets can support only one facilities-based provider, others two, and parts of some national markets perhaps three contestants.

In mobile markets it presently seems possible to support more than one facilities-based provider in every market, though observers disagree about the number of indefinitely-sustainable contestants owning their own facilities.

But mobile network business models also have been built on scarcity of the policy sort, namely by the reliance on licensed rights to spectrum. As virtually anybody would acknowledge, traditionally, such spectrum has been valuable because it has been scarce.

The issue now is whether “scarcity” conditions will continue to define most business models. And that is open to question.

“Since the 1920s, regulators have assumed that new transmitters will interfere with other uses of the radio spectrum, leading to the ‘doctrine of spectrum scarcity,’” said IEEE Spectrum authors Gregory Staple and Kevin Werbach, over a decade ago.

What would change? Spectrum sharing and, to a lesser extent, new spectrum, it was believed. What is different now, more than a decade later, is the ability to use vast amounts of new spectrum in the millimeter wave region, something most would have thought either impossible or unlikely in the past.

Even though industry executives and regulators “always” have considered spectrum a scarce resource, that is not quite so. Rights to use spectrum have been the scarce ingredient, a major assumption upon which the business model is built.

Also, the traditional reason for such licensing was to prevent signal interference. But “interference” is a function of device and transmitter performance, not simply the number of simultaneous users. Moore’s Law advances mean that signal processing capabilities are far more sophisticated than was possible in the analog or even earlier digital realms.

There are two huge implications. First, the spectrum portfolios of large cellular phone companies will certainly be devalued. Second, scarcity will not provide a “business moat” around suppliers, as it once did. New competitors will be able to enter the business, simply because the barrier of having “rights to use spectrum” is falling.


The initial signs are that coming spectrum abundance already is having an impact on spectrum prices, which are, in a business model sense, too high at the moment, given rational expectations about future capacity.


One example is the sheer amount of new spectrum that is coming in the millimeter bands, in the 5G era. All spectrum now available for mobile operators to use amounts to about 600 MHz to perhaps 800 MHz of licensed spectrum.

But orders of magnitude more spectrum will be allocated and used in the regions above 3 GHz, as 5G becomes a reality.

In substantial part, the ability to use millimeter wave frequencies explains why abundance is coming. But spectrum sharing in already-licensed bands, additional license-exempt spectrum, much more sophisticated signal processing and radio technologies, and the use of smaller cells all will contribute to growing conditions of abundance.
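
To make the compounding effect concrete, here is a back-of-the-envelope sketch in Python. The rule of thumb (capacity scales roughly with spectrum, spectral efficiency and cell count) is a standard heuristic, but every number below is an illustrative assumption, not a forecast.

```python
# Rough heuristic: deliverable capacity scales with the product of
# spectrum, spectral efficiency and the number of cells.
# All numbers are illustrative assumptions, not forecasts.

def relative_capacity(spectrum_mhz: float, bps_per_hz: float, cells: float) -> float:
    """Unitless capacity index: bandwidth x spectral efficiency x cell count."""
    return spectrum_mhz * bps_per_hz * cells

# Roughly today's case: ~700 MHz of licensed mobile spectrum (midpoint of the
# 600 MHz to 800 MHz figure above), 4G-class efficiency, baseline cell count.
today = relative_capacity(spectrum_mhz=700, bps_per_hz=1.5, cells=1.0)

# A hypothetical 5G-era case: several gigahertz of millimeter wave and shared
# spectrum, better radios, and several times as many (smaller) cells.
future = relative_capacity(spectrum_mhz=7000, bps_per_hz=3.0, cells=5.0)

print(f"Relative capacity gain: {future / today:.0f}x")  # about 100x in this example
```

The point of the exercise is simply that the gains multiply: even if no single factor improves by an order of magnitude, the combination can.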

As scarcity was the foundation of every monopoly-era business model, and limited competition has been the reality of the competitive era, radical competition could be the reality of the coming 5G and post-5G eras.


Where business models once were based on scarcity, in the future they will be built on abundance.

Monday, May 29, 2017

Will 5G Drive Massive Cord Cutting?

Even if most of the attention in popular media is about "cord cutting" related to linear video subscriptions, internet access cord cutting arguably is an equally big potential issue.

According to Magid’s 2017 Mobile Lifestyle Study, an annual study of 2,500 U.S. consumers that focuses on emerging trends in the mobile market, more than three in ten smartphone owners expressed interest in cancelling their home broadband service in favor of just having 5G. Interest was particularly high among Millennials, exceeding 40 percent.

That is quite a lot higher than the roughly 10 percent of consumers other studies have found would consider mobile substitution for internet access. A study by Parks Associates, for example, suggests that perhaps 10 percent of U.S. broadband households are “likely to cancel their fixed broadband service” over the next 12 months and use wireless or mobile data services as a replacement.

Some would argue that coming 5G networks could be a huge catalyst in that regard, dramatically changing the value-price relationship for mobile internet access in ways that make it a more direct substitute for fixed access.

So far, the cost of mobile internet access has been so much higher than the cost of fixed network internet access that mobile substitution has made sense in only limited circumstances.


Where mobile data might cost about $9 to $10 per gigabyte (GB) in the U.S. market, for example, fixed access might cost as little as 15 cents per GB. In other words, mobile data costs roughly 60 to 70 times as much as fixed network access on a per-gigabyte basis, closer to two orders of magnitude more.
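
The arithmetic is simple enough to check directly. A minimal sketch, using the per-gigabyte figures cited above; the monthly usage number is purely an assumed figure for illustration:

```python
# Per-gigabyte cost comparison, using the figures cited above.
mobile_cost_per_gb = 9.50   # midpoint of the $9 to $10 per GB estimate
fixed_cost_per_gb = 0.15    # roughly 15 cents per GB on a fixed connection

ratio = mobile_cost_per_gb / fixed_cost_per_gb
print(f"Mobile costs about {ratio:.0f} times as much per GB as fixed access")  # ~63x

# Assumed (illustrative) monthly usage for a video-heavy household.
monthly_usage_gb = 100
print(f"Monthly data cost on mobile: ${mobile_cost_per_gb * monthly_usage_gb:,.0f}")  # ~$950
print(f"Monthly data cost on fixed:  ${fixed_cost_per_gb * monthly_usage_gb:,.0f}")   # ~$15
```

At that kind of gap, mobile substitution only makes sense for light users; if 5G narrows the per-gigabyte gap substantially, the substitution calculus changes.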

But 5G could be a game changer, allowing mobile operators to credibly match fixed network access speeds and per-GB prices.


Saturday, May 27, 2017

Are Price Wars, Speed Upgrades Dramatically Changing Customer Satisfaction? It Appears So

One way of interpreting the latest consumer satisfaction rankings from American Customer Satisfaction Index (ACSI) is that, in general, consumer satisfaction with network-delivered services (mobile, fixed) increases as prices drop, and drops as prices are increased.

The caveat is that “price” likely has to be viewed in relationship to value. At least theoretically, consumer satisfaction could remain the same, or climb, if value increases even more than price, on some customer-perceived key dimension.

Xfinity (Comcast) subscription TV satisfaction scores dropped six percent to 58. That is a big change for any company in any industry over a year’s time, and perhaps more notable given the introduction of the voice-activated Xfinity X1 TV interface, which arguably makes it far easier for consumers to find something to watch.

Some might also attribute higher apparent satisfaction in the mobile service industry to lower prices.

Customer satisfaction with mobile service climbed nearly three percent to 73 as price wars between carriers escalated, ACSI notes. Again, a three-percent change in a single year is substantial for ACSI rankings.

Compared with other telecom categories where customer choice is limited, the mobile industry arguably is far more competitive. Prices are competitive, service is better and customer satisfaction is higher, ACSI says.

For the linear subscription TV industry as a whole, customer satisfaction was down 1.5 percent to 64, tied with internet service providers for last place among 43 industries tracked by the ACSI.

Some firms did better than others, though.

Fios (Verizon Communications) climbed one point to 71, while AT&T U-verse climbed one point to 70. Suddenlink, part of Altice USA, improved two percent to 63.

The biggest jump was registered by Spectrum (Charter Communications), up five points to 63, in part because its acquired Bright House Networks properties scored higher than legacy Charter.

Cox Communications improved two percent to 60.

Mobile service providers were uniformly higher. TracFone Wireless improved three percent to 77.

Verizon Wireless gained four percent; U.S. Cellular gained three points to reach 74. Sprint rose four percent to 73. AT&T gained one point, to reach a score of 72.

But T-Mobile US dropped one point to 73. One might argue that a one-point drop or gain is not unusual for any firm in any given year, unless the direction continues in a succeeding year.

The overall point is that a major price war seems to be resulting in higher satisfaction in the mobile service category.

Internet service providers remain unchanged at the bottom of the ACSI industry rankings at a score of 64. Low user satisfaction is the result of slow and unreliable service, compounded by limited competition, ACSI says.

Verizon’s Fios stays at the top of the category, but also declined three percent to 71. AT&T’s U-verse gained a whopping eight percent, something that historically is quite unusual, reaching a score of 69.

Suddenlink climbed eight percent to 66, likely the result of boosting access speeds and thereby changing the value perception.

Charter’s Time Warner Cable dropped six percent. Comcast’s Xfinity improved two percent to 60,  a four-year high for the company, ACSI says. CenturyLink satisfaction declined six percent to 59.

In the internet access area, value in relationship to price seems to be the story. The firms making highly-unusual jumps (AT&T, Suddenlink) are dramatically boosting speeds. The firm showing a highly-unusual decline (CenturyLink) seems unable to compete with cable speeds. So it might not be absolute price, but price in relationship to value, that is the problem, or the advantage.

source: ACSI

What is the Point of Heavily Regulating a Declining Product?

As a rule, and with the caveat that other points of view exist, I tend to believe that increasing regulation of declining services, industries and products is ultimately pointless, and a waste of time and resources. When it is clear that a product is declining, and being replaced by one or more new substitutes, it just makes sense to allow as graceful a decline as possible.

Since the best outcome is a graceful harvesting of remaining revenues, as demand keeps shifting to the replacement products, it does not make sense to increase, and in some cases, even to maintain, high levels of regulation. Instead, it likely makes sense to allow customers to choose, and suppliers to market, whatever services they want, with less-burdensome overhead, wherever possible.

That would seem to be the case for the U.S. business data services market.  

The business data market has been shifting from legacy SONET/SDH to Ethernet for quite some time. By some estimates, new SONET/SDH hardware represents about $2 billion in annual sales, while optical and IP hardware sales were likely an order of magnitude higher by 2015, according to Cisco.

Packet optical and wave division multiplexing equipment sales have been growing since at least 2010 at the expense of SONET/SDH, according to Heavy Reading. Supply also is increasing.

After twelve years of study, multiple rounds of comments, and the most extensive data collection ever conducted by the Commission, the FCC concluded that there is “substantial and growing competition” in the “dynamic” marketplace for business data services (BDS) in the geographic areas of price cap carriers, US Telecom notes.

Such trends are why US Telecom argues the Federal Communications Commission should be deregulating business data services where competition now exists.

Thursday, May 25, 2017

So Far, 26% of IoT Initiatives Succeed

Some observers might be surprised that only 26 percent of surveyed enterprises report their internet of things initiatives were successful, according to a Cisco study. But that low figure should not come as a surprise.

For decades, information technology initiatives generally have failed more often than they have succeeded. By some estimates, only about nine percent of software development projects at large firms are successful.

The same might be said of change initiatives generally, which some say tend to fail about 75 percent of the time.  

The Cisco study also found that information technology executives were quite a bit more convinced the projects were successful than were business executives, who were more concerned with business cases.

So far, the Cisco study suggests, IoT initiatives are faring about as well as other information technology initiatives tend to, at larger enterprises.


source: Cisco

Who Pays for IoT Communications?

One pesky and important detail we have yet to fully work out is the connectivity business model for consumer IoT appliances.

Assume a world where nearly every in-home consumer appliance is connected, along with lots of other stand-alone sensors that track everything from motion to soil moisture to light levels, often on a “stick it on” basis (put a tracking sensor anywhere you want by peeling off the adhesive). If all of those devices require connectivity, the issue is how that connectivity is supplied, and who pays for it.

Amazon provides one model, in which Wi-Fi is the preferred connection and the appliance supplier pays for mobile network connectivity when Wi-Fi is not available. In that model, connectivity somehow is built into the use of the appliance (does a purchase become a rental?).

Wi-Fi might be an easy choice, as it shifts payment to the owner of the appliance (user pays for the internet access connection). In a few cases perhaps a third party pays (advertising).

That same model could hold for multi-device IoT plans sold by mobile operators, just as they now sell “multiple-device” plans. That has the user paying.

There are exotic possibilities, such as collaboration between a refrigerator manufacturer and one or more large grocers, where an appliance maker works with the retailer and gets a percentage of automated grocery orders. Those might be relatively complex deals for almost anybody but an Amazon.
  
No End in Sight for Margin Compression or Revenue Shrinkage?

When observers say the “cost” of supplying telecom services is “too high,” and must be made more affordable, the obvious and direct implication is that somewhere in the supply ecosystem, some participants are going to see a reduction in value and revenue, allowing the final end product--internet access--to be provided to “everyone,” at prices they can afford.

As one example, “open source” network elements already have been developed by the Telecom Infra Project (TIP), a consortium led by Facebook to develop open source transmission products that, in turn, reduce the cost of building and operating transmission networks.

Voyager, a long-haul optical transmission system, already has been tested by Facebook and European telecom company Telia over a thousand-kilometer stretch of Telia’s network. ADVA Optical Networking is manufacturing the device, which also is being tested by other carriers.

By definition, open source  telecom technology is designed to lower networking costs, which means it will shrink the size of telecom equipment markets. Of course, the buyers of such gear--the communications companies--want lower-cost gear and platforms. And some equipment suppliers see an opportunity to disrupt current market leaders and seize a larger role for themselves.

TIP’s OpenCellular project likewise is working on an open source 4G LTE base station, both the hardware and the software.

Here is some context: the global telecom industry now is said to be a $350 billion market, including software, hardware and services.

Ericsson and Huawei, among other leading suppliers, have not joined TIP, though Nokia, Cisco, Juniper and others have.

The point is that, whatever you think internet access presently costs, it is going to cost less in the future. Of course, there is a corollary. Some participants in the telecom ecosystem are going to represent less value, and earn less revenue, than before. In most segments of the ecosystem other than the application portions of the business, that has been the case for some decades.

The total value of the internet value chain has almost trebled from $1.2 trillion in 2008 to almost $3.5 trillion in 2015, a compound annual growth rate of 16 percent, according to A.T. Kearney estimates published by the GSMA. About 17 percent of that total value is captured by connectivity providers of all types.
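
Those two endpoint figures are enough to verify the growth rate and to put a rough dollar value on the connectivity slice. A minimal sketch, using only the numbers cited above:

```python
# Verify the compound annual growth rate implied by the endpoint values,
# and size the connectivity slice of the 2015 total.
value_2008 = 1.2e12   # ~$1.2 trillion in 2008
value_2015 = 3.5e12   # ~$3.5 trillion in 2015
years = 2015 - 2008

cagr = (value_2015 / value_2008) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 16.5 percent, consistent with the 16 percent estimate

connectivity_share = 0.17  # share captured by connectivity providers, per the estimate above
print(f"Connectivity revenue, 2015: ${value_2015 * connectivity_share / 1e12:.2f} trillion")  # about $0.6 trillion
```

If that share shrinks over the next decade, as the next paragraph suggests it might, connectivity revenue growth will lag the value chain as a whole.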

Many would argue it is possible, perhaps likely, that that percentage will shrink over the next decade or two. Margin compression is a problem all too familiar in the service provider and platform supplier parts of the networking value chain. But some problems are worse, especially gross revenue shrinkage. The problem is that margin compression virtually always is a sign of shrinking value and gross revenue contraction.

source: GSMA

DIY and Licensed GenAI Patterns Will Continue

As always with software, firms are going to opt for a mix of "do it yourself" owned technology and licensed third party offerings....