Wednesday, April 4, 2018

Some Consumer IoT Apps Will Generate $1 a Year in Revenue

Hopes for internet of things connectivity revenue face a daunting constraint, namely that per-year revenue can be quite low for some categories of consumer sensors, such as home smoke alarms. Some existing retail tariffs work out to about $1.20 a year in revenue per sensor.

So volume will be quite important. Moreover, the actual revenue from sensor connectivity alone will be relatively low, compared to the value of IoT platforms and apps.

The industrial sensor market predates our present notion of “internet of things.” In consumer markets, government mandates have driven much of the water meter or electricity meter demand. Industrial firms, on the other hand, long have used sensors to measure pressure, temperature and flow.

Those have been niche markets in the past. What is new is the expansion of sensors in many consumer-facing and consumer-used applications, as well as some new business markets such as sensors for parking, transportation, medical care and so forth.

Newer apps, such as smart home apps or wearables, are expected growth areas in the consumer space.


A survey by ON World of nearly 100 low power wide area network operators found that unlicensed networks such as Sigfox and LoRa support 66 percent of the platforms in use today.

Sigfox and LoRa networks cover much of Europe and many parts of Asia Pacific, with IoT services offered by telecom operators such as Arqiva, Bouygues, Orange, KPN, Proximus and Swisscom, as well as a growing number of independent IoT operators such as Senet, Thinxtra and UnaBiz.

Licensed LPWA networks such as LTE-M and NB-IoT are growing even faster than unlicensed networks and make up nearly 33 percent of the LPWA operators surveyed. NB-IoT network operator activity has accelerated over the past year and will grow 1,800 percent in 2018.

One important observation is that IoT connections do not create much new connectivity revenue. Licensed IoT network operators such as Deutsche Telekom are selling €10 (USD $12) service plans that support a single sensor for 10 years, using 500 MB of data. That works out to incremental annual revenue of about $1.20 per sensor.
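
A minimal back-of-the-envelope sketch of that arithmetic, in Python; the $1 million annual revenue target is purely an illustrative assumption, not a figure from any operator:

```python
# Back-of-the-envelope math behind the "$1.20 a year" figure.
# Assumptions: a 10-year plan priced at USD $12 that includes 500 MB,
# as described above; the $1 million revenue target is illustrative only.
PLAN_PRICE_USD = 12.00     # price of the 10-year, single-sensor plan
PLAN_YEARS = 10            # plan lifetime
DATA_ALLOWANCE_MB = 500    # data included over the plan lifetime

annual_revenue_per_sensor = PLAN_PRICE_USD / PLAN_YEARS      # about $1.20
annual_data_per_sensor_mb = DATA_ALLOWANCE_MB / PLAN_YEARS   # about 50 MB

# How many sensors an operator would need just to add $1 million a year
# in connectivity revenue at that tariff.
target_annual_revenue_usd = 1_000_000
sensors_needed = target_annual_revenue_usd / annual_revenue_per_sensor

print(f"Annual revenue per sensor: ${annual_revenue_per_sensor:.2f}")
print(f"Annual data per sensor: {annual_data_per_sensor_mb:.0f} MB")
print(f"Sensors needed for $1M/year: {sensors_needed:,.0f}")
```

At roughly $1.20 per sensor per year, it takes more than 800,000 connected sensors to produce even $1 million in annual connectivity revenue, which is why volume, and the value of platforms and applications, matter so much.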

Mobile operators such as Deutsche Telekom, China Telecom and Vodafone are aggressively rolling out their licensed NB-IoT networks in Europe, China and Australia.  In the U.S., all major operators including AT&T, Verizon, Sprint and T-Mobile are building licensed IoT networks and most of these are providing both LTE-M and NB-IoT services.  

By the end of 2018, most of the U.S., Europe and Asia Pacific will be covered with licensed IoT networks, ON World predicts.

Tuesday, April 3, 2018

U.S. Telcos Have Lost 87% of Voice Accounts in 18 Years

Mobile substitution is about to become a much-bigger commercial and policy issue, in large part because, for the first time, mobile substitutes for fixed network internet access are going to be commercial realities, sold by tier-one service providers with the marketing muscle to drive adoption.


History provides some idea of how much could change.


Once upon a time (in the 1970s and 1980s), it was believed the solution to the problem of universal communications had to be based on supplying connections to fixed networks. The great cost of building such networks in developing nations was therefore a great cause of concern.


Technology saved us. With the advent of mobile networks, we have nearly solved the problem of voice communications, globally. But there always are consequences for legacy products when new products displace them.


Over a span of less than two decades, voice services, the traditional telecom revenue driver, have virtually collapsed in some markets. In the U.S. market, for example, line loss for telcos has been as much as 87 percent between 2000 and 2018.


Ironically, the Telecommunications Act of 1996, the first major revision of U.S. telecommunications law since 1934, focused on enabling voice communications and ownership of voice switches. The emergence of the internet, and mobile substitution, seemingly had not occurred to lawmakers.


The point is that commercial business strategy and government regulation, no matter how thoughtfully considered in the moment, can fail to achieve their intended objectives because markets (both supply and demand) now change so rapidly.




Beginning with the 5G era, we are likely to find that technology once again solves an equally big challenge, namely supplying internet access to everyone. Nor is 5G the only important platform. We might well see new constellations of low earth orbit satellites, and perhaps fleets of balloons or orbiting unmanned aerial vehicles, plus unlicensed and shared spectrum access platforms, play a role as well.


There are going to be winners and losers. It is not hard to predict that the business value of a fixed network will change. Such networks are going to drive far less revenue than in the past.


And that has implications for the amount of capital that can be invested in the business, where that capital is invested and how much innovation can be expected. Some of us would argue that enterprise revenue sources will become more important, consumer sources less important.


In the 5G era, it is possible that mobile networks and platforms will be able to match fixed network performance. On top of that, low earth orbit satellite constellations, or even Google’s Loon service, based on use of balloons, might be in place to provide internet access across the rural United States.


Even if some worry that competition in internet access is endangered, some of us would argue competition is destined to increase.

Can Independent ISPs Get 50% Market Share?

Can independent internet service providers (public or private) actually get 50-percent market share when competing against telcos and cable companies? Ting Internet believes so, but results from other firms suggest the level of competitor pricing really does matter.

It always is difficult to quantify take rates for gigabit internet access services, as virtually no internet service provider ever releases such figures. That has been the case since faster internet access services, priced at a market premium, began appearing. The reason for the reporting reticence is that take rates for the fastest, most expensive tier of service tend to be minimal.

Still, ISPs do tout some figures. Mediacom, for example, claims that between 10 percent and 20 percent of its new accounts are buying gigabit services costing between $125 and $140 a month. Again, it is hard to quantify what that means.

The actual number or percentage of accounts that change providers every year (churn) in the fixed network internet access business arguably varies between markets with strong competing offers (where fiber to home is sold, and where gigabit cable services also are sold) and markets without them.

Churn arguably is highest where a cable operator offering speeds in the hundreds of megabits per second competes with a telco only able to sell slower digital subscriber line service. AT&T and Verizon, for example, tend to see low churn rates, while other independent telcos with less fiber to home tend to see higher churn rates.

Much obviously depends on pricing levels. In markets where a gigabit ISP sells 1,000-Mbps service at prices that match the current or legacy prices for slower service (perhaps 100 Mbps), take rates can climb dramatically.

What is harder to model are markets where a clear price premium exists. It will matter when a reasonably-fast standard offer costs $50 a month and the gigabit offer costs more than $125 a month.

Presumably, in such a market, demand will be anchored by business demand, higher-income households and multi-user households.

Perhaps the most-optimistic provider to make public predictions is Ting Internet, which tends to argue it will get adoption as high as 50 percent in any new market it launches, within five years or so. Initial take rates when first marketed appear to be about 20 percent.

In its fourth quarter 2017 report, Tucows reported Ting Internet take rates of 30 percent in areas where it is able to market its gigabit service.

Ting prices its gigabit service at $90 a month. That is lower than Comcast and other cable companies charge, but higher than the $70 a month some other ISPs offer. The point is that Ting is pricing at a significant premium to “standard” offers that provide less value (in terms of speed), but not as high as cable company or telco practice.

We are likely to see much more of this sort of independent ISP competition in the fixed market, not to mention 5G-based gigabit offers from mobile suppliers.

At least in principle, more than 100 Colorado communities could see some form of municipal broadband network created, as voters in those communities have approved such moves.

Longmont, Colo. already has built out a portion of its planned gigabit internet access network, aided by that city’s ownership of a municipal power utility, meaning Longmont owns rights of way, distribution facilities, rolling stock and other assets helpful to creating a city-wide internet access network.

In Centennial, Colo., private internet service provider Ting Internet will piggyback on a new government network to be built by the city of Centennial itself.

Also, it sometimes is difficult to ascertain precisely what take rates are, since many independent ISPs challenging cable or telco suppliers seem to count “units sold” rather than “households served.”

That matters when an ISP sells two or more products, and then calculates adoption rates as “units sold divided by households passed.”

In other words, penetration is measured in terms of revenue-generating units, not “locations” or “households.” Each unit sold (voice, video or internet access) is counted against the base of locations. So a single location buying three services counts as much as three other homes each buying just one service.


Customer “penetration” by household therefore is different from penetration measured as a function of units sold. The difference matters because determining the magnitude of stranded assets hinges on how many locations passed actually generate revenue.

Assume that, on average, a typical household buys 66 percent of the total suite of services (two of three triple-play services or three of five services, for example).

The difference is significant. Measured by units sold, penetration appears to be as high as 76 percent to 87 percent. Measured as a function of homes generating revenue, penetration could be as low as nine percent, or as high as 44 percent, with a “typical” value being something between 20 percent and 25 percent of homes passed.

Penetration: Units Sold or Homes Buying Service?

                              Morristown  Chattanooga  Bristol  Cedar Falls  Longmont
Homes passed                      14,500      140,000   16,800       15,000     4,000
Units sold (subscriptions)         5,600       70,000   12,700       13,000       500
Units-sold penetration               39%          50%      76%          87%       13%
Services offered                       3            3        5            3         2
Services per HH (66% attach)           2            2        3            2         1
Homes served                       2,828       35,354    3,848        6,566       379
Household penetration                20%          25%      23%          44%        9%
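
A minimal sketch of the two calculations behind the table, in Python, assuming the 66 percent attach rate described above; the per-city inputs are the rounded values shown in the table:

```python
# Sketch of the two penetration calculations used in the table above.
# The 0.66 services-per-household attach rate is the assumption stated
# in the text; per-city inputs are the rounded values from the table.
ATTACH_RATE = 0.66

cities = {
    # name: (homes_passed, units_sold, services_offered)
    "Morristown":  (14_500,   5_600, 3),
    "Chattanooga": (140_000, 70_000, 3),
    "Bristol":     (16_800,  12_700, 5),
    "Cedar Falls": (15_000,  13_000, 3),
    "Longmont":    (4_000,      500, 2),
}

for name, (homes_passed, units_sold, services) in cities.items():
    # "Penetration" measured as units sold divided by homes passed
    units_penetration = units_sold / homes_passed
    # Homes actually generating revenue, assuming each buying household
    # takes about 66 percent of the services offered
    homes_served = units_sold / (services * ATTACH_RATE)
    household_penetration = homes_served / homes_passed
    print(f"{name:12s} units-sold: {units_penetration:4.0%}  "
          f"homes served: {homes_served:7,.0f}  "
          f"household: {household_penetration:4.0%}")
```

Those figures reproduce the roughly nine percent to 44 percent household penetration shown above, versus the much higher “units sold” percentages.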

It might be worth pointing out that all these communities (Morristown, Chattanooga, Bristol, Cedar Falls and Longmont) have municipally-owned utility companies, and might therefore represent a sort of best case for retail operations serving consumers.

That seems consistent with other evidence. In markets where a telco and a cable operator are competent, as is the attacking ISP (municipal or private), market share might take a structure of 40-40-20 or so, possibly 50-30-20 in areas where the telco does not have the ability to invest in faster broadband and the cable operator has the largest share.

Beyond the actual cost of the network, and the business role chosen by the municipality, details of revenue generation (homes that generate revenue as a percentage of total; number of services offered) are fundamental.

Beyond that are the other operating and marketing costs, overhead, and the need to repay borrowed funds and make interest payments, on the part of the retail service provider.

One might argue that most other communities, without the advantages ownership of an electric utility provides, will often find the lower risk of a shared public-private approach more appealing.

Also, some ISPs might find the availability of some amount of wholesale or shared infrastructure makes a meaningful difference in a business model.

One might suggest there are a couple of practical implications. Efforts by incumbent ISPs to raise retail prices the way video entertainment prices have grown (far faster than the rate of overall inflation) will increase the odds that new competitors enter a market, since higher prices make entry more attractive.

In at least some cases, the new competitors will be firms such as Verizon, which now has announced it will essentially overbuild AT&T and Comcast in Sacramento, Calif.

Though it is not easy, more competitive ISPs are likely to enter more markets, as lower-cost access platforms evolve, helped in some cases by municipal facilities support.

Where that happens, it is conceivable that the incumbents will see a new limitation on their market share, dipping from possibly 50-percent share to a maximum of perhaps 40 percent each, on a long-term basis, assuming the new competitor is not eventually bought out by one of the incumbents.

Monday, April 2, 2018

If the Internet is Changing, Backward-Looking Regulation Maybe Not So Smart

It is a truism that regulators never can keep up with technology, which is a reasonable argument for caution when it comes to regulating just about anything central to the internet, which keeps changing.

The logical implication might be that the historic light-touch treatment of anything related to computing likely should be the presumption, even if consumer protection or antitrust remedies might from time to time be deemed necessary.

The reason for such caution is that computing and the internet itself continue to change. And if there is anything history shows, it is that the leaders in any single era rarely--if ever--are the leaders in the succeeding era. So “punishing” a leader in one era, when an era is about to evolve, does not make too much long-term sense.

We might today believe it is nonsensical to consider the web browser a source of competitive advantage, as though the browser provides some sort of business moat. Nevertheless, not so long ago, the browser was considered a source of anti-competitive advantage.

Not so long ago, ability to offer phone services using Class 5 switches was considered to be a major pro-competitive step. Before that, long distance voice was the profit driver for the telecom industry. Things change, and change quite a lot.

Leadership in the mainframe computing era was not the same as leadership in the mini-computer or personal computing eras. And none of those leaders were key in the early internet age.

Someday, Google, Facebook and others might not be the leaders in some future evolution of the internet ecosystem. The point is that regulation is sometimes pointless when it is backward looking.

The internet has grown through multiple eras, as any observer can note. It began as a low-bandwidth way for researchers to communicate, then became a consumer transaction and information platform, later a social media vehicle, and now a content delivery mechanism. Next, the internet is on the cusp of possibly becoming a major platform for enterprise private networks.

With the coming internet of things evolution, autonomous vehicles and sensor networks of many types could represent the next big wave of development for the internet, as the internet has defined the most-recent eras of computing.



Saturday, March 31, 2018

5G is Like the Tail on a Dog

5G is to networking what telecom is to the internet: one part of something much larger. That is to say, 5G is part of a broader shift in networking, just as "telecom" has become a tail on the internet dog.

Specifically, 5G is part of a larger transformation of global public networks involving much lower latency, much more virtualization and new roles for data centers. Those features, in turn, are required to lower the cost of running networks as well as create the foundation for new categories of services that will drive incremental revenue at scale.

Extremely low latency, high connection density, high reliability and gigabit speeds are driving the design of the whole new architecture, with implications for access, cloud computing and virtualization.

Virtualization is a key change, with separation of control and signaling functions from delivery of end user traffic becoming key. Lower cost is among the expected outcomes. Greater flexibility also is anticipated.

What might be relatively unexpected is that virtualization will increase the ability to create new virtual networks on a wider scale. That could have key implications for new suppliers, including suppliers that only want to create large virtual WANs to support internal business requirements.

The analogy here is the shift of wide area networking from “mostly public” to “mostly private.” In other words, on many routes, enterprises carry most of the traffic, but only to support their own internal operations. Major app providers, in other words, have vertically integrated WAN services.

That has business model implications both for the enterprises and suppliers of WAN services.

Also, to support new ultra-low-latency apps, data center functions will emerge at the edge of the network. In that sense, 5G and edge computing are parts of a larger trend, the creation of next-generation networks that are both virtualized and built around latency performance.

That stringent latency performance will be needed to support emerging new applications in the connected car, internet of things and real-time application areas. Bandwidth will matter, but latency arguably is the key new capability.   

Friday, March 30, 2018

The Prejudice Against Bigness

It would be reasonable enough to argue that there is an almost-instinctive distrust of "bigness" in business, as such bigness is presumed to be responsible for destroying the fortunes of small and local businesses.

It must also be said that it is consumers who propel the rise of "big businesses," since it is consumers who prefer the products, prices or other attributes of bigger businesses, over those of smaller suppliers.

We might be tempted to decry bigness, but consumer choice is what produces bigness. And that means bigness is not always and inevitably bad.

Sometimes, in fact, bigness might be a survival requirement. Consider some foundations of business in the internet era. Among the key trends are a few that define internet market dynamics:

* Pricing and margin pressure
* New competitors from outside the existing value chain
* Winner take all market share dynamics
* Disintermediation of distributors in the value chain

In case you somehow missed the trend, the internet leads to lower retail prices and lower supplier profits, as price transparency increases and distributors are removed from the value chain. 

Faced with lower retail prices, suppliers can respond in a few ways. They can sell more units at the lower prices, which requires more scale. Firms can sell different products, which means moving into new or different parts of the value chain. And firms can take on additional roles within value chains, increasing the ways they make revenue. 

The point is that all that tends to imply firms get bigger. 

Also, the internet allows firms outside the existing industries, possibly operating with different value and revenue models, to enter existing businesses and disrupt them. That creates additional pressure on sales volume, prices and profits. 

Market disruptors also frequently rely on internet capabilities to remove cost from the value chain by eliminating steps and roles in the distribution process, often after also changing costs in the production part of the value chain. 

The "winner take all" nature of many internet-reliant businesses and industries also contributes to bigness. 

So we need to be careful not to confuse bigness, which might well be a requirement for firm or industry survival, with actual anti-competitive behavior on the part of big firms. Abuse always is a possibility when power exists in a market. But abuse is not automatic, and hopefully not a routine outcome.

Nor should we automatically penalize firms for delivering higher quality products at lower prices. That is the outcome we are supposed to gain from competition. 

But sometimes competition can only be fostered by big firms. Break up the big firms and you also possibly destroy the ability to innovate and create even more consumer benefits. 

Bigness is not inherently bad. 


AI Will Improve Productivity, But That is Not the Biggest Possible Change

Many would note that the internet impact on content media has been profound, boosting social and online media at the expense of linear form...