Sunday, July 3, 2016

How Long Before IoT Reaches 10% Adoption in Most Markets?

Experience (some might say “history”) is a highly underrated analytical tool, even if most of us have only a few decades of experience in any single industry or industry segment to draw upon.

Experience would eventually impress upon you that few important new technologies ever develop as fast as observers predict. But for truly important technologies, that lagging adoption in the early days is later matched by adoption that exceeds forecasts. In other words, adoption tends to be non-linear.

All that can be lost when time frames are too compressed: then every innovation seems to have a linear, “rapid adoption” curve. But a decade to a couple of decades, and sometimes even a few decades, is the right timeframe for significant adoption of some ideas and technologies.
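
That non-linear pattern is usually modeled as a logistic (S-shaped) diffusion curve. A minimal sketch, with entirely illustrative parameters rather than anything drawn from the post:

```python
import math

def logistic_adoption(t: float, k: float = 0.5, t_mid: float = 15.0) -> float:
    """Fraction of a market adopting by year t, for an S-curve with
    midpoint t_mid (years) and steepness k. Parameters are made up."""
    return 1.0 / (1.0 + math.exp(-k * (t - t_mid)))

# The early years look disappointingly slow, the later years explosive:
for year in (5, 10, 15, 20, 25):
    print(f"year {year}: {logistic_adoption(year):.0%}")
```

With these illustrative parameters, adoption sits near one percent at year five and under 10 percent at year 10, then races from 50 percent to more than 90 percent between years 15 and 20: slow early, faster than forecast later.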

Some might predict that the “Internet of Things,” even in the most-advanced industrial segments or vertical application classes, might well take two decades to reach significant adoption, assuming the phrase “Internet of Things” dates from the turn of the century.

Even that might be too optimistic, as some of us will recall talk of connected vending machines in the 1980s. By that measure we are in the fourth decade of conceptual thinking about what we would now call an IoT application for vending machines.


The more complex the ecosystem, the longer it takes. Device adoption tends to happen faster: it is a “simple” matter of large numbers of people buying a tool. When attitudes need to change, and trust must be established, a decade can pass before 10 percent of people will adopt an important new technology. Use of debit cards and automated teller machines had that character, for example.

Kevin Ashton, many suggest, coined the phrase “Internet of Things” in 1999.

The basic concept remains the same: “If we had computers that knew everything there was to know about things, using data they gathered without any help from us, we would be able to track and count everything, greatly reducing waste, loss and cost,” he said. “We would know when things needed replacing, repairing or recalling, and whether they were fresh or past their best.”

“In the twentieth century, computers were brains without senses: they only knew what we told them,” Ashton said. “In the twenty-first century, because of the Internet of Things, computers can sense things for themselves.”

The point, if it bears repeating, is that major and successful innovations often take quite a long time to move from conception to mass adoption. As much attention as now gets paid to IoT, we are 16 years out from inception.

Many predict quite substantial diffusion by 2025. That would mean a quarter century from “idea” to “widespread commercial adoption.”

That is not unusual, even if we often point to the rapid adoption of new and fundamental technologies ranging from use of personal computers to use of the Internet.

Consider the automated teller machine, one example of a useful and now-ubiquitous technology routinely used by consumers. ATM card adoption shows why “decades” is a reasonable way of describing the adoption of some new technologies, even those that arguably are quite useful.

Debit cards provide another example: adoption can take two decades to reach half of U.S. households.

IoT represents a very complicated ecosystem, with a sustainable business model among the developments required to propel further progress. Yes, hardware and software development is required. But the speed of that development is propelled by creation of viable business models that support actors making big capital investments to satisfy demand.

Many point out that traffic and parking are the sorts of problems IoT can help solve. All true. The issue is whether--and how fast--business models can develop to fully fund the deployment of the extensive networks and devices (including automobiles) able to take advantage of IoT-enabled transportation and vehicle parking.


All of that likely means that IoT adoption by even 10 percent of actors in an application universe will take longer than most believe. Experience is the teacher, in that regard.

Saturday, July 2, 2016

Why IoT Matters for Consumer Internet Access

The connection between the industrial Internet of Things and widespread consumer Internet access is not always crystal clear.

But there is a direct connection: all ubiquitous networks eventually wind up being supported by a range of revenue sources from both consumer and business customer segments.


Historically, low consumer communications prices were possible only because high-profit business services subsidized consumer services that sometimes lost money and sometimes were merely profit neutral.


That is less true than in the monopoly era, but Verizon, in 2014, still generated more than half its total fixed network revenues from business customers (wholesale, enterprise, small business).


Something like that is likely to develop with industrial Internet of Things or enterprise IoT services as well: they will generate both revenue and profit margin to feed the one network used by all customers, business or consumer.


In a similar way, each new Internet facilities domain tends to improve the value of the whole interconnected set of networks. Such network effects mean the value of the whole network grows with the number of end points or connections.
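
One common, if contested, formalization of that idea is Metcalfe’s law, which values a network in proportion to the number of possible connections among its end points. The post does not invoke it by name; this is offered purely as an illustration:

```python
def potential_links(n: int) -> int:
    """Distinct point-to-point connections possible among n end points."""
    return n * (n - 1) // 2

# Doubling the end points roughly quadruples the potential connections:
for n in (10, 20, 40):
    print(f"{n} end points -> {potential_links(n)} potential links")
```

The output (45, 190, 780 links) shows why adding IoT end points can raise the value of the whole network faster than the raw count of connections grows.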


The point is that IoT revenues matter because they can provide enough gross revenue and profit margin to help fund upgrades and operation of the whole network. And anything that improves the sustainability of the network as a whole makes possible services for consumers, especially the more cost-conscious consumers who might use the network.


Cable TV providers likely still earn most of their revenue from consumer segments. But for telcos, business customers arguably drive more revenue, and telcos earn higher margins on business customer accounts.


That is likely to be the case for IoT as well.


source: Deloitte University Press

Friday, July 1, 2016

Zero Rating is Good for Consumers, Google, Media and Communications Companies

Some now say Google is essentially dictating Internet policy in the United States and the European Union. One does not have to make such claims to note that it is very hard (virtually impossible) to find any Internet-related issue, on either continent, where Google’s position has lost a regulatory battle.

That is not to belittle in any way Google’s advocacy of its own business interests. But the simple fact is that no firm spends serious money on regulatory matters without a clear understanding of its own economic interests.

And one is hard pressed to find any significant issue where Google has not gotten its way. That is not nefarious. But some of us would argue that Google’s very success shows why it could be quite unwise to impose more regulation on the industries Google is beating in the policy battles, namely Internet service providers of several types.

The larger point is that it does not make sense to place new burdens or restrictions on industries that are in decline.

On the other hand, it likewise makes sense to encourage the growth of new industries by minimizing barriers to their growth. A regulatory “light touch,” in that sense, benefits both the industries policymakers might wish to encourage and industries with large societal and economic impact that are virtually destined to be smaller in the future.

The issue of bans on zero rating provides an example. Ironically, Google is able to provide valuable services of so many types, at zero incremental cost to end users, precisely because third parties (advertisers) subsidize the services and apps on behalf of consumers.

In other words, Google literally has built its business on zero rating, but wants ISPs banned from any similar practices, even if zero rating has been commonplace in the Internet, media, content and communications businesses for decades.

“Zero rating is nothing more than price flexibility,” argues consultant John Strand. “It is a business model enjoyed across every industry and even in the Internet itself.”

“Every Google search is zero rated by an advertisement,” Strand notes. “That is a third party subsidizes the cost so the end user does not have to pay.”

That is no argument for placing limitations on Google’s ability to innovate, in business models as well as other key elements of its growing stable of businesses. Google should be as free as possible to innovate.

But ISPs also need more ability to innovate, as their core revenue models are disappearing. Zero rating is what toll-free calling has been about. Zero rating is what advertising-supported content and media have been all about. Fairness, as well as necessity, requires that the present lawfulness of zero rating be maintained, many would argue.

Messaging Morphs into a Commerce, Payments Platform

Over the last decade, we have become accustomed to the notion that carrier text messaging has other product substitutes, namely over the top messaging services.

So traditional text messaging revenues have declined: a former revenue source shrinks and, in many ways, is becoming a feature of mobile service, bundled with use of the network, but not necessarily a direct revenue driver.

But something else also is happening: messaging platforms are becoming the foundation for mobile payment and e-commerce services.

The obvious response, on the part of a carrier, is to investigate whether carrier text messaging can move in such directions and change as well. Skepticism would not be unwarranted, however, as use of messaging as a platform for payments, mobile banking and e-commerce transactions builds heavily on other assets.

Customer base helps, but knowledge of each potential consumer’s values and behavior, social sets and existing interests is what really drives value in such evolutions of messaging to commerce and other transaction services.

source: Mary Meeker, Internet Trends 2016


Wednesday, June 29, 2016

Special Access is a Prime Example of Ecosystem Tension

source: IHS
For as long as I can remember, special access rates, conditions and terms of service have been contentious, for obvious reasons: there are sellers and there are buyers. The former want more freedom, the latter want less. The sellers are opposed to price controls, the buyers want them.

What might be sort of interesting in the latest round of discussions about regulation of special access is that Verizon, to a certain extent, is more concerned about being a buyer than a seller, as are Sprint and virtually all competitive local exchange carriers.

There are some obvious nuances. Some might question the wisdom of prolonging a legacy service, in decline, when IP substitutes could be made available, and when a shift to Ethernet-based access is under way, in any case.

To a large extent, technology shifts are at issue here. Special access traditionally has been a set of products (T1, DS3) using specific protocols. Ethernet is the next-generation protocol displacing the older forms of special access.

And among the issues is the degree of choice buyers have in the Ethernet access market, as distinct from the legacy T1, DS3 markets. Other issues, such as competitive dynamics, are in play as well.

Cable operators, now the dominant suppliers of data access in the U.S. consumer Internet access market, also can sell Ethernet connections, not just telcos. And then there are new competitors as well (Google Fiber, other independent ISPs).

In fact, many who sell access to smaller business accounts would note that sales of T1 lines for business phone systems are declining, in favor of other options such as SIP trunking, while Ethernet makes most sense for data connections.

“It's important to note that DS1 and DS3 lines are legacy products that are at the end of their technological life,” says consultant Roger Entner. Still, the direction is towards Ethernet.

Network element and transmission sales, of course, are different from backhaul services purchased, but the trend is clear enough: fiber and wireless are growing, copper special access is shrinking.

In 2012, Ethernet-based mobile backhaul accounted for 40 percent of global physical connections of all types of mobile/cellular backhaul (e.g., TDM, IP). Ethernet was projected to be 95 percent of the mobile backhaul connections market by 2015. As with all such forecasts, nobody would be too surprised if the actual magnitudes differed from projected levels.

In 2015, wireless backhaul and optical connections vastly outnumbered copper connections typically used for special access, on a global basis. The U.S. market traditionally has favored fixed network connections, though.

source: Infonetics Research
Still, end user demand is shifting towards Ethernet and new platforms (cable access, fixed wireless), even as new options are coming (millimeter wave fixed wireless, Google Fiber, independent ISPs, municipal broadband).

To be sure, there still is demand for legacy T1 or DS3 connections, but Ethernet rapidly is eroding and replacing that demand. To the extent that a faster shift to all-IP networks is desirable, does it make sense to subsidize the legacy access methods?

Zayo, for example, is one of the leading suppliers of backhaul circuits of these types in the U.S. market. “The number of DS3s sold by Zayo has declined from 3,569 in September 2013 to 2,772 in June 2015, a 23 percent drop in less than two years,” Entner notes. For DS1s, Entner says, the decline has been even more pronounced: 38 percent over the same September 2013 to June 2015 period.
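
The DS3 arithmetic checks out, to rounding. A quick way to verify percent declines of this sort (the figures below are only the DS3 counts from the quote):

```python
def pct_decline(start: float, end: float) -> float:
    """Percent decline from a starting count to an ending count."""
    return (start - end) / start * 100

# Zayo DS3 circuits, September 2013 vs. June 2015, per Entner:
print(f"{pct_decline(3569, 2772):.1f}%")  # 22.3%, quoted as roughly 23 percent
```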

That said, the Federal Communications Commission still argues that the product “special access” in all its forms represents between $25 billion and $40 billion in annual sales.

Some would argue that figure represents both retail sales to business customers and the much smaller backhaul market (mobile tower backhaul, for example), which might represent perhaps $8 billion--perhaps less--of annual revenue.

The new argument is that special access regulation is needed to support coming 5G and small cell networks. Others would argue Ethernet connections are going to be needed even there.




Freewheel to Shut Down

Freewheel, the Wi-Fi-only mobile service offered by Cablevision Systems Corp., is going to be shut down. The issue is what that means about consumer demand for any “mobile” service with connectivity limitations.

“Cordless” phones seem to have been well understood by consumers, who grasped that the value was “no cord” while inside the home, talking on a fixed voice service.

“Mobile” phones likewise seem exceptionally well understood, offering nearly-ubiquitous communications “anywhere,” out and about or at home.

The issue, for some decades, has been whether there is a substantial market for some form of service intermediate between “full mobility” and “cordless.”

Personal Handy Phone is the best historical example of a type of service that Freewheel resembled. The idea was a service that was “more than at-home cordless, but less than fully mobile.” Or, to put it another way, PHP was essentially a cordless phone that would work at home and also outdoors, with call handoff at pedestrian speeds.

So far, consumer demand has been quite mixed, and Freewheel only seems to confirm that there is not a big market for services in between at-home cordless and full mobile service.

In comparison, “Wi-Fi first” services are full “mobile” services, the only distinction being the preference for connecting to Wi-Fi, if available, before connecting to a mobile network.
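
In pseudocode terms, the difference is a single fallback branch. This tiny sketch is illustrative only, not any vendor’s actual connection logic:

```python
def wifi_first(wifi_ok: bool, cellular_ok: bool) -> str:
    """Full mobile service that simply prefers Wi-Fi when available."""
    if wifi_ok:
        return "Wi-Fi"
    return "cellular" if cellular_ok else "offline"

def wifi_only(wifi_ok: bool) -> str:
    """Freewheel's model: no cellular fallback at all."""
    return "Wi-Fi" if wifi_ok else "offline"
```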

Freewheel always was going to be different, since it--like Personal Handy Phone--was not going to work everywhere, but only where Wi-Fi was available, as it had no ability to connect to a mobile network.

So, at least so far, Freewheel has reinforced past experience with cordless and mobile phone service. People seem to understand and want to use both those “phone” modes, but few seem willing to embrace the PHP model.

Few now remember it, but “Personal Communications Service,” based on use of frequencies in the 1.8 GHz range now used only for mobile service, once was envisioned as supporting PHP-style services. The concept did not catch on, and PCS simply became a band of frequencies used for full mobile service.

Average Global Internet Connection Speed Up 12% Since Last Quarter

Almost nothing is more inexorable than an increase in global average connection speed from one quarter to the next. Akamai, for example, reports a 12 percent quarter-over-quarter increase in average global Internet connection speed, to 6.3 Mbps.

Global average peak connection speed increased 6.8 percent to 34.7 Mbps.
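
Those growth figures imply prior-quarter baselines that are easy to back out; the values below are simply derived from the numbers Akamai reports, not separately sourced:

```python
def prior_quarter(current_mbps: float, qoq_growth_pct: float) -> float:
    """Back out last quarter's figure from the current one and QoQ growth."""
    return current_mbps / (1 + qoq_growth_pct / 100)

print(f"{prior_quarter(6.3, 12):.1f} Mbps")    # global average: ~5.6 Mbps
print(f"{prior_quarter(34.7, 6.8):.1f} Mbps")  # global average peak: ~32.5 Mbps
```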

At a country/region level, South Korea continued to have the highest average Internet connection speed in the world at 29 Mbps, an 8.6 percent gain over the fourth quarter of 2015, while Singapore maintained its position as the country with the highest average peak connection speed at 146.9 Mbps, an 8.3 percent quarterly increase.


Globally, adoption of broadband at speeds of 4 Mbps or higher came in at 73 percent, up 5.4 percent from the fourth quarter of 2015.

Average mobile connection speeds (aggregated at a country/region level) ranged from a high of 27.9 Mbps in the United Kingdom to a low of 2.2 Mbps in Algeria in the first quarter of 2016, while average peak mobile connection speeds ranged from 171.6 Mbps in Germany to 11.7 Mbps in Ghana.

Based on traffic data collected by Ericsson, the volume of mobile data traffic grew by 9.5 percent over the previous quarter.

DirecTV-Dish Merger Fails

DirecTV’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...