Tuesday, September 1, 2015

Internet of Things--as an Idea--is 16 Years Old

Kevin Ashton, many suggest, coined the phrase Internet of Things in 1999.

The basic concept remains the same: "If we had computers that knew everything there was to know about things, using data they gathered without any help from us, we would be able to track and count everything, greatly reducing waste, loss and cost," he said. "We would know when things needed replacing, repairing or recalling, and whether they were fresh or past their best."

“In the twentieth century, computers were brains without senses: they only knew what we told them,” Ashton said. “In the twenty-first century, because of the Internet of Things, computers can sense things for themselves.”

The point, though it bears repeating, is that major and successful innovations often take quite a long time to move from conception to mass adoption. As much attention as IoT now gets, we are 16 years out from inception.

Many predict quite substantial diffusion by 2025. That would mean a quarter century from “idea” to “widespread commercial adoption.”

That is not unusual, even if we often point to the rapid adoption of new and fundamental technologies ranging from personal computers to the Internet.

Consider the automated teller machine, one example of a useful and now ubiquitous technology routinely used by consumers.

ATM card adoption shows that "decades" is a reasonable way to describe the adoption timeline for some new technologies, even those that arguably are quite useful.

Debit cards provide another example: adoption can take two decades to reach half of U.S. households.

You might call those examples cases of the normal life cycle for any successful product. In consumer electronics, for example, an inflection point leading to mass adoption tends to happen when adoption hits about 10 percent of homes.

Determining how long the ramp-up to 10 percent might take is an imperfect art. Much depends on when one starts the clock: do you use "first discussion of the concept" or "first effort to sell to the mass market" as the starting point?

Even using the latter starting point, it can take three to 10 years to reach the inflection point of 10 percent household adoption.
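For readers who like to see the arithmetic, that timing logic can be sketched with a simple logistic (S-curve) diffusion model. The sketch below is illustrative only: the growth rate r, the starting share, and the years_to_share helper are assumptions made for this example, not figures from any of the adoption studies mentioned above.

    # A minimal sketch of adoption timing under logistic diffusion.
    # The parameters (r = 0.35/year, 1 percent starting share) are
    # illustrative assumptions, not data from the post.
    import math

    def years_to_share(target_share, r=0.35, initial_share=0.01):
        """Years for adoption to grow from initial_share to target_share
        under logistic growth: s(t) = 1 / (1 + A * exp(-r * t)),
        where A = (1 - s0) / s0 so that s(0) = s0."""
        a = (1 - initial_share) / initial_share
        b = (1 - target_share) / target_share
        return math.log(a / b) / r

    if __name__ == "__main__":
        # With these assumed parameters: roughly 7 years to hit the
        # 10 percent inflection point, and roughly 13 years to hit 50
        # percent, broadly consistent with the ranges cited above.
        print(f"Years to 10% of households: {years_to_share(0.10):.1f}")
        print(f"Years to 50% of households: {years_to_share(0.50):.1f}")

The design point is that most of the elapsed time sits in the long, flat tail before the 10 percent threshold; once past it, the curve steepens on its own.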

Compounding the problem is that IoT will develop separately in each industry. And, in some cases, such as the energy industry, sensor networks have been used for many years. Has the clock already started, or will there come some point where we say a genuine IoT platform has begun to replace the older sensor networks? It is a judgment call.

The other issue is likewise definitional: are smartwatches "IoT" or not? Some of us might say "no," since IoT is centrally about autonomous machines, not computing devices used directly by humans (PCs, tablets, smartphones, game players, book readers, smartwatches or eyeglasses).

The larger point is that IoT, by one possible measure, already has taken as much as 16 years to reach the commercial deployment stage. Where that has happened, we will be watching for an inflection point.

To be sure, it does not make sense to mechanically or rigidly forecast adoption of machine-to-machine services and products using the history of consumer electronics product adoption.  

But the principle is valid: even important, highly significant innovations, leading to the creation of big markets, take time to develop. "Decades" is not an unreasonable expectation for widespread adoption.
