Experience (some might say “history”) is a highly underrated analytical tool, even if most of us have only a few decades of experience in any single industry or industry segment to draw upon.
Experience would eventually impress upon you that few important new technologies ever develop as fast as observers predict. But for truly important technologies, that lagging adoption in the early days is later matched by adoption that exceeds forecasts. In other words, adoption tends to be non-linear.
All that can be lost when time frames are too compressed: every innovation then seems to follow a linear, “rapid adoption” curve. But a decade to a couple of decades, and sometimes even a few decades, is the right timeframe for significant adoption of some ideas and technologies.
One might predict that the “Internet of Things,” even in the most-advanced industrial segments or vertical application classes, will take two decades to reach significant adoption, dating from the turn of the century, when the phrase “Internet of Things” was coined.
Even that might be too optimistic, as some of us will recall talk of connected vending machines in the 1980s. By that measure we are in the fourth decade of conceptual thinking about what we would now call an IoT application for vending machines.
The more complex the ecosystem, the longer it takes. Device adoption tends to happen faster: it is a “simple” matter of large numbers of people buying a tool. When attitudes need to change and trust needs to be established, a decade can pass before even 10 percent of people adopt an important new technology. Use of debit cards and automated teller machines had that character, for example.
The basic concept remains the same. “If we had computers that knew everything there was to know about things, using data they gathered without any help from us, we would be able to track and count everything, greatly reducing waste, loss and cost,” said Kevin Ashton, who coined the phrase. “We would know when things needed replacing, repairing or recalling, and whether they were fresh or past their best.”
“In the twentieth century, computers were brains without senses: they only knew what we told them,” Ashton said. “In the twenty-first century, because of the Internet of Things, computers can sense things for themselves.”
The point, if it bears repeating, is that major and successful innovations often take quite a long time to move from conception to mass adoption. For all the attention now paid to IoT, we are 16 years out from inception.
Many predict quite substantial diffusion by 2025. That would mean a quarter century from “idea” to “widespread commercial adoption.”
That is not unusual, even if we often point to the rapid adoption of new and fundamental technologies, ranging from personal computers to the Internet.
Consider the automated teller machine, one example of a useful and now ubiquitous technology routinely used by consumers.
ATM card adoption shows that “decades” is a reasonable way of describing the adoption curve of some new technologies, even those that are arguably quite useful.
Debit cards provide another example: it can take two decades for adoption to reach half of U.S. households.
IoT represents a very complicated ecosystem, and sustainable business models are among the developments required to propel it forward. Yes, hardware and software development is required. But the pace of that development depends on the creation of viable business models that support actors making big capital investments to satisfy demand.
Many point out that traffic and parking are the sorts of problems IoT can help solve. All true. The issue is whether--and how fast--business models can develop to fully fund the deployment of the extensive networks and devices (including automobiles) able to take advantage of IoT-enabled transportation and vehicle parking.
All of that likely means that IoT adoption by even 10 percent of actors in an application universe will take longer than most believe. Experience is the teacher in that regard.