Wednesday, May 1, 2019

Will IoT Boost Productivity? How Long Will It Take?

There is a long lag between first deployment of a general-purpose technology (steam engine, railroad, electricity, electronics, automation, automobile, the computer, the internet) and quantifiable productivity increases: the gains are not immediate, not clearly and unmistakably causal, and sometimes impossible to isolate from the impact of other general-purpose technologies.

That is important because we cannot determine whether important new technologies actually increase productivity--although people mostly assume they do--or not. Nor can we see with precision how long the gains will take: they often require decades to appear in quantifiable form.

That is worth keeping in mind in assessing the return from the internet of things, artificial intelligence, connected vehicles and so forth.

Consider the impact of electricity on agricultural productivity.

“While initial adoption offered direct benefits from 1915 to 1930, productivity grew at a faster rate beginning in 1935, as electricity, along with other inputs in the economy such as the personal automobile, enabled new, more efficient and effective ways of working,” the National Bureau of Economic Research says.  

There are at least two big problems with the “electricity caused productivity to rise” argument. The first is that other inputs changed at the same time, so we cannot isolate any single driver: the automobile, itself generally considered a general-purpose technology, was introduced in the same period. The second, taken up below, is that productivity gains are hard to measure and can take decades to appear.

That is not to say correlations between important new technology and process efficiency are undetectable.

Consider machine learning: error rates in labeling the content of photos on ImageNet, a dataset of over 10 million images, fell from over 30 percent in 2010 to less than five percent in 2016, and most recently to as low as 2.2 percent, say researchers working for NBER.

Likewise, error rates in voice recognition had decreased from 8.5 percent to 5.5 percent as of 2017.

At the same time, “there is little sign that they have yet affected aggregate productivity statistics,” the researchers note.  Labor productivity growth rates in a broad swath of developed economies fell in the mid-2000s and have stayed low since then.

“For example, aggregate labor productivity growth in the U.S. averaged only 1.3 percent per year from 2005 to 2016, less than half of the 2.8 percent annual growth rate sustained from 1995 to 2004,” NBER researchers say.


“Fully 28 of the 29 other countries for which the OECD has compiled productivity growth data saw similar decelerations,” they say. “The unweighted average annual labor productivity growth rate across these countries was 2.3 percent from 1995 to 2004 but only 1.1 percent from 2005 to 2015.”

So how do observers explain the apparent failure of big applications of technology to produce productivity gains? “False hope” is one explanation.

“The simplest possibility is that the optimism about the potential technologies is misplaced and unfounded,” NBER researchers say. Perhaps new technologies won’t be as transformative as many expect.

More compelling, perhaps, is our inability to measure the productivity gains. Many new technologies, like smartphones, online social networks, and downloadable media, involve little monetary cost.

That poses an obvious challenge when only quantifiable price metrics can be used. A personal computer that costs 10 percent less but supplies double the computing power or memory might, for example, be recorded as a decrease in economic activity.

Technology improvements that boost quality or potential utility might not be fully captured in price metrics: the imputed value is higher, but the price is lower. We can measure price changes, but not the higher potential value.
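To make that arithmetic concrete, here is a minimal sketch in Python, using invented numbers (not drawn from the NBER paper), comparing the nominal price change a simple statistic would record with a rough quality-adjusted change of the kind a hedonic price index tries to capture.

    # Hypothetical illustration of the measurement gap between nominal price
    # and quality-adjusted price (all numbers are invented for the example).

    old_price, new_price = 1000.0, 900.0        # the PC now costs 10 percent less
    old_capability, new_capability = 1.0, 2.0   # but delivers double the computing power

    # Nominal change: what simple price or revenue statistics record.
    nominal_change = (new_price - old_price) / old_price

    # Quality-adjusted change: price per unit of capability, a crude
    # stand-in for a hedonic adjustment.
    old_per_unit = old_price / old_capability
    new_per_unit = new_price / new_capability
    quality_adjusted_change = (new_per_unit - old_per_unit) / old_per_unit

    print(f"Nominal price change:          {nominal_change:+.0%}")           # -10%
    print(f"Quality-adjusted price change: {quality_adjusted_change:+.0%}")  # -55%

On these assumed numbers, measured spending falls only 10 percent even though the effective price of computing has dropped by more than half; the gap between the two figures is the part of the gain the statistics miss.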

Another argument is that the impact of potentially transformative technologies is constrained by limited diffusion: not all firms and industries adopt them, or use them equally well. In other words, the gains are not evenly distributed; a few industries and firms capture most of the benefits.

Perhaps the most persuasive explanation is that it takes considerable time to harness the power of a new general-purpose technology, since whole business processes need to be rebuilt before the advantages can be reaped.

The bottom line: we assume IoT improves productivity, as we assume electricity and broadband also contribute. But we need to invest in a measured way, as the actual benefits might not show up for a decade or two.

That might be the case for new 5G-based enterprise and consumer use cases as well.
