Many observers are hoping for a relatively quick uptick in firm productivity and capabilities driven by 5G, edge computing, the internet of things and artificial intelligence. Fewer likely believe 5G will positively affect gross revenues and profits from consumer mobility services. Patience is the watchword.
It often takes much longer to reap technology benefits than observers expect. Researchers call this the productivity paradox: quite often, big new information technology projects or technologies fail to produce the expected gains for long periods, sometimes a decade to three decades.
In fact, some argue that the productivity boost between 1995 and 2000 was not enabled by information technology. It also is likely that better information technology allows some firms to take market share from others, even as overall productivity shifts little, if at all.
Even though knowledge of electricity was widespread in the late 1800s, electric power technology did not significantly spread in the United States until around 1914 to 1917, and factories did not fully utilize electricity until the 1920s. In other words, it took roughly 40 years before electricity was being used in a truly productive manner.
To note just one example, much of the current economic impact of “better computing and communications” is what many expected at the turn of the century, before the “dot com” meltdown. Amazon, cloud computing generally, Uber, Airbnb and the broader shift of internet activity to mobile use cases provide examples.
But that impact was more than 15 years in coming. Nor is that unusual: many would note similar lags when enterprises invested in information technology in the 1980s and 1990s.
Investments do not always immediately translate into effective productivity results. This productivity paradox was apparent for much of the 1980s and 1990s, when one might have struggled to identify clear evidence of productivity gains from a rather massive investment in information technology.
In fact, diffusion of a new technology takes time precisely because people need time to learn how to use the technology, while organizations must retool the ways they work to incorporate the better technologies most productively.
Computing power in the U.S. economy increased by more than two orders of magnitude between 1970 and 1990, for example, yet productivity, especially in the service sector, stagnated.
And though it seems counter-intuitive, some argue the internet has not clearly affected economy-wide productivity. Part of the problem is that our normal measuring tools cannot capture productivity gains when products or services carry a zero price, and much of the consumer internet is based on zero pricing.
Other traditional measures of growth also suffer when technology arguably improves efficiency and productivity (more produced by the same, or fewer, humans). Consider the print content business, where revenues, profits and employment have plummeted in the internet era, even as the volume of content of all sorts has increased exponentially.
Or consider the impact of software on information technology employment. The Bureau of Labor Statistics estimates that employment in information technology was lower in 2018 than in 1998, despite the obvious increase in software-intensive life and business.
Gartner, for example, recently said that enterprises will have to wait twice as long as they expect to reap incremental revenue from technology investments.
Through 2021, Gartner researchers predict, digital transformation initiatives are largely unlikely to produce incremental revenue. That will not come as good news for executives hoping for revenue growth from repositioning existing business practices for digital delivery and operations.
On average, it will “take large traditional enterprises twice as long and cost twice as much as anticipated,” Gartner researchers predict.