Quite often, big new information technology projects fail to produce the expected gains. That "productivity paradox," in which heavy spending yields no measurable productivity improvement, is likely to recur with artificial intelligence and machine learning, at least in the early going. And that early going can last far longer than many believe.
To note just one example, much of the current economic impact of better computing and communications matches what many expected at the turn of the century, before the "dot-com" meltdown. Amazon, cloud computing, Uber, Airbnb and the broad shift of internet activity to mobile devices all provide examples.
But that impact was more than 15 years in coming. Nor is that unusual: similar lags followed enterprise investments in information technology in the 1980s and 1990s.
So prepare now: artificial intelligence and machine learning eventually will have the impact many now anticipate. It simply will take far longer than many expect.
No doubt, spending is growing. Surveys suggest enterprises already have dived into machine learning and artificial intelligence.
Half of those adopting machine learning say they are looking for insights to improve their core businesses. About 46 percent are seeking greater competitive advantage. Some 45 percent want to glean insights faster. And 44 percent hope machine learning will help them develop new products.