It has long been conventional wisdom that up to 70 percent of innovation efforts and major information technology projects fail in significant ways, either falling short of predicted gains or producing only marginal results. If we consider applied artificial intelligence, virtual reality, the metaverse, web3 or the internet of things to be “major IT projects,” we should likewise expect initial failure rates as high as 70 percent.
That does not mean ultimate success is out of reach, only that early failure rates will be quite high. As a corollary, we should continue to expect many companies and projects to fail in the early going. Venture capitalists will not be surprised; they expect such high failure rates when investing in startups.
But all of us need to remember that innovation generally, and major IT efforts specifically, will see failure rates of up to 70 percent. So steel yourself for bad news as major innovations are attempted in areas ranging from the metaverse and web3 to cryptocurrency, AR and VR, and even less “risky” efforts such as the internet of things, network slicing, private networks and edge computing.
Gartner estimated in 2018 that through 2022, 85 percent of AI projects would deliver erroneous outcomes due to bias in data, algorithms or the teams responsible for managing them.
That is analogous to arguing that most AI projects will fail, at least in part. Seven out of 10 companies surveyed in one study report minimal or no impact from AI so far. The caveat is that many such big IT projects can take as long as a decade to produce quantifiable results.
Investing in more information technology has often failed to boost productivity, or has appeared to do so only after about a decade of tracking. Some would argue the gains are there, just hard to measure, but the point is that progress often is hard to discern.
Still, the productivity paradox seems to exist. Before investment in IT became widespread, the expected return on investment in terms of productivity was three percent to four percent, in line with what was seen in mechanization and automation of the farm and factory sectors.
When IT was applied over the two decades from 1970 to 1990, however, the typical return on investment was only one percent.
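To get a sense of how large that gap is, here is a minimal sketch in Python. It assumes, purely for illustration, that the three-to-four-percent and one-percent figures are annual productivity returns compounded over the same 20-year period; the source does not specify how the returns were measured.

```python
# Illustrative only: compare cumulative gains when annual returns compound
# at roughly 3.5 percent versus 1 percent over two decades.
# Assumption (not from the source): the quoted returns are annual rates
# compounded over the 1970-1990 period.

YEARS = 20

def cumulative_gain(annual_rate: float, years: int = YEARS) -> float:
    """Total percentage gain after compounding an annual rate for `years` years."""
    return ((1 + annual_rate) ** years - 1) * 100

if __name__ == "__main__":
    scenarios = [
        ("Farm/factory-era expectation (~3.5% per year)", 0.035),
        ("Observed IT-era return (~1% per year)", 0.01),
    ]
    for label, rate in scenarios:
        print(f"{label}: about {cumulative_gain(rate):.0f}% cumulative gain over {YEARS} years")
    # Under these assumptions: roughly 99% cumulative gain at 3.5% per year,
    # versus only about 22% at 1% per year.
```

Under those assumptions, the difference is not a few points a year but a cumulative gap of several-fold over the period, which is why the paradox attracted so much attention.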
This productivity paradox is not new. Even when investment does eventually seem to produce improvements, it often takes a while for those results to appear. So perhaps even an AI project that fails in the near term might be judged a success a decade or more later.
Sometimes measurable change takes even longer. Information technology investments did not measurably improve white-collar productivity for decades, for example. In fact, it can be argued that researchers have failed to measure any improvement at all, so some might argue nearly all the investment has been wasted.
Most might simply agree there is a lag between the massive introduction of new information technology and measurable productivity results.
Most of us likely assume quality broadband “must” boost productivity. Except when it does not. The consensus view is that broadband access for business leads to higher productivity.
But a study by Ireland’s Economic and Social Research Institute finds only “small positive associations between broadband and firms’ productivity levels,” and “none of these effects are statistically significant.”
Among the 90 percent of companies that have made some investment in AI, fewer than 40 percent report business gains from AI in the past three years, for example.