Why AI Investment is Going to (Initially) Disappoint
Despite the promise of big data, industrial enterprises are struggling to maximize its value. A survey conducted by IDG showed that “extracting business value from that data is the biggest challenge the Industrial IoT presents.”
Why? Abundant data by itself solves nothing, says Jeremiah Stone, GM of Asset Performance Management at GE Digital.
Its unstructured nature, sheer volume, and variety exceed the capacity of humans and traditional tools to organize it efficiently, and at a cost that supports return-on-investment requirements, he argues.
So far, firms have rarely had clear success with big data or artificial intelligence projects. “Only 15 per cent of surveyed businesses report deploying big data projects to production,” says IDC analyst Merv Adrian.
We should not be surprised. Big waves of information technology investment have in the past taken quite some time to show up in the form of measurable productivity increases.
In fact, there was a clear productivity paradox when enterprises began to spend heavily on information technology in the 1980s.
“From 1978 through 1982 U.S. manufacturing productivity was essentially flat,” said Wickham Skinner, writing in the Harvard Business Review.
Researchers even have a name for this hypothesis about IT investment and productivity: the Solow computer paradox. Yes, paradox.
Here is the problem: the paradox suggests that as more investment is made in information technology, worker productivity may go down instead of up.
Empirical evidence from the 1970s to the early 1990s fits the hypothesis.
Before investment in IT became widespread, the expected return on investment in terms of productivity was three percent to four percent, in line with the returns seen from mechanization and automation of the farm and factory sectors.
When IT was applied over the two decades from 1970 to 1990, the typical return on investment was only one percent.
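To see how stark that gap is, here is a quick compounding sketch. The 3–4 percent and 1 percent rates are the figures cited above; the 20-year horizon and the midpoint rate are illustrative assumptions:

```python
# Compare cumulative productivity growth at the ~3.5% return associated with
# farm/factory mechanization versus the ~1% observed for IT from 1970 to 1990.
# Rates are from the article; the 20-year horizon is an assumed illustration.
def cumulative_gain(annual_return: float, years: int) -> float:
    """Total growth factor from compounding a yearly productivity return."""
    return (1 + annual_return) ** years

mechanization = cumulative_gain(0.035, 20)  # midpoint of 3-4 percent
it_investment = cumulative_gain(0.01, 20)

print(f"Mechanization-era gain over 20 years: {mechanization:.2f}x")
print(f"IT-era gain over 20 years:            {it_investment:.2f}x")
```

Compounded over two decades, the mechanization-era rate roughly doubles productivity, while the IT-era rate yields only about a 22 percent improvement.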
This productivity paradox is not new. For decades, information technology investments did not measurably improve white-collar productivity. Indeed, it can be argued that researchers failed to measure any improvement in productivity at all, so some might argue nearly all the investment was wasted.
Some now argue there is a lag between the massive introduction of new information technology and measurable productivity results, and that this lag might take a decade or two to emerge.
The problem is that such a lag falls far outside the payback window used by virtually any private-sector organization. That might suggest we inevitably will see disillusionment with the results of artificial intelligence investment.
One also can predict that many promising firms with good technology will fail to reach sustainability, and will instead be acquired by bigger firms able to sustain the long wait for a payoff.
So it would be premature to say too much about when we will see the actual impact of widespread application of artificial intelligence to business processes. It is possible to predict, though, that as was the case for earlier waves of IT investment, simply automating existing processes will not help.
Organizations have to recraft existing processes and create brand-new ones before the IT investment actually yields results.
One possibly mistaken idea is that productivity advances hinge mainly on “human” processes.
Skinner argues that a “40-40-20” rule applies when it comes to measurable benefits. Roughly 40 percent of any manufacturing-based competitive advantage derives from long-term changes in manufacturing structure (decisions about the number, size, location, and capacity of facilities) and from basic approaches in materials and workforce management.
Another 40 percent of improvement comes from major changes in equipment and process technology.
The final 20 percent of gain is produced by conventional approaches to productivity improvement, such as substituting capital for labor.
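The 40-40-20 split can be sketched as a simple decomposition. The percentages are Skinner's; the 10 percent total gain used here is an assumed example, not a figure from the article:

```python
# Decompose a hypothetical total competitive-advantage gain using
# Skinner's 40-40-20 rule. The split is from the article; the 10%
# total gain passed in below is an assumed illustration.
def skinner_split(total_gain: float) -> dict:
    return {
        "manufacturing structure and workforce/materials management": 0.40 * total_gain,
        "equipment and process technology": 0.40 * total_gain,
        "conventional productivity improvement (capital for labor)": 0.20 * total_gain,
    }

for source, share in skinner_split(0.10).items():
    print(f"{source}: {share:.1%}")
```

The point of the decomposition is that conventional cost-cutting contributes the smallest share of the total.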
In other words, and colloquially, firms cannot “cut their way to success.” Quality, reliable delivery, short lead times, customer service, rapid product introduction, flexible capacity, and efficient capital deployment arguably were the sources of business advantage in earlier waves of IT investment.
The search for those values, not cost reduction, was the primary source of advantage. The next wave will be the production of insights from huge amounts of unstructured data: insights that allow accurate predictions about when to conduct maintenance on machines; how to direct flows of people, vehicles, materials, and goods; when medical attention is needed; and what goods to stock, market, and promote, and when.
Of course, there is another thesis about the productivity paradox. Perhaps we do not know how to quantify quality improvements wrought by application of the technology. The classic example is computers that cost about the same as they used to, but are orders of magnitude more powerful.
Even if true, it is not so helpful that we cannot measure, in any agreed-upon way, quality improvements that produce far better products sold at the same or lower cost. Economies based on services have an even worse problem, since services productivity is both difficult to improve and hard to quantify.
The bad news is that disappointment over the value of AI investments will inevitably result in disillusionment. That condition might persist for quite some time, until most larger organizations have been able to recraft their processes in ways that build directly on AI.