Sunday, August 1, 2021

Prepare for Digital Transformation Disappointment

Prepare for digital transformation disappointment. History suggests that the investments firms and organizations are rushing to make to “digitally transform” will largely fail. For starters, the theory is that whole business processes can be transformed.


But those are the thorniest problems to solve in the short to medium term, because human organization and habits must change, not simply the computer tools people use.


Secondly, digital transformation (DX) necessarily involves big changes in how things are done, requiring significant application of computing technology. Historically, big information technology projects have failed about 70 percent of the time.


Finally, understanding how best to use a new technology approach takes some time, as suggested by prior technology paradoxes. 


Many technologists noted the lag in productivity growth during the 1970s and 1980s as computer technology was introduced. In those decades, business investment in computer technology was increasing by more than 20 percent per year. But productivity growth fell instead of rising.
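To get a feel for the scale of that spending, here is a minimal sketch of the compounding arithmetic. The 20 percent annual growth rate is the figure cited above; the starting index of 100 is a hypothetical base chosen only to make the multiples easy to read.

```python
# Compounding illustration: investment growing "more than 20 percent per year."
# The 20 percent rate is the figure cited above; the base index of 100 is
# hypothetical, used only to make the multiples easy to read.
investment_index = 100.0
for year in range(1, 11):
    investment_index *= 1.20
    print(f"Year {year:2d}: index {investment_index:6.1f}")
# After a decade at 20 percent per year, the index reaches roughly 619,
# a more than sixfold increase, even as measured productivity growth fell.
```

At that pace, investment more than sextuples in a decade, which is what makes the flat-to-falling productivity numbers of the period so striking.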


So the productivity paradox is not new. Massive investments in technology do not always produce measurable gains. In fact, productivity sometimes declines.


For decades, through the 1980s and earlier, information technology investments did not measurably improve white-collar productivity. In fact, it can be argued that researchers failed to measure any improvement in productivity at all. So some might argue nearly all that investment was wasted.


Some now argue there is a similar lag between the massive introduction of new information technology and measurable productivity results, and that this lag might conceivably take a decade or two to emerge.


The Solow productivity paradox suggests that applied technology can boost, or lower, productivity. Though perhaps shocking, the productivity impact of technology adoption can be negative.


This is what we began to call the productivity paradox: investing in more information technology has often failed to boost productivity. Others would argue the gains are there, just hard to measure. Still, it is hard to claim improvement when we cannot measure it.


Most of us are hopeful about the value of the internet of things. But productivity always is hard to measure, and it is harder still when many inputs change simultaneously. Consider the impact of electricity on agricultural productivity.


“While initial adoption offered direct benefits from 1915 to 1930, productivity grew at a faster rate beginning in 1935, as electricity, along with other inputs in the economy such as the personal automobile, enabled new, more efficient and effective ways of working,” the National Bureau of Economic Research says.  


There are at least two big problems with the “electricity caused productivity to rise” argument. The first is that other inputs changed at the same time, so we cannot isolate any single driver. Note that the automobile, also generally considered a general-purpose technology, was introduced over the same period.


The second problem is that, since 1970, global productivity growth has slowed despite increasing application of technology across the economy, especially from the 1980s onward. “From 1978 through 1982 U.S. manufacturing productivity was essentially flat,” said Wickham Skinner, writing in the Harvard Business Review.


Skinner argues there is a “40-40-20” rule when it comes to measurable IT investment benefits. Roughly 40 percent of any manufacturing-based competitive advantage derives from long-term changes in manufacturing structure (decisions about the number, size, location, and capacity of facilities) and basic approaches in materials and workforce management.


Another 40 percent of improvement comes from major changes in equipment and process technology.


The final 20 percent of gain comes from conventional approaches to productivity improvement (substituting capital for labor).
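To make Skinner’s split concrete, here is a minimal sketch that apportions a hypothetical improvement target across the three buckets. The 40-40-20 percentages are Skinner’s; the $10 million target and the helper function are illustrative assumptions, not from his article.

```python
# Back-of-the-envelope illustration of Skinner's 40-40-20 rule.
# The percentage split is Skinner's; the $10M target and this helper
# function are hypothetical, for illustration only.

SKINNER_SPLIT = {
    "manufacturing structure, materials and workforce management": 0.40,
    "equipment and process technology": 0.40,
    "conventional productivity programs (capital for labor)": 0.20,
}

def apportion(total_gain: float) -> dict:
    """Split a total expected gain across Skinner's three sources."""
    return {source: total_gain * share for source, share in SKINNER_SPLIT.items()}

for source, gain in apportion(10_000_000).items():
    print(f"${gain:>12,.0f} from {source}")
```

On Skinner’s account, the conventional capital-for-labor levers account for only the smallest share of the gains.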


Cloud computing, as important as it is, also is viewed as something of a disappointment by C-suite executives.


A corollary question: has information technology boosted living standards? Not so much, some say.


By the late 1990s, increased computing power combined with the Internet to create a new period of productivity growth that seemed more durable. By 2004, productivity growth had slowed again to its earlier lethargic pace. 


Today, despite very real advances in processing speed, broadband penetration, artificial intelligence and other areas, we seem to be in the midst of a second productivity paradox, in which we see digital technology everywhere except in the economic statistics.


Despite the promise of big data, industrial enterprises are struggling to maximize its value. A survey conducted by IDG showed that “extracting business value from that data is the biggest challenge the Industrial IoT presents.”


Why? Abundant data by itself solves nothing, says Jeremiah Stone, GM of Asset Performance Management at GE Digital.


The unstructured nature, sheer volume, and variety of that data exceed the capacity of humans and traditional tools to organize it efficiently, and at a cost that supports return-on-investment requirements, he argues.


At least so far, firms “rarely” have had clear success with big data or artificial intelligence projects. “Only 15 percent of surveyed businesses report deploying big data projects to production,” says IDC analyst Merv Adrian.


So we might as well be prepared for a similar wave of disappointment over digital transformation. For firms investing now, the payoff might be a decade or more in the future.

