
Saturday, December 12, 2020

Work from Home and the Solow Productivity Paradox

It is easy, but perhaps wrong, to attribute many types of change to “Covid-19” or to the responses made to the pandemic. To be sure, the work-from-home and learn-from-home modes required by governments to slow the spread were a precipitating event. They arguably sped up trends already in place and convinced larger numbers of people and firms to consider joining those trends, such as substituting Zoom video conferences for older meeting formats. 


With good reason, increased amounts of work from home are viewed as a permanent shift in venues where many types of work are done on a routine basis. The conventional wisdom is that hybrid models will dominate, with more workers spending parts of the week working from home, rather than “in the office.”


source: Researchgate  


But it is worth noting that this “remote work” trend has been in place and growing for more than 50 years, though we used to call it “telecommuting.” 


source: Federal Reserve Bank of St. Louis 


The point is that forecasters have expected a huge increase in remote work patterns for quite some time. 



So it might be safe to say that a permanent change in remote work arrangements will happen. But the change might be more gradual than some believe. 


There might be unexpected barriers in the form of cost issues, as has proven true in the past, for at least some firms. 


More importantly, it is hard enough to measure office worker productivity at all. It will be devilishly difficult to determine what impact remote work in large doses might have on productivity. 


Obviously, at some level of productivity (higher, same, lower), many types of work can be performed remotely, at home. 


source: McKinsey


But productivity is an issue. To be sure, most of us assume that greater investment in and use of technology improve productivity. That might not be true, or true only under some circumstances. 


Investing in more information technology has often failed to boost productivity. Others would argue the gains are there, just hard to measure. There is evidence to support either conclusion. 


Most of us likely assume quality broadband “must” boost productivity. Except when it does not. The consensus view on broadband access for business is that it leads to higher productivity. 


But a study by Ireland’s Economic and Social Research Institute finds only “small positive associations between broadband and firms’ productivity levels,” and that “none of these effects are statistically significant.”


“We also find no significant effect looking across all service sector firms taken together,” ESRI notes. “These results are consistent with those of other recent research that suggests the benefits of broadband for productivity depend heavily upon sectoral and firm characteristics rather than representing a generalised effect.”


“Overall, it seems that the benefits of broadband to particular local areas may vary substantially depending upon the sectoral mix of local firms and the availability of related inputs such as highly educated labour and appropriate management,” says ESRI.
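
To make concrete what a “small positive but statistically insignificant” association looks like, here is a minimal illustrative sketch in Python. It is not the ESRI methodology; the firm count, effect size and noise level are invented purely to show how a small positive coefficient can come with a confidence interval that straddles zero.

import numpy as np

# Hypothetical illustration only: 200 firms, broadband adds a tiny 1 percent
# productivity bump that is swamped by firm-level noise.
rng = np.random.default_rng(42)
n = 200
broadband = rng.integers(0, 2, size=n)                  # 1 = firm has broadband
log_productivity = 0.01 * broadband + rng.normal(0, 0.5, size=n)

# Ordinary least squares by hand: log_productivity = a + b * broadband
X = np.column_stack([np.ones(n), broadband])
coef, _, _, _ = np.linalg.lstsq(X, log_productivity, rcond=None)
resid = log_productivity - X @ coef
se_b = np.sqrt((resid @ resid / (n - 2)) * np.linalg.inv(X.T @ X)[1, 1])
b = coef[1]

print(f"estimated broadband effect: {b:.3f} (standard error {se_b:.3f})")
print(f"95% confidence interval: [{b - 1.96 * se_b:.3f}, {b + 1.96 * se_b:.3f}]")
# The interval will typically include zero: a small positive association
# that is not statistically significant.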


Most of us are hopeful about the value of the internet of things. But productivity always is hard to measure, and it is harder still when many inputs change simultaneously. Consider the impact of electricity on agricultural productivity.


“While initial adoption offered direct benefits from 1915 to 1930, productivity grew at a faster rate beginning in 1935, as electricity, along with other inputs in the economy such as the personal automobile, enabled new, more efficient and effective ways of working,” the National Bureau of Economic Research says.  


There are at least two big problems with the “electricity caused productivity to rise” argument. The first is that other inputs also changed, so we cannot isolate any specific driver. Note that the automobile, itself generally considered a general-purpose technology, was introduced at the same time.


Since 1970, global productivity growth has slowed, despite an increasing application of technology in the economy overall, starting especially in the 1980s. 

 

A corollary: has information technology boosted living standards? Not so much,  some say. The absence of huge productivity gains has created what economists call the “productivity paradox.”


Basically, the paradox is that the official statistics have not borne out the productivity improvements expected from new technology.

 

Still, the productivity paradox seems to exist. Before investment in IT became widespread, the expected return on investment in terms of productivity was three percent to four percent, in line with what was seen in mechanization and automation of the farm and factory sectors.


When IT was applied over two decades from 1970 to 1990, the normal return on investment was only one percent.
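
A quick back-of-the-envelope compounding calculation, using only the rates quoted above and the 20-year horizon from 1970 to 1990, shows how large the gap between a one percent and a three-to-four percent annual return becomes over time:

# Rough compounding illustration using the figures cited above; these are not
# results from any dataset, just arithmetic on the quoted rates.
years = 20

def cumulative_gain(annual_rate, years):
    """Total percentage gain from compounding an annual rate over a number of years."""
    return ((1 + annual_rate) ** years - 1) * 100

it_era_gain = cumulative_gain(0.01, years)         # roughly +22 percent over 20 years
farm_factory_gain = cumulative_gain(0.035, years)  # roughly +99 percent over 20 years

print(f"1.0% per year for {years} years: +{it_era_gain:.0f}% cumulative")
print(f"3.5% per year for {years} years: +{farm_factory_gain:.0f}% cumulative")

Compounded over two decades, the one percent path produces barely a fifth of the cumulative gain of the three-to-four percent path, which is why the gap attracted so much attention.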


This productivity paradox is not new. Information technology investments did not measurably help improve white collar job productivity for decades. In fact, it can be argued that researchers have failed to measure any improvement in productivity. So some might argue nearly all the investment has been wasted.


Some now argue there is a lag between the massive introduction of new information technology and measurable productivity results, and that this lag might conceivably take a decade or two to emerge.


Work from home trends were catalyzed by the pandemic, to be sure. Many underlying rates of change were accelerated. But the underlying remote work trends had been in place for decades, and have long been expected to grow sharply. 


Whether that is good, bad or indifferent for productivity remains to be seen. The Solow productivity paradox suggests that applied technology can boost--or lower--productivity. Though perhaps shocking, it appears that the productivity impact of technology adoption can be negative.

Sunday, August 1, 2021

Prepare for Digital Transformation Disappointment

Prepare for digital transformation disappointment. The investments firms and organizations are rushing to make to “digitally transform” will largely fail, history suggests. For starters, the theory is that whole business processes can be transformed.


But those are the thorniest, toughest problems to solve in the short to medium term, as human organization and habits must change, not simply the computer tools people use. 


Secondly, digital transformation necessarily involves big changes in how things are done, requiring significant application of computing technology. Historically, big information technology projects have failed about 70 percent of the time.


Finally, understanding how best to use a new technology approach takes some time, as suggested by prior technology paradoxes. 


Many technologists noted the lag in productivity growth as computer technology was introduced. In the 1970s and 1980s, business investment in computer technology was increasing by more than 20 percent per year, but productivity growth fell instead of rising. 


So the productivity paradox is not new. Massive investments in technology do not always result in measurable gains. In fact, the result is sometimes negative productivity. 


Information technology investments did not measurably help improve white collar job productivity for decades, through the 1980s and earlier. In fact, it can be argued that researchers have failed to measure any improvement in productivity. So some might argue nearly all the investment has been wasted.


Some now argue there is a similar lag between the massive introduction of new information technology and measurable productivity results, and that this lag might conceivably take a decade or two to emerge. 


The Solow productivity paradox suggests that applied technology can boost--or lower--productivity. Though perhaps shocking, it appears that the productivity impact of technology adoption can be negative.


The productivity paradox is what we began to call it. In fact, investing in more information technology has often failed to boost productivity. Others would argue the gains are there, just hard to measure. Still, it is hard to claim improvement when we cannot measure it. 


Most of us are hopeful about the value of the internet of things. But productivity always is hard to measure, and it is harder still when many inputs change simultaneously. Consider the impact of electricity on agricultural productivity.


“While initial adoption offered direct benefits from 1915 to 1930, productivity grew at a faster rate beginning in 1935, as electricity, along with other inputs in the economy such as the personal automobile, enabled new, more efficient and effective ways of working,” the National Bureau of Economic Research says.  


There are at least two big problems with the “electricity caused productivity to rise” argument. The first is that other inputs also changed, so we cannot isolate any specific driver. Note that the automobile, itself generally considered a general-purpose technology, was introduced at the same time.


Since 1970, global productivity growth has slowed, despite an increasing application of technology in the economy overall, starting especially in the 1980s. “From 1978 through 1982 U.S. manufacturing productivity was essentially flat,” said Wickham Skinner, writing in the Harvard Business Review. 


Skinner argues that there is a “40 40 20” rule when it comes to measurable IT investment benefits. Roughly 40 percent of any manufacturing-based competitive advantage derives from long-term changes in manufacturing structure (decisions about the number, size, location, and capacity of facilities) and from basic approaches in materials and workforce management.


Another 40 percent of improvement comes from major changes in equipment and process technology.


The final 20 percent of gain is produced by conventional approaches to productivity improvement (substituting capital for labor).


Cloud computing also is viewed as something of a disappointment by C-suite executives, as important as it is.  

 

A corollary: has information technology boosted living standards? Not so much,  some say.


By the late 1990s, increased computing power combined with the Internet to create a new period of productivity growth that seemed more durable. By 2004, productivity growth had slowed again to its earlier lethargic pace. 


Today, despite very real advances in processing speed, broadband penetration, artificial intelligence and other things, we seem to be in the midst of a second productivity paradox in which we see digital technology everywhere except in the economic statistics.


Despite the promise of big data, industrial enterprises are struggling to maximize its value.  A survey conducted by IDG showed that “extracting business value from that data is the biggest challenge the Industrial IoT presents.”


Why? Abundant data by itself solves nothing, says Jeremiah Stone, GM of Asset Performance Management at GE Digital.


Its unstructured nature, sheer volume, and variety exceed the capacity of humans and traditional tools to organize it efficiently, and at a cost that supports return on investment requirements, he argues.


At least so far, firms  "rarely" have had clear success with big data or artificial intelligence projects. "Only 15 percent of surveyed businesses report deploying big data projects to production,” says IDC analyst Merv Adrian.


So we might as well be prepared for a similar wave of disappointment over digital transformation. The payoff might be a decade or more into the future, for firms investing now.


Friday, October 1, 2021

Is "Digital Transformation" Simply the Latest Buzzword for Decades Worth of Applied Digital Technology?

Skeptics or cynics might argue that much of what passes for today’s digital transformation is simply the latest buzzword for applying technology to business processes. The desired outcomes behind the buzzword might include agility, personalization, automation, data-driven decision making, improved customer experience or any other set of information technology outcomes. 


source: Fujitsu 


Of course, businesses, organizations and consumers have been adding digital products to their everyday lives for decades. And it would be hard to clearly differentiate any of the desired outcomes of technology deployment since the mid-1980s from the outcomes of digital transformation.


source: Fujitsu 


Of course, the problem is that all information technology becomes commoditized. Any single firm would gain sustainable advantage if it were the only firm in its industry to adopt a particular technology.


The problem is that this is never possible. Eventually, all competitors in any industry have access to, and use, all the relevant technologies. Though efficiency or effectiveness might arguably improve, it improves for all competitors in the market, eventually negating any first-mover advantage a single firm might have tried to gain. 


The other problem is that applying technology does not often seem to yield tangible advantages in terms of productivity. This productivity paradox has been noted since the 1980s, when enterprises began to apply lots of digital technology to their businesses. 


Many technologists noted the lag in productivity growth as computer technology was introduced. In the 1970s and 1980s, business investment in computer technology was increasing by more than 20 percent per year, but productivity growth fell instead of rising. 


So the productivity paradox is not new. Massive investments in technology do not always result in measurable gains. In fact, the result is sometimes negative productivity. 


Information technology investments did not measurably help improve white collar job productivity for decades, through the 1980s and earlier. In fact, it can be argued that researchers have failed to measure any improvement in productivity. So some might argue nearly all the investment has been wasted.


Some now argue there is a similar lag between the massive introduction of new information technology and measurable productivity results, and that this lag might conceivably take a decade or two to emerge. 


The Solow productivity paradox suggests that applied technology can boost--or lower--productivity. Though perhaps shocking, it appears that the productivity impact of technology adoption can be negative.


The productivity paradox is what we began to call it. In fact, investing in more information technology has often failed to boost productivity. Others would argue the gains are there, just hard to measure. Still, it is hard to claim improvement when we cannot measure it. 


Wednesday, June 7, 2017

Why AI Investment is Going to (Initially) Disappoint

Despite the promise of big data, industrial enterprises are struggling to maximize its value.  A survey conducted by IDG showed that “extracting business value from that data is the biggest challenge the Industrial IoT presents.”

Why? Abundant data by itself solves nothing, says Jeremiah Stone, GM of Asset Performance Management at GE Digital.

Its unstructured nature, sheer volume, and variety exceed the capacity of humans and traditional tools to organize it efficiently, and at a cost that supports return on investment requirements, he argues.

At least so far, firms  "rarely" have had clear success with big data or artificial intelligence projects. "Only 15 per cent of surveyed businesses report deploying big data projects to production,” says IDC analyst Merv Adrian.

We should not be surprised. Big waves of information technology investment have in the past taken quite some time to show up in the form of measurable productivity increases.

In fact, there was a clear productivity paradox when enterprises began to spend heavily on information technology in the 1980s.

“From 1978 through 1982 U.S. manufacturing productivity was essentially flat,” said Wickham Skinner, writing in the Harvard Business Review.

In fact, researchers have created a hypothesis about the application of IT for productivity: the Solow computer paradox. Yes, paradox.

Here’s the problem: the rule suggests that as more investment is made in information technology, worker productivity may go down instead of up.

Empirical evidence from the 1970s to the early 1990s fits the hypothesis.  

Before investment in IT became widespread, the expected return on investment in terms of productivity was three percent to four percent, in line with what was seen in mechanization and automation of the farm and factory sectors.

When IT was applied over two decades from 1970 to 1990, the normal return on investment was only one percent.

This productivity paradox is not new. Information technology investments did not measurably help improve white collar job productivity for decades. In fact, it can be argued that researchers have failed to measure any improvement in productivity. So some might argue nearly all the investment has been wasted.

Some now argue there is a lag between the massive introduction of new information technology and measurable productivity results, and that this lag might conceivably take a decade or two to emerge.

The problem is that this is far outside the payback window used by virtually any private sector organization. That might suggest we inevitably will see disillusionment with the results of artificial intelligence investment.

One also can predict that many promising firms with good technology will fail to reach sustainability before they are acquired by bigger firms able to sustain the long wait to a payoff.

So it would be premature to say too much about when we will see the actual impact of widespread artificial intelligence application to business processes. It is possible to predict that, as was the case for earlier waves of IT investment, it will not help simply to automate existing processes.

Organizations have to recraft and create brand new business processes before the IT investment actually yields results.

One possibly mistaken idea is that productivity advances actually hinge on “human” processes.

Skinner argues that there is a “40 40 20” rule when it comes to measurable benefits. Roughly 40 percent of any manufacturing-based competitive advantage derives from long-term changes in manufacturing structure (decisions about the number, size, location, and capacity of facilities) and from basic approaches in materials and workforce management.

Another 40 percent of improvement comes from major changes in equipment and process technology.

The final 20 percent of gain is produced by conventional approaches to productivity improvement (substituting capital for labor).

In other words, and colloquially, firms cannot “cut their way to success.” Quality, reliable delivery, short lead times, customer service, rapid product introduction, flexible capacity, and efficient capital deployment arguably were sources of business advantage in earlier waves of IT investment.

But the search for those values, not cost reduction, was the primary source of advantage. The next wave will be the production of insights from huge amounts of unstructured data that allow accurate predictions to be made about when to conduct maintenance on machines, how to direct flows of people, vehicles, materials and goods, when medical attention is needed, what goods to stock, market and promote, and when.

Of course, there is another thesis about the productivity paradox. Perhaps we do not know how to quantify quality improvements wrought by application of the technology. The classic example is computers that cost about the same as they used to, but are orders of magnitude more powerful.

It is not so helpful, even if true, that we cannot measure in some agreed-upon way the quality improvements that produce far better products sold at the same or lower cost. Economies based on services have an even worse problem, since services productivity is notoriously difficult to quantify.
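
A toy calculation, with invented numbers, shows why unmeasured quality matters: if a computer sells for the same nominal price but delivers ten times the performance, a quality-adjusted deflator records a roughly 90 percent price decline, while a naive comparison records no change at all.

# Invented numbers, purely to illustrate the measurement problem:
# same sticker price, ten times the performance.
old_price = 1000.0
new_price = 1000.0
performance_multiple = 10.0

# Naive view: the price is unchanged, so measured real output looks flat.
naive_price_change = (new_price / old_price) - 1.0

# Quality-adjusted view: the price per unit of performance falls 90 percent.
quality_adjusted_price = new_price / performance_multiple
quality_adjusted_change = (quality_adjusted_price / old_price) - 1.0

print(f"naive price change: {naive_price_change:+.0%}")                  # +0%
print(f"quality-adjusted price change: {quality_adjusted_change:+.0%}")  # -90%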

The bad news is that disappointment over the value of AI investments will inevitably result in disillusionment. And that condition might exist for quite some time, until most larger organizations have been able to recraft their processes in a way that builds directly on AI.


Thursday, June 17, 2021

CxOs Disappointed (So Far) by Cloud Computing Value

It should come as no surprise that cloud computing payback lags CxO expectations in areas such as resilience, agility, decision making, innovation, customer experience, profits, talent recruitment and retention, costs and reputation. 


All of those business processes are shaped by many other inputs than mere applied technology. And the general rule with any important new technology is that the value is not recognized until core business processes are reshaped to take advantage of the new technology. That is as likely to happen with cloud computing as with any other important new tools. 


Any major shift in technology and related business processes takes time. So much time that there often is a “productivity paradox” where investments do not seem to make much difference in outcomes for a decade or more. 


Nokia has noted that manufacturing productivity growth since the 1980s has been slight, in the range of one percent per year, despite all the information technology applied to manufacturing. 

source: PwC 


Despite the promise of big data, industrial enterprises are struggling to maximize its value.  A survey conducted by IDG showed that “extracting business value from that data is the biggest challenge the Industrial IoT presents.”


Why? Abundant data by itself solves nothing, says Jeremiah Stone, GM of Asset Performance Management at GE Digital. At least one study suggests similar findings for broadband internet access as well. 


The consensus view on broadband access for business is that it leads to higher productivity. But a study by Ireland’s Economic and Social Research Institute finds only “small positive associations between broadband and firms’ productivity levels,” and that “none of these effects are statistically significant.”


“We also find no significant effect looking across all service sector firms taken together,” ESRI notes. “These results are consistent with those of other recent research that suggests the benefits of broadband for productivity depend heavily upon sectoral and firm characteristics rather than representing a generalised effect.”


“Overall, it seems that the benefits of broadband to particular local areas may vary substantially depending upon the sectoral mix of local firms and the availability of related inputs such as highly educated labour and appropriate management,” says ESRI.


 Big waves of information technology investment have in the past taken quite some time to show up in the form of measurable productivity increases.


In fact, there was a clear productivity paradox when enterprises began to spend heavily on information technology in the 1980s.


“From 1978 through 1982 U.S. manufacturing productivity was essentially flat,” said Wickham Skinner, writing in the Harvard Business Review.


In fact, researchers have created a hypothesis about the application of IT for productivity: the Solow computer paradox. 


Here’s the problem: the rule suggests that as more investment is made in information technology, worker productivity may go down instead of up.


Empirical evidence from the 1970s to the early 1990s fits the hypothesis.  


Before investment in IT became widespread, the expected return on investment in terms of productivity was three percent to four percent, in line with what was seen in mechanization and automation of the farm and factory sectors.


When IT was applied over two decades from 1970 to 1990, the normal return on investment was only one percent. 


This productivity paradox is not new. Information technology investments did not measurably help improve white collar job productivity for decades.


To be sure, some argue that the issue is our inability to measure productivity gains. It is happening, but we are unable to measure it, many would argue. That argument will not win many supporters in the CxO suites. 


Still, the disappointment is to be expected. It will take time to reap the measurable benefits of cloud, 5G, edge computing, internet of things or any other major new technology.

