
Thursday, September 11, 2025

70% of IT and AI Projects Fail for Simple Reasons

Information technology investments are often treated as purely technical endeavors rather than as organizational transformations requiring changes in processes, culture, and human behavior. That mismatch may explain much of the gap between IT investment and observed results.


source: McKinsey 


Some of the exceptions might be retail and communications and media, where productivity gains seem higher. In communications and media, technology often is the platform for the services themselves, so its value is more directly observable.


source: McKinsey 


Still, many studies suggest that IT projects have a high failure rate overall. 


| Study/Source | Year | Key Finding | Sample Size/Scope | Failure Rate/Metric |
| --- | --- | --- | --- | --- |
| Standish Group CHAOS Report | 2020 | Only 31% of IT projects are successful (on time, on budget, with required features) | 50,000+ projects across multiple industries | 69% challenged or failed |
| McKinsey Global Institute | 2012 | Large IT projects run 45% over budget and 7% over time, while delivering 56% less value than predicted | Analysis of 5,400 IT projects | 17% of projects are "black swans" with cost overruns >200% |
| Harvard Business Review - Flyvbjerg & Budzier | 2011 | Average cost overrun for large IT projects is 27%, with one in six projects having cost overruns of 200% | Study of IT project performance patterns | 16.7% massive overruns |
| PwC Global CEO Survey | 2019 | 73% of CEOs believe their digital investments are not delivering expected returns | 1,378 CEOs globally | 73% not meeting ROI expectations |
| Deloitte Tech Trends | 2021 | 70% of digital transformation initiatives fail to meet their goals | Survey of 1,000+ executives | 70% failure to meet objectives |
| MIT Sloan - Brynjolfsson & Hitt | 2003 | IT productivity paradox: firms with higher IT spending don't always show proportional productivity gains | Longitudinal study of 527 large firms | Mixed correlation between IT spending and productivity |
| Gartner IT Spending Analysis | 2019 | 85% of big data projects fail to deliver business value | Analysis of enterprise big data initiatives | 85% failure rate |
| Accenture Technology Vision | 2020 | Only 37% of organizations successfully scale their digital pilots to enterprise-wide implementations | Survey of 4,000+ business and IT executives | 63% fail to scale successfully |
| Boston Consulting Group | 2018 | 70% of digital transformation efforts fall short of their goals | Analysis of transformation initiatives across industries | 70% shortfall rate |
| KPMG Global CEO Outlook | 2018 | 65% of CEOs question whether their technology investments create competitive advantage | Survey of 1,300 CEOs | 65% uncertain about competitive value |
| IBM Institute for Business Value | 2019 | Organizations realize only 20% of anticipated benefits from AI investments | Study of AI implementation across enterprises | 80% benefit shortfall |
| Forrester Research | 2020 | 60% of customer experience technology investments fail to improve customer satisfaction scores | Analysis of CX technology implementations | 60% fail to improve target metrics |
| EY Digital Transformation Study | 2018 | 55% of digital transformation programs are abandoned before completion | Survey of 500+ executives across industries | 55% abandonment rate |
| Capgemini Digital Transformation Institute | 2017 | Only 36% of organizations are digital transformation leaders achieving significant benefits | Study of 1,000+ organizations globally | 64% are laggards or followers |
| McKinsey Technology Trends | 2021 | Cloud migration projects deliver only 65% of expected cost savings on average | Analysis of cloud transformation initiatives | 35% savings shortfall |
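
The overrun and shortfall figures compound. A back-of-the-envelope sketch of the McKinsey (2012) numbers, using a hypothetical project plan (the planned cost and value below are illustrative assumptions, not from the study):

```python
# How the McKinsey (2012) figures compound (a back-of-the-envelope
# sketch; the planned cost and value below are illustrative numbers).

planned_cost, planned_value = 100.0, 150.0   # hypothetical project plan

actual_cost = planned_cost * 1.45            # runs 45% over budget
actual_value = planned_value * (1 - 0.56)    # delivers 56% less value

plan_ratio = planned_value / planned_cost    # 1.50
actual_ratio = actual_value / actual_cost    # ~0.46

print(f"planned value/cost: {plan_ratio:.2f}, actual: {actual_ratio:.2f}")
# A project planned to return 1.5x its cost ends up returning ~0.46x:
# overruns and value shortfalls together turn a good business case negative.
```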

Tuesday, June 3, 2025

"Minimal" Economic Impact of AI Chatbots, Study Suggests

With the obvious caveat that investment in new technology often does not produce immediate measurable results, a study of the economic impact of large language models in Denmark suggests the effects so far are very slight.


Indeed, the study authors say “AI chatbots have had no significant impact on earnings or recorded hours in any occupation.” 


The study, published by the U.S. National Bureau of Economic Research, draws on two large-scale adoption surveys conducted in late 2023 and 2024, covering 11 occupations, 25,000 workers, and 7,000 workplaces.


Productivity gains were said to be modest, with an average time savings of three percent. But the study notes that AI chatbots have created new job tasks for 8.4 percent of workers, including some who do not use the tools themselves.


Nor has there been any impact on worker earnings. “Workers overwhelmingly report no impact on earnings as of November 2024,” the study says. 


Nor do productivity gains seem to have much impact on earnings. “We estimate that only three to seven percent of workers’ productivity gains are passed through to higher earnings,” say authors Anders Humlum and Emilie Vestergaard.
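
Taken together, the time savings and pass-through figures imply a tiny earnings effect. A minimal back-of-the-envelope sketch (assuming, purely for illustration, that the three percent average time savings translates one-to-one into productivity, which the study does not claim):

```python
# Back-of-the-envelope check of the study's pass-through numbers
# (a minimal sketch; assumes the 3% average time savings translates
# one-to-one into productivity, which the study does not claim).

time_savings = 0.03                               # average time saved, per the study
pass_through_low, pass_through_high = 0.03, 0.07  # share of gains reaching wages

earnings_low = time_savings * pass_through_low    # 0.0009, i.e. 0.09%
earnings_high = time_savings * pass_through_high  # 0.0021, i.e. 0.21%

print(f"Implied earnings gain: {earnings_low:.2%} to {earnings_high:.2%}")
# Implied earnings gain: 0.09% to 0.21%, small enough to be
# statistically indistinguishable from zero in survey data.
```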


“Comparing workplaces with high versus low rates of chatbot usage, we find no evidence that firms with greater adoption have experienced differential changes in total employment, wage bills, or retention of incumbent workers,” the authors say.


The authors also note that Denmark has institutional characteristics similar to those of the United States: similar uptake of generative AI, low hiring and firing costs, decentralized wage bargaining, and annual wage negotiations.


The 11 occupations studied included accountants, customer support specialists, financial advisors, HR professionals, IT support specialists, journalists, legal professionals, marketing professionals, office clerks, software developers, and teachers.


The findings should not come as a surprise. The “productivity J-curve” suggests that initial investments in new technologies may temporarily suppress productivity before delivering long-term benefits.


| Study | Technology Examined | Lag Time Observed | Key Findings |
| --- | --- | --- | --- |
| McKinsey Global Institute | Digital technologies, AI | Years to decades | Benefits emerge after business process redesign and "creative destruction." Historical parallels (e.g., electric power) show lags of decades. Generative AI may shorten lags to months or years. |
| CEPR Study on French Industrialization | General-purpose technologies | 5–10 years | Firms delayed adoption due to uncertainty, and early adopters operated technologies inefficiently. Aggregate productivity gains materialized slowly as organizational practices evolved. |
| Stanford CS Analysis | IT investments | 2–5 years | Executives reported 5-year lags for IT payoffs. Complementary investments and learning curves delayed measurable productivity growth. |
| Productivity Paradox Research | IT, automation | 2–5 years | "Productivity J-curve" observed: short-term costs offset gains until workflows adapted. Measurable aggregate gains emerged in the 2000s from 1990s IT investments. |
| Brynjolfsson et al. (McKinsey) | Generative AI | Months to a few years | Shorter lag due to existing digital infrastructure, but still requires process redesign. Early adopters see inefficiencies before optimization. |
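
The J-curve dynamic in the table above can be illustrated with a toy model: complementary intangible spending (process redesign, training) is expensed immediately, depressing measured productivity until the delayed payoff arrives. All numbers below are illustrative assumptions, not estimates from any of the studies:

```python
# Toy illustration of the "productivity J-curve" (a sketch, not the
# Brynjolfsson-Rock-Syverson model): early spending on complementary
# intangibles is expensed immediately, depressing measured productivity
# before the payoff arrives.

baseline_output = 100.0
intangible_spend = 5.0   # annual complementary investment, years 0-4 (assumed)
payoff_per_year = 2.0    # extra output per year of prior investment (assumed)

for year in range(10):
    invest = intangible_spend if year < 5 else 0.0
    extra_output = payoff_per_year * min(year, 5)   # benefits lag investment
    measured = baseline_output + extra_output - invest
    print(f"year {year}: measured productivity index = {measured:.0f}")

# Measured productivity dips to 95 in year 0, crosses 100 again only in
# year 3, and settles near 110: the J-curve shape that makes early
# IT and AI investments look unproductive in the short run.
```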

Friday, May 30, 2025

Will AI Boost Productivity Enough to Pay for Universal Basic Income?

One important issue national policymakers will have to confront is what to do if artificial intelligence leads to widespread job losses because machines really can do work now done by humans.


At a high enough level of job loss, social dislocations are inevitable. Much then hinges on an unknown: how much AI can lift productivity to pay for social programs offsetting the lost jobs. It seems fairly obvious that a huge increase in productivity would be required.


One study suggests the required automation productivity gains would be about five to six times greater than those we have tended to see from earlier technology innovations.


| Study/Author(s) | Year | Key Contributions / AI Productivity Insights | Degree of AI Productivity Required for Robust UBI | Considerations/Caveats for UBI Implementation |
| --- | --- | --- | --- | --- |
| Krohn, J. | 2025 | Analyzes AI advancements, focusing on the shrinking size and running cost of LLMs and the increasing capabilities of AI agents. Notes that in short time-horizon settings, top AI systems can outperform human experts. | Provides no specific productivity "degree," but suggests that the increasing efficiency and decreasing cost of AI models will drive significant corporate adoption and productivity gains, implying an AI contribution to economic output substantial enough to support broad social programs. | Krohn's "State of AI" Report 2025 tracks current trends and capabilities rather than calculating a productivity threshold for UBI. Still, the reported improvements in model efficiency, cost, and agent performance imply AI is on a trajectory to generate the necessary wealth; the challenge shifts from whether AI can generate enough to how that wealth is distributed. |
| Nayebi, A. | 2025 | Develops a closed-form condition under which AI capital profits can sustainably finance UBI without additional taxes or new job creation in a Solow-Zeira economy. Examines how the AI capability threshold (productivity relative to pre-AI automation) varies. | AI systems must achieve roughly 5-6 times existing automation productivity to finance an 11%-of-GDP UBI in a worst-case scenario with no new job creation. The threshold falls to about 3 times if the public revenue share of AI capital profits is raised to 33% (from the current 15%). | A concrete, model-based estimate. Highlights the importance of the public revenue share of AI profits and of market structure (monopolistic markets could make UBI easier to fund due to higher rents). Assumes a worst-case scenario of no new job creation. |
| Goldman Sachs Report | 2023 (cited in various 2024/2025 discussions) | Projects that generative AI alone could automate 300 million full-time jobs globally while raising global GDP by an estimated 7%. | Implicitly treats the gains from automating 300 million jobs and a 7% rise in global GDP as the wellspring from which UBI could be funded; the sheer scale of the potential productivity increase is the key factor. | Focuses on AI's potential to create immense wealth, but the challenge remains distribution. The report states the overall economic impact rather than the specific "degree of productivity" needed to fund UBI. |
| Brynjolfsson, Li, and Raymond | 2023 (cited by Brookings) | Showed that call center operators became 14% more productive when using generative AI, with gains over 30% for the least experienced workers. Also noted improved customer sentiment and lower employee attrition. | Reports specific productivity gains for certain tasks and occupations; not an economy-wide figure, but these micro-level gains contribute to the larger picture of AI's productivity potential. | Early, task-specific examples. Scaling such individual gains to a robust, economy-wide UBI requires aggregation and consideration of broader economic impacts. |
| Korinek | 2023 (cited by Brookings) | Estimates that economists can be 10-20% more productive using large language models. | Like Brynjolfsson et al., a specific example of AI-driven productivity enhancement in a knowledge-based profession. | Micro-level gains that must be weighed against broader economic shifts and potential job displacement. |
| Santens, S. | 2025 (various articles/discussions) | Argues that AI-driven automation differs fundamentally from previous shifts, automating cognitive labor, decision-making, and creativity, producing productivity without widespread prosperity (i.e., rising inequality). | Argues that current AI-driven productivity has already decoupled from wage growth; the "degree of productivity" is already sufficient to address poverty, but the distribution mechanism is flawed. | The issue may not be a lack of AI productivity but how the economic benefits are concentrated. Advocates UBI as a redistribution tool. |
| Bertello, G. P., & Almeida, T. | 2025 | Analyze the history of UBI and argue for its necessity as a new social contract in the age of AI and automation, addressing wage inequality, job insecurity, and widespread job losses. | Does not quantify a specific productivity degree, but implicitly argues that AI's disruptive potential (job displacement plus enhanced productivity) necessitates UBI. | Emphasizes sustainable funding, investment in education, and attention to social and psychological aspects, not just economic and labor-market outcomes. |
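
The Nayebi (2025) thresholds rest on simple budget arithmetic: whatever share of AI capital profits the public captures must cover the UBI bill. A minimal sketch of that identity (not his closed-form Solow-Zeira condition, which also models capital accumulation and automation shares):

```python
# Rough funding arithmetic behind the Nayebi (2025) figures (a sketch;
# this is NOT his closed-form Solow-Zeira condition, just the budget
# identity that public revenue from AI profits must cover the UBI).

ubi_share_of_gdp = 0.11               # 11%-of-GDP UBI, per the table above

for revenue_share in (0.15, 0.33):    # public share of AI capital profits
    required_ai_profits = ubi_share_of_gdp / revenue_share
    print(f"revenue share {revenue_share:.0%}: AI profits must reach "
          f"{required_ai_profits:.0%} of GDP")

# revenue share 15%: AI profits must reach 73% of GDP
# revenue share 33%: AI profits must reach 33% of GDP
# Roughly doubling the public take roughly halves the required AI scale,
# consistent with the 5-6x versus 3x thresholds reported in the study.
```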


The core challenge is to ensure that the wealth generated by AI is sufficient to cover the UBI, without stifling innovation or leading to undesirable economic side effects such as inflation or capital flight. This implies that the rate of productivity growth must outpace the cost of UBI relative to the existing economy.


Even with significant productivity gains, the sheer cost of a meaningful UBI for an entire population is immense. For example, providing $12,000 per adult annually in the US would cost trillions of dollars.
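
A quick order-of-magnitude check of that example (the U.S. adult population figure below is an assumption, not from the post):

```python
# Order-of-magnitude cost of the $12,000-per-adult example (a sketch;
# the U.S. adult population figure is an assumption, not from the post).

us_adults = 258_000_000   # roughly the U.S. 18+ population (assumed)
annual_ubi = 12_000       # dollars per adult per year, per the post

total_cost = us_adults * annual_ubi
print(f"Annual cost: ${total_cost / 1e12:.1f} trillion")   # ~$3.1 trillion

# For scale, that is roughly half of current U.S. federal spending,
# which is why offsets (welfare consolidation, new taxes, AI-driven
# growth) dominate the UBI funding debate.
```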


Some observers will recall past expectations about automation or the cost of inputs (“energy too cheap to meter” in the case of nuclear energy). That has been a recurring theme for forecasters and futurists. 


| Era/Technology | Key Expectations & Predictions (at the time) | "Too Cheap to Meter" Parallel for Computing/IT Goods | Underlying Technological Advances/Reasons for Optimism | Realization & Caveats |
| --- | --- | --- | --- | --- |
| 1960s-1970s: Early Mainframes & Minicomputers | Automation of administrative tasks, complex calculations, and inventory management leading to significant efficiency gains. Early visions of "paperless offices." | The idea that computation itself would become so affordable it would be integrated into every aspect of business and decision-making, with the cost of individual calculations becoming negligible. | Moore's Law (1965): Gordon Moore's observation that the number of transistors on an integrated circuit doubles approximately every two years, yielding exponential increases in computing power at decreasing cost. This fundamental principle fueled optimism for decades. | Costs decreased, but the scale of problems being tackled also grew. Early computers were expensive and complex, requiring specialized personnel and significant infrastructure, limiting the "too cheap" aspect for many goods. The focus was on efficiency rather than outright cost elimination for end products. |
| 1980s-1990s: Personal Computers & Early Internet | Democratization of computing, empowering individuals and small businesses. Increased office productivity through word processing, spreadsheets, and desktop publishing. Email and digital communication reduce communication costs. | The cost of information creation, storage, and transmission would become effectively free, allowing near-instantaneous, borderless exchange of data and ideas. The "Information Superhighway" promised a connected world. | Further advances in microprocessors, memory, and storage made components smaller, more powerful, and cheaper, enabling widespread PC adoption. The Internet and World Wide Web made global digital communication possible and increasingly affordable. | Costs plummeted for personal computing and basic internet access, but the "free" framing overlooked the underlying infrastructure, software development, and human capital required to use these tools effectively. The "productivity paradox" emerged in the 1990s: heavy IT investment did not always translate into immediate, measurable economy-wide productivity gains. |
| 2000s-2010s: Broadband, Cloud Computing, Mobile Devices | Ubiquitous connectivity, on-demand computing resources, and access to vast amounts of information from anywhere. Transformation of industries through e-commerce, digital media, and mobile applications. | The expectation that access to immense computing power and storage through the cloud would be a utility-like service, with marginal costs approaching zero. Data itself would become "too cheap to meter" in its accessibility and use. | Cloud computing shifted firms from owning infrastructure to renting it on demand, cutting upfront costs and easing scaling. Smartphones and tablets made computing portable and pervasive. Big data enabled collection and analysis of massive datasets, yielding new insights and efficiencies. | Cloud services significantly reduced the variable cost of computing, but the total cost of complex IT systems, cybersecurity, and data management remained substantial, and growing data volumes brought new costs. The shift was toward "cost-effective on demand" rather than truly "too cheap to meter." |
| 2020s (Current & Near Future): Generative AI | Automation of cognitive tasks, content generation, and sophisticated problem-solving. Significant productivity boosts across knowledge work, design, and scientific research. Potential for AI to optimize production processes in unprecedented ways. | OpenAI CEO Sam Altman has said AI is heading toward being "too cheap to meter," particularly the cost of "intelligence" (AI inferences or "tokens"). Google CEO Sundar Pichai has echoed the sentiment, describing intelligence as becoming, "just like air, too cheap to meter," given the rapid fall in the cost of generating AI outputs. | Breakthroughs in model architecture and training data have enabled highly capable large language models. As models become more efficient and specialized hardware develops, the computational cost per AI operation is falling rapidly. | The marginal cost of a single AI query is indeed plummeting, but developing and training massive AI models remains incredibly expensive, and ethical, societal, and regulatory costs are emerging. The open question is how this "cheap intelligence" can be integrated to reduce the cost of final goods and services. |


In all likelihood, AI will not reduce the net number of jobs as some fear, but it also probably will not improve productivity enough to easily support new UBI policies. It might take some combination of productivity growth, elimination of existing welfare programs, new taxes, or other government funding to compensate for extensive AI-driven changes in an economy’s need for labor.

