Monday, January 26, 2026

Clear AI Productivity? Remember History: It Will Take Time

History is quite useful for many things. For example, when some argue that AI adoption still lags, that observation, even when accurate, ignores the general history of computing technology adoption, which is that it takes longer than most expect. 


Consider a widely discussed MIT study that was also widely misinterpreted. Press reports said the study showed AI was not producing productivity gains at enterprises.


In reality, all the study really shows is that pilot projects have not yet produced productivity gains at the whole-enterprise level. And how could they?


Much has been made of the study’s suggestion that 95 percent of enterprises deploying artificial intelligence are not seeing a return on investment.


There’s just one glaring problem: the report points out that just five percent of those entities have AI in a “production” stage. The rest are still running pilots or limited early deployments.


That significant gap between AI experimentation and successful, large-scale deployment arguably explains most of the sensationalized claim that “only five percent of enterprises” are seeing return on AI investment. 


It would be much more accurate to say that most enterprises have not yet deployed AI at scale, and therefore we cannot yet ascertain its potential impact.


But that is not unusual for any important new computing technology. Adoption at scale takes time. 


Consider the adoption of personal computers, ignoring the early hobbyist phase prior to 1981, which would only lengthen the adoption period. Even so, 10-percent adoption took about four years, while 50-percent adoption took roughly 19 years.


It took at least five years for the visual web to reach 10-percent adoption, and about a decade to reach 50-percent usage. 


For home broadband, using a very conservative definition of “broadband” (anything from perhaps 1.5 Mbps up to perhaps 100 Mbps), it took about seven years to reach half of U.S. homes.


| Technology | Commercial Start (Year) | Time to 10% Adoption | Time to 50% Adoption | The "Lag" Context |
| --- | --- | --- | --- | --- |
| Personal Computer | 1981 (IBM PC launch) | ~4 Years (1985) | ~19 Years (2000) | High Lag. Slowed by high cost ($1,500+), lack of connectivity (pre-internet), and steep learning curve (DOS/early Windows). |
| Internet | 1991 (WWW available) | ~5 Years (1996) | ~10 Years (2001) | Medium Lag. Required physical infrastructure (cables/modems) and ISP subscription growth. "Network effects" accelerated it rapidly in the late 90s. |
| Broadband | ~2000 (Cable/DSL) | ~2 Years (2002) | ~7 Years (2007) | Medium Lag. Replaced dial-up. Dependent on telecom providers upgrading last-mile infrastructure to homes. |
| Smartphone | 2007 (iPhone launch) | ~2 Years (2009) | ~5-6 Years (2012-13) | Low Lag. Piggybacked on existing cellular networks. High replacement rate of mobile phones accelerated hardware turnover. |
| Tablet | 2010 (iPad launch) | ~2 Years (2012) | ~5 Years (2015) | Low Lag. Benefited from the "post-PC" era ecosystem. Familiar interface (iOS/Android) meant zero learning curve for smartphone users. |
| Generative AI | 2022 (ChatGPT launch) | < 1 Year (2023) | ~2-3 Years (Proj. 2025)* | Near-Zero Lag. Instant global distribution via browser/app. "Freemium" models removed cost barriers. Adoption is currently outpacing the smartphone and internet. |


The point is that widespread adoption of any popular and important consumer computing technology does take longer than we generally imagine. 


AI adoption is only at the very early stages. It will take time for workflows to be redesigned, for apps to be created and reworked, and for user behavior to catch up with the new capabilities.


It is unreasonable to expect widespread evidence of productivity benefits so soon after introduction, even if new technologies are now seemingly adopted at a faster rate than prior innovations were.

