Wednesday, August 13, 2025

Computing has Shifted from Work to Life and Now Begins to Augment Life

I think we generally miss something important when pondering how artificial intelligence will shift job functions from repetitive, lower-order tasks to higher-order cognitive tasks, and even displace many cognitive tasks outright, with consequent impact on jobs. 


Across three major computing eras (the personal computer era, roughly 1970s–1990s; the internet era, 1990s–2010s; and the emerging AI era, 2010s–present), computing's pervasiveness has increased steadily.


Where we first used PCs to accelerate routine work tasks ("doing things faster"), we later used the internet to accelerate knowledge acquisition ("learning things faster"), and then to play, shop and travel, while demolishing many geographic barriers.  


The shift was from “computing for work” to “computing for life.”


But AI should be even more pervasive, allowing us to optimize outcomes ("doing things better") and shifting computing from intentional interactions to anticipatory (autonomous) action. So computing shifts from tool to “collaborator.” PCs and software were tools we used. In the AI era, computing will augment and amplify human capabilities. 


To be sure, we might argue that all general-purpose technologies have augmented human senses or capabilities in some way (muscles, sight, hearing, cognitive tasks, speech, transport, staying warm or cool). 


So the movement is something like “work to life to existence.” Sure, we can still ponder what AI means for work, or life. But that likely underplays the impact on a normally esoteric question: what do humans do that is uniquely human? 


AI arguably can automate intermediate cognitive tasks such as basic data analysis, customer service responses and routine decision-making. So yes, AI will reshape work. 


| Cognitive Task | Example Tasks | Current AI Capabilities | Extent of Automation |
| --- | --- | --- | --- |
| Data Processing and Analysis | Data entry, basic statistical analysis, report generation | AI excels at processing large datasets, generating insights, and creating reports (e.g., tools like Power BI, Tableau with AI plugins, or custom ML models). | High: Routine data tasks are fully or near-fully automated. Human oversight needed for validation and complex interpretation. |
| Pattern Recognition | Fraud detection, image classification, trend identification | AI uses machine learning (e.g., neural networks) to identify patterns in financial transactions, medical imaging, or market trends with high accuracy. | High: AI often outperforms humans in speed and scale, but human judgment is required for context or anomalies. |
| Basic Decision-Making | Customer service responses, inventory management, scheduling | AI-powered chatbots (e.g., Zendesk, Intercom) handle routine queries; algorithms optimize schedules or stock levels. | Moderate to High: Routine decisions are automated, but complex or ambiguous cases require human intervention. |
| Content Generation | Writing emails, creating marketing copy, summarizing texts | Generative AI (e.g., GPT models, Jasper) produces coherent text, summaries, or creative content based on prompts. | Moderate: AI generates drafts or suggestions, but human editing is needed for nuance, tone, or originality. |
| Diagnostic Tasks | Medical diagnostics, legal research, technical troubleshooting | AI assists in diagnosing diseases (e.g., IBM Watson, Google Health), analyzing legal documents, or identifying system errors. | Moderate: AI provides accurate recommendations, but final diagnoses or decisions require human expertise. |
| Predictive Modeling | Sales forecasting, risk assessment, customer behavior prediction | AI models (e.g., regression, deep learning) predict outcomes based on historical data with high precision. | High: Predictions are automated, but humans must interpret results and make strategic decisions. |
| Language Translation and Processing | Real-time translation, sentiment analysis, speech-to-text | AI tools (e.g., Google Translate, DeepL) provide near-human-quality translations and analyze sentiment in texts or speech. | High: Routine translations are nearly fully automated; human input needed for cultural nuances or specialized contexts. |
| Routine Problem-Solving | Technical support queries, basic coding, process optimization | AI resolves common IT issues, generates simple code (e.g., GitHub Copilot), or optimizes workflows. | Moderate: AI handles standard cases, but novel or complex problems require human creativity and reasoning. |
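
To make the “Predictive Modeling” row concrete, here is a minimal, hypothetical sketch of the kind of routine forecasting the table describes as largely automated: fitting a simple regression on historical data and producing a prediction. The figures, column meanings, and spend-versus-sales setup are invented for illustration, and the sketch assumes Python with NumPy and scikit-learn available.

```python
# A minimal, hypothetical sketch of routine predictive modeling:
# fit a simple regression on (invented) historical data, then forecast.
import numpy as np
from sklearn.linear_model import LinearRegression

# Invented history: monthly ad spend ($ thousands) and units sold.
ad_spend = np.array([[10], [15], [20], [25], [30]])
units_sold = np.array([120, 160, 205, 250, 290])

model = LinearRegression()
model.fit(ad_spend, units_sold)

# The model automates the forecast; a person still decides whether
# and how to act on it (the "humans must interpret results" part of the table).
forecast = model.predict(np.array([[35]]))
print(f"Forecast for $35k spend: {forecast[0]:.0f} units")
```

The point of the sketch is the division of labor in the table: the mechanical fit-and-predict step is automated, while strategic judgment about the result stays with people.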


But AI will affect not only work, but almost all other elements of human life. In the PC era computing automated and digitized work and personal projects.


In the Internet era computing enabled new forms of creativity, commerce, and community.


In the AI era we’ll see augmented human intelligence, senses, and capabilities.


Also, it is possible that AI will produce its outcomes sooner than PCs and the internet did. 


Where we might argue that PCs produced widespread change over a two- or three-decade period, and the internet arguably produced fundamental changes over roughly two decades, some believe AI will achieve widespread change in as little as a decade. 


The IBM PC, for example, was released in 1981. It wasn’t until about 2000 that half of U.S. households owned a PC. 


In 1983, perhaps 10 percent of U.S. homes owned a PC, and about 14 percent of those homes used a modem to go online, according to Pew Research. At that point, the online world was all-text bulletin boards; the visual browser and the multimedia internet had not yet been invented. 


It was not until 2000 or so that half of U.S. consumers said they used the internet. 


| Year | PC Adoption (%) | Internet Adoption (%) |
| --- | --- | --- |
| 1995 | 36 | 14 |
| 2000 | 51 | 41.5 |
| 2010 | 76.7 | 71 |
| 2016 | 89.3 | 87 |
