
Wednesday, August 13, 2025

Computing has Shifted from Work to Life and Now Begins to Augment Life

I think we generally miss something important when pondering how artificial intelligence will shift job functions from repetitive, lower-order tasks to higher-order cognitive tasks, displace many cognitive tasks outright and, as a consequence, reshape jobs.


Across three major computing eras (the personal computer era, roughly the 1970s to 1990s; the internet era, the 1990s to 2010s; and the AI era, the 2010s to the present), computing's pervasiveness has increased steadily.


Where we first used PCs to accelerate routine work tasks ("doing things faster"), we later used the internet to accelerate knowledge acquisition ("learning things faster") and then to accelerate playing, shopping and traveling, while demolishing many geographic barriers.


The shift was from “computing for work” to “computing for life.”


But AI should be even more pervasive, allowing us to optimize outcomes ("doing things better") and shifting computing from intentional interactions to anticipatory (autonomous) action. So computing shifts from tool to "collaborator." PCs and software were tools we used. In the AI era, computing will augment and amplify human capabilities.


To be sure, we might argue that all general-purpose technologies have augmented human senses or capabilities in some way (muscles, sight, hearing, cognitive tasks, speech, transport, staying warm or cool). 


So the movement is something like "work to life to existence." Sure, we can still ponder what AI means for work, or life. But that likely underplays the impact on what is normally esoteric thinking: what humans do that is uniquely human.


AI arguably can automate intermediate cognitive tasks such as basic data analysis, customer service responses and routine decision-making. So yes, AI will reshape work. 


| Cognitive Task | Example Tasks | Current AI Capabilities | Extent of Automation |
|---|---|---|---|
| Data Processing and Analysis | Data entry, basic statistical analysis, report generation | AI excels at processing large datasets, generating insights, and creating reports (e.g., tools like Power BI, Tableau with AI plugins, or custom ML models). | High: Routine data tasks are fully or near-fully automated. Human oversight needed for validation and complex interpretation. |
| Pattern Recognition | Fraud detection, image classification, trend identification | AI uses machine learning (e.g., neural networks) to identify patterns in financial transactions, medical imaging, or market trends with high accuracy. | High: AI often outperforms humans in speed and scale, but human judgment is required for context or anomalies. |
| Basic Decision-Making | Customer service responses, inventory management, scheduling | AI-powered chatbots (e.g., Zendesk, Intercom) handle routine queries; algorithms optimize schedules or stock levels. | Moderate to High: Routine decisions are automated, but complex or ambiguous cases require human intervention. |
| Content Generation | Writing emails, creating marketing copy, summarizing texts | Generative AI (e.g., GPT models, Jasper) produces coherent text, summaries, or creative content based on prompts. | Moderate: AI generates drafts or suggestions, but human editing is needed for nuance, tone, or originality. |
| Diagnostic Tasks | Medical diagnostics, legal research, technical troubleshooting | AI assists in diagnosing diseases (e.g., IBM Watson, Google Health), analyzing legal documents, or identifying system errors. | Moderate: AI provides accurate recommendations, but final diagnoses or decisions require human expertise. |
| Predictive Modeling | Sales forecasting, risk assessment, customer behavior prediction | AI models (e.g., regression, deep learning) predict outcomes based on historical data with high precision. | High: Predictions are automated, but humans must interpret results and make strategic decisions. |
| Language Translation and Processing | Real-time translation, sentiment analysis, speech-to-text | AI tools (e.g., Google Translate, DeepL) provide near-human-quality translations and analyze sentiment in texts or speech. | High: Routine translations are nearly fully automated; human input needed for cultural nuances or specialized contexts. |
| Routine Problem-Solving | Technical support queries, basic coding, process optimization | AI resolves common IT issues, generates simple code (e.g., GitHub Copilot), or optimizes workflows. | Moderate: AI handles standard cases, but novel or complex problems require human creativity and reasoning. |
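As a concrete illustration of the "Data Processing and Analysis" and "Predictive Modeling" rows above, here is a minimal sketch, assuming scikit-learn and pandas are available. The file and column names ("monthly_sales.csv", "month_index", "units_sold") are invented for the example; the point is how routine a basic forecast has become, and where human judgment re-enters.

```python
# A minimal, illustrative sketch only: the kind of routine forecasting the
# table above describes as largely automatable. The file name and column
# names are hypothetical.
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.read_csv("monthly_sales.csv")      # historical data (hypothetical file)
X = df[["month_index"]]                    # a simple time feature
y = df["units_sold"]                       # the quantity to forecast

model = LinearRegression().fit(X, y)       # the model learns the trend from data

next_month = [[int(df["month_index"].max()) + 1]]
forecast = model.predict(next_month)[0]
print(f"Forecast for next month: {forecast:.0f} units")

# The automated part ends here; as the table notes, humans still interpret
# the forecast and make the strategic decision about what to do with it.
```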


But AI will affect not only work, but almost all other elements of human life. In the PC era computing automated and digitized work and personal projects.


In the Internet era computing enabled new forms of creativity, commerce, and community.


In the AI era we’ll see augmented human intelligence, senses, and capabilities.


Also, it is possible that AI will produce its outcomes sooner than PCs and the internet did.


Where we might argue that PCs produced widespread change over a two- or three-decade period, and the internet arguably produced fundamental changes over a two-decade period, some believe AI will achieve widespread change in as little as a decade.


The IBM PC, for example, was released in 1981. It wasn’t until about 2000 that half of U.S. households owned a PC. 


In 1983, perhaps 10 percent of U.S. homes owned a PC, and about 14 percent of those homes used a modem to go online, according to Pew Research. At that point, going online meant all-text bulletin boards; the visual browser and the multimedia internet had not yet been invented.


It was not until 2000 or so that half of U.S. consumers said they used the internet. 


| Year | PC Adoption (%) | Internet Adoption (%) |
|---|---|---|
| 1995 | 36 | 14 |
| 2000 | 51 | 41.5 |
| 2010 | 76.7 | 71 |
| 2016 | 89.3 | 87 |

Wednesday, June 11, 2025

Why AI Era of Computing is Different

If we can say computing has moved through distinct eras, each with distinct properties, it is not unreasonable to predict that artificial intelligence represents the next era. And though earlier eras were normally defined by hardware, that is less true of more-recent eras, where virtualization is prevalent and the focus is more on applications than hardware.


But AI might shift matters further. 


| Era | Key Feature | Key Technologies |
|---|---|---|
| Mainframe Era (1950s–1970s) | Centralized computing | IBM mainframes |
| Personal Computing Era (1980s–1990s) | Decentralization, personal access | PCs, MS-DOS, Windows |
| Internet Era (1990s–2000s) | Connectivity, information access | Web browsers, search engines |
| Mobile & Cloud Era (2000s–2020s) | Always-on, distributed services | Smartphones, AWS, Google Cloud |


The AI era should feature software “learning” more than “programming.” Where traditional software follows explicit rules, AI models learn from data, discovering patterns without being explicitly programmed.
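A minimal sketch of that distinction, using an invented spam-filtering example and assuming scikit-learn is available: the first function encodes an explicit, hand-written rule, while the second learns an equivalent rule from labeled examples rather than being programmed with it.

```python
# Illustrative contrast, not from the post: explicitly coded rules versus a
# model that learns a similar rule from labeled examples. The tiny spam
# dataset below is invented for the sketch.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

def rule_based_is_spam(message: str) -> bool:
    # Traditional software: a programmer states the rule explicitly.
    lowered = message.lower()
    return "free money" in lowered or "winner" in lowered

# The AI-era alternative: the pattern is discovered from labeled data.
messages = ["free money now", "winner winner", "lunch at noon?", "project update attached"]
labels = [1, 1, 0, 0]  # 1 = spam, 0 = not spam (toy labels)

vectorizer = CountVectorizer()
features = vectorizer.fit_transform(messages)
model = MultinomialNB().fit(features, labels)

def learned_is_spam(message: str) -> bool:
    # No spam rule was ever written down; it was inferred from examples.
    return bool(model.predict(vectorizer.transform([message]))[0])

print(rule_based_is_spam("You are a WINNER"))    # True, because the rule says so
print(learned_is_spam("claim your free money"))  # True, because similar examples were labeled spam
```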


AI systems can generalize from experience and sometimes operate autonomously, as in the case of self-driving cars, recommendation systems or robotic process automation.


Voice assistants, chatbots, and multimodal systems mark a transition to more human-centric interfaces, moving beyond keyboards and GUIs.


AI can be considered a distinct era of computing, not because it introduces new tools, but because it changes the nature of computing itself, from explicitly coded systems to systems that evolve and learn.


Tuesday, November 26, 2024

Will We Break Traditional Computing Era Leadership Paradigm?

What are the odds that the next Google, Meta or Amazon (the big new leaders of new markets) will be one of the leaders of the present market, breaking from historical patterns?


Historically, we can argue that the leaders of each era of computing were different from the leaders of the prior era. The leaders in the mainframe era (1945-1980) included IBM, Honeywell and Burroughs. 


In the succeeding personal computer era, the leaders were Apple, Microsoft and Dell. 


The era that follows the “PC” period is more contested. Some might say it is the internet era. Others might say the mobile or cloud computing eras followed, beginning around 2006 or 2007. It also is possible that mobile and cloud computing are merely evolutions of a single internet era (or whatever other name we eventually agree upon).


In the internet era, we might argue the leaders were Google, Amazon and Meta. Some might argue the internet era largely overlaps, since about 2007, with the mobile era, whose leaders might be said to include  Apple, Google (Android) and Samsung.


The cloud computing era might include Amazon Web Services, Microsoft Azure, Google Cloud. 


And that might suggest a possible outcome, one that reflects our current inability to define the present era (internet, mobile or cloud computing): the leaders of these segments or eras tend to overlap, which might suggest they are phases of a single era.


Some believe the next era will center on artificial intelligence, perhaps led by generative AI frontier models. And one characteristic of the model business is its capital intensity. 


source: Pure AI 


And keep in mind that LLMs are updated, as are operating systems. Creating one version of a model necessarily entails updating that model every year to three years or so.


| Model Family | Company | Update Cycle | Notes |
|---|---|---|---|
| GPT (3, 4) | OpenAI | 1–2 years | GPT-3 was released in 2020, GPT-4 in 2023 |
| PaLM, Gemini | Google | 1–2 years | PaLM released in 2022, Gemini in 2023 |
| BERT | Google | 2–3 years | Initial release in 2018, with subsequent variants |
| LLaMA | Meta | 1–2 years | LLaMA 1 released in 2023, LLaMA 2 in 2023 |
| Claude | Anthropic | 6–12 months | Frequent iterative updates reported |


And note: the leaders include many of the same names we see in the internet, mobile or cloud computing “eras.” OpenAI is the most prominent “new” name, but the others are familiar: Google, Meta, AWS and Microsoft, for example.


source: IoT Analytics 


And that suggests a possibility: the leaders of the generative AI era of computing might well be one or more of the firms said to lead in the internet, mobile or cloud computing eras as well.


That might break a pattern we have seen since the mainframe era. On the other hand, there is some divergence of opinion about which “era” computing now is in. But whether we focus on internet, cloud or mobility-based nomenclature, many of the leaders are the same. 


So though we might not know for some time to come, it is possible that the leaders of the internet era could be the leaders (mostly) of the next era, the exception being OpenAI. 


It might also be worth noting that since the PC arrived, eras have been defined by applications and platforms rather than hardware. But many observers might agree that a single computing era can last 30 to 40 years.


Wednesday, October 16, 2024

What "Killer App" Will Emerge from Generative AI?

 Agents are clearly a lead candidate for the artificial intelligence "killer app." Personalization of your digital experience is one thing; anticipation of your needs is something else. 


With the caveat that it is always possible there is no single and universal “killer app” in any computing era, it still is possible that one could emerge for generative artificial intelligence. 


Certainly, key or lead apps have been important in prior waves of computing development. Sometimes the killer app is clear enough for end users and consumers. At other times it is business or organizational end users and business-to-business use cases that dominate.


As a rule, B2B value was dominant in the mainframe and minicomputer eras. Since then, virtually all killer apps have been identifiable by the consumer apps that surfaced.


But some innovations, such as app stores or cloud computing, arguably were important as platforms and ways of doing things, rather than specific apps. 


| Era/Platform | Killer App(s) | Rationale |
|---|---|---|
| Mainframe Era (1960s–1970s) | COBOL, batch processing | Enabled large-scale business applications such as payroll, banking and insurance systems. |
| Minicomputer Era (1970s–1980s) | VAX/VMS, accounting systems | Brought computing power to smaller organizations, particularly in science, manufacturing and finance. |
| Personal Computer (PC) Era (1980s–1990s) | VisiCalc (spreadsheet) | The first spreadsheet program, which revolutionized business and financial management. |
| PC Era (1990s) | Microsoft Office suite (Word, Excel, etc.) | Dominated office productivity, becoming essential in business, education and home environments. |
| PC Era (1990s) | Internet browsers (Netscape, Internet Explorer) | Opened the gateway to the World Wide Web, fundamentally changing communication and information access. |
| Web 1.0 Era (late 1990s–2000s) | Email (e.g., AOL, Hotmail) | Transformed personal and business communication, enabling near-instant global connectivity. |
| Web 1.0/2.0 Era (early 2000s) | Search engines (Google) | Google’s search engine made finding information on the web faster and more accurate, changing web usability. |
| Mobile Era (2000s) | Text and instant messaging (WhatsApp and others) | Redefined personal communication with quick, accessible messaging on mobile phones. |
| Mobile App Era (late 2000s–2010s) | App stores (Apple App Store, Google Play) | Created an ecosystem where developers could offer mobile apps, enabling smartphone adoption at scale. |
| Mobile App Era (2010s) | Social media apps (Facebook, Instagram) | Changed social interaction, media consumption and online behavior globally. |
| Cloud Computing Era (2010s–present) | AWS, Microsoft Azure, Google Cloud | Enabled scalable, on-demand computing infrastructure, transforming how companies build and deploy services. |
| AI Era (2020s)? | Generative AI (ChatGPT, others) | Revolutionizing content creation and customer service, automating complex cognitive tasks and enabling AI agents. |


We do not yet know which killer apps will emerge in the AI era, but early on, generative AI might be a lead platform. Still, some believe AI agents could emerge as the killer app for generative AI.

