Wednesday, June 11, 2025

Why AI Era of Computing is Different

If computing has moved through distinct eras, each with its own defining characteristics, it is not unreasonable to predict that artificial intelligence represents the next one. And though earlier eras are normally defined by hardware, that is less true of recent eras, where virtualization is prevalent and the focus has shifted from hardware to applications.


But AI might shift matters further. 


Era | Key Feature | Key Technologies
Mainframe Era (1950s–1970s) | Centralized computing | IBM mainframes
Personal Computing Era (1980s–1990s) | Decentralization, personal access | PCs, MS-DOS, Windows
Internet Era (1990s–2000s) | Connectivity, information access | Web browsers, search engines
Mobile & Cloud Era (2000s–2020s) | Always-on, distributed services | Smartphones, AWS, Google Cloud


The AI era should feature software “learning” more than “programming.” Where traditional software follows explicit rules, AI models learn from data, discovering patterns without being explicitly programmed.
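A minimal sketch of that contrast, using scikit-learn's LogisticRegression: the spam-filter scenario, the two features, and the thresholds are illustrative assumptions, not a real system. The point is only that one decision rule is written by hand while the other is induced from labeled examples.

# Hand-coded rules versus a model that learns the same decision from data.
# The scenario and numbers below are illustrative assumptions.

from sklearn.linear_model import LogisticRegression

# Traditional software: the programmer writes the rule explicitly.
def rule_based_spam_filter(num_links: int, num_exclamations: int) -> bool:
    return num_links > 3 or num_exclamations > 5

# AI-era software: the rule is learned from labeled examples.
X = [[0, 0], [1, 1], [5, 7], [6, 9], [0, 2], [4, 8]]  # [links, exclamations]
y = [0, 0, 1, 1, 0, 1]                                 # 0 = not spam, 1 = spam

model = LogisticRegression().fit(X, y)

print(rule_based_spam_filter(5, 7))   # True, because a human coded the threshold
print(model.predict([[5, 7]])[0])     # 1, because the model learned the pattern from data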


AI systems can generalize from experience and sometimes operate autonomously, as in the case of self-driving cars, recommendation systems, or robotic process automation.


Voice assistants, chatbots, and multimodal systems mark a transition to more human-centric interfaces, moving beyond keyboards and GUIs.


AI can be considered a distinct era of computing, not because it introduces new tools, but because it changes the nature of computing itself, from explicitly coded systems to systems that evolve and learn.

