Thursday, January 9, 2025

We Have to Expect 70% of AI Use Cases to "Fail"

A healthy dose of humility typically is good advice for anybody who attempts to forecast the technology future, for the simple reason that we are so often wrong. 


That advice--to be humble about what could happen--might well apply to artificial intelligence, even if there are reasons to argue AI will have more impact than most other computing technologies. 


Consider new consumer electronics products, where success rates (wide adoption) can be as low as five percent. 


| Title | Date | Publisher | Key Conclusions |
|---|---|---|---|
| Why Do Most New Consumer Electronics Products Fail? | July 3, 2024 | Supply Chain Resources Group (SCRG) | Many failures stem from poor supplier selection, manufacturing quality issues, and inadequate oversight of offshore teams. Recommendations include rigorous quality control. |
| 20+ Product Launch Statistics You Should Know in 2024 | 2024 | G2 (learn.g2.com) | Reports a staggering 95% failure rate for newly launched products and highlights that only 20% of products survive longer than two years. |
| The Real Rate of New Product Failure | Various | Engineering Unleashed | Empirical studies from 1945–2004 report a failure rate of 30–49%, depending on the industry. Consumer electronics often face challenges in consistent quality and market fit. |
| Navigating the Pitfalls: Why New Consumer Electronics Often Fail | July 2024 | SCRG | Focuses on manufacturing and supply chain challenges. Suggests steps to overcome these issues, including diversification and on-site audits. |


Also, there often is a tendency to believe that the “best technology” will win, when past experience suggests there are all sorts of important drivers of adoption beyond the “quality” of a technology. Betamax had higher quality than did VHS, but VHS became the industry standard. LaserDisc had higher image quality than did VHS, but lost out to DVD as the industry standard. 


Personal digital assistants never caught on. Virtually all music players other than the Apple iPod failed to gain wide acceptance, and even the iPod lost its value proposition once smartphones provided the same functionality. 


And, so far, 3D TVs and “smart glasses” have failed in the marketplace, though companies keep trying to create the replacement category for the smartphone. So far, smart watches are companions to smartphones, not really replacements, for example. 


| Product | Era of Hype | Description | Reason for Failure |
|---|---|---|---|
| Betamax | 1980s | Sony’s video cassette format, competing with VHS. | Lost to VHS due to shorter recording times and higher costs despite superior quality. |
| LaserDisc | 1980s-1990s | Optical disc format for home video offering higher quality than VHS. | Expensive players, limited movie selection, and competition from DVDs. |
| Apple Newton | 1993-1998 | Early Personal Digital Assistant (PDA) by Apple. | High price, unreliable handwriting recognition, and competition from the Palm Pilot; ultimately, PDAs failed as both a consumer and a business product. |
| Microsoft Zune | 2006-2011 | Microsoft’s portable media player to compete with the iPod. | Late to market, weak ecosystem, and inferior design compared to the iPod. |
| Google Glass | 2013-2015 | Augmented reality smart glasses with voice control and a head-up display. | Privacy concerns, high price, limited functionality, and cultural backlash ("Glassholes"). |
| 3D Televisions | 2010s | Televisions offering a 3D viewing experience with special glasses. | High costs, lack of content, discomfort using glasses, and lack of compelling use cases. |
| Segway Personal Transporter | Early 2000s | Self-balancing personal vehicle, envisioned as a transformative urban transport solution. | High cost, impractical design for cities, and lack of consumer demand. |
| Blockchain for Everything | Late 2010s-2020s | Promised to revolutionize industries like healthcare, supply chains, and voting. | Limited scalability, regulatory issues, and lack of practical applications beyond cryptocurrency. |
| QR Code Payment Systems (Early) | 2000s | Early QR code-based payments hyped for mobile commerce. | Poor user experience, limited smartphone penetration, and lack of retailer adoption in earlier phases. |
| Virtual Reality (Early Attempts) | 1990s | Early VR systems like Nintendo’s Virtual Boy, promising immersive gaming experiences. | Poor graphics, discomfort (headaches/eye strain), and bulky equipment. |
| HD DVD | Mid-2000s | Competed with Blu-ray for the next-gen physical media format. | Lost to Blu-ray due to broader industry support and greater storage capacity. |
| Iridium Satellite Phones | Late 1990s | Global satellite phone network offering coverage in remote areas. | High cost of devices and calls, poor market timing, and competition from cellular networks. |
| MiniDisc | 1990s-2000s | Sony’s digital audio format positioned as a replacement for CDs and cassettes. | Lost to MP3 players due to limited adoption and the rise of digital downloads. |


Already, we might argue that AI has eclipsed the Metaverse, Blockchain, visual search, and even some other promising technologies because it has proven to be more versatile, immediately applicable, and capable of creating tangible benefits across a wider range of industries. 


We might also argue that this is because AI seems to be a general-purpose technology (GPT), not simply a “new technology.” One characteristic of GPTs is that they wind up affecting almost all industries and life in general. 




| Technology | Era of Discovery | Description | Widespread Impact |
|---|---|---|---|
| Fire | ~1.7 million years ago | Controlled use of fire for cooking, warmth, and protection. | Enabled cooking (improving nutrition), extended human activity into colder climates, and spurred toolmaking. |
| The Wheel | ~3500 BCE | Cylindrical device enabling transportation and machinery. | Revolutionized transport (carts, chariots) and mechanical systems (pulleys, gears). |
| Writing | ~3100 BCE | System for recording language using symbols (e.g., cuneiform, hieroglyphs). | Enabled record-keeping, legal systems, cultural preservation, and advanced trade networks. |
| Printing Press | 1440 CE | Device for mass-producing written materials, invented by Johannes Gutenberg. | Accelerated knowledge dissemination, literacy, and cultural movements like the Renaissance and Reformation. |
| Electricity | 18th-19th century | Understanding and harnessing electrical energy. | Foundation for modern lighting, appliances, communication (telegraph, telephone), and industrial machinery. |
| Steam Engine | Late 17th-18th century | Engine using steam to generate mechanical work; early engines by Savery and Newcomen were later greatly improved by James Watt. | Powered the Industrial Revolution, transforming manufacturing, transport (railroads, ships), and urbanization. |
| Radio and Television | Early 20th century | Wireless communication (radio) and mass visual media (television). | Unified global communication, entertainment, advertising, and information dissemination. |
| Personal Computers | 1970s | Small, affordable computing devices for individual use. | Revolutionized workplaces, education, entertainment, and communication, enabling the Information Age. |
| Internet | 1980s-1990s | Global network of interconnected computers enabling communication and data exchange. | Created the digital economy, reshaped communication (email, social media), and democratized information. |
| Smartphones | 2007 onward | Multifunctional mobile devices combining telecommunication, computing, and media. | Redefined daily life, enabling ubiquitous connectivity, mobile commerce, and digital ecosystems. |
| Artificial Intelligence | 21st century | Systems mimicking cognitive functions (e.g., learning, problem-solving). | Transforming industries from healthcare to logistics, raising ethical considerations, and enabling automation. |


The point is that we should expect 70 percent of AI-enabled innovations to fail in some major way. That doesn't mean outright inability to "work," but that the innovation fails to move the needle on value.

That might mean an innovation arguably "helps" in some way, but does not add value greater than the cost to implement it. 

Wednesday, January 8, 2025

AI--Among Other Things--Is Changing the Person-Machine Interface

Person-machine interfaces clearly have changed over the past 60 years, and generative artificial intelligence seems to be preparing the way for another evolution.

Among the changes AI will bring:


To be sure, many of those trends and interface directions are already present in some form. But AI should enable interactions that are “easier” because they are “smarter.” An analogy is the progression from command-line interfaces to graphical user interfaces, then to the browser, and on to touch and voice interactions. 


Generative AI adds a shift to natural language interactions with computing devices and apps. 


This should enable more people to function as coders without needing extensive coding knowledge. So low-code and no-code tools should be possible, allowing non-technical users to customize and build their own features. 
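
As a rough illustration, here is a minimal Python sketch of that "no-code" idea: a plain-language request is handed to a generative model, which returns code the application then loads as a feature. The llm_generate() helper and the top_customers feature are hypothetical placeholders standing in for a real model API, not any particular vendor's interface.

```python
# Minimal sketch: a non-technical user describes a feature in plain language,
# a generative model is asked for the code, and the app loads the result.
# llm_generate() is a hypothetical placeholder for a real model API call.

def llm_generate(prompt: str) -> str:
    """Stand-in for a generative-AI call; returns canned code for this demo."""
    return (
        "def top_customers(rows, n=5):\n"
        "    # sort customer records by total spend, highest first\n"
        "    return sorted(rows, key=lambda r: r['spend'], reverse=True)[:n]\n"
    )

def build_feature(user_request: str):
    """Turn a natural-language request into a callable Python function."""
    source = llm_generate(f"Write a Python function that does: {user_request}")
    namespace = {}
    exec(source, namespace)  # a real system would sandbox and review this code
    return namespace["top_customers"]

if __name__ == "__main__":
    feature = build_feature("list the customers who spent the most")
    sample = [{"name": "A", "spend": 120},
              {"name": "B", "spend": 340},
              {"name": "C", "spend": 95}]
    print(feature(sample, n=2))  # [{'name': 'B', ...}, {'name': 'A', ...}]
```

In practice the generated code would be sandboxed and reviewed before running; the point of the sketch is only that the user's input is a sentence, not code.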


Also, all sorts of formerly arduous tasks (writing code, conducting research) should be automated, essentially creating capabilities that might formerly have required staffs of people and substantial amounts of work. 


One (quite old at this point) example is batch processing. My first coding was done on a time-shared mainframe (you had to sign up for a specific time to use the machine). 


We’d submit our tasks or jobs as a stack or deck of punch cards. Then we’d wait for the results, picking up a printed output of results at some later time. It was cumbersome, time-consuming, error-prone, limited and unfriendly. 


By way of illustration, an IBM System/370 had a central processing unit operating at up to 5 MHz, with 8 MB of memory. It could handle thousands of “floating point” additions per second. 


An iPhone 16 Pro CPU operates at GHz speeds. The iPhone has at least 1000 times more random access memory than the top mainframe models of the 1970s.


The iPhone's Neural Engine can perform 35 trillion operations per second, a capability that didn't exist in 1970s mainframes.


The iPhone achieves this performance in a handheld device, while 1970s mainframes required large rooms and extensive cooling systems. 
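
For a back-of-the-envelope sense of the gap, the ratios implied by the figures above can be computed directly. The System/370 numbers are the ones cited in the text; the iPhone clock (about 4 GHz) and memory (8 GB) values below are assumptions used only to illustrate the comparison.

```python
# Rough orders-of-magnitude comparison, 1970s mainframe vs. a modern smartphone.
# Mainframe figures are those quoted above; the iPhone clock and RAM values are
# assumptions chosen only to illustrate the ratios.

mainframe_clock_hz = 5e6           # ~5 MHz System/370 CPU
mainframe_ram_bytes = 8 * 2**20    # 8 MB of memory

iphone_clock_hz = 4e9              # GHz-class CPU (assumed ~4 GHz)
iphone_ram_bytes = 8 * 2**30       # assumed 8 GB of RAM
neural_engine_ops_per_s = 35e12    # 35 trillion operations per second

print(f"Clock-speed ratio:   {iphone_clock_hz / mainframe_clock_hz:,.0f}x")
print(f"Memory ratio:        {iphone_ram_bytes / mainframe_ram_bytes:,.0f}x")
print(f"Neural Engine ops/s: {neural_engine_ops_per_s:,.0f}")
```

Run as written, this prints a clock-speed ratio of about 800x and a memory ratio of roughly 1,000x, consistent with the "at least 1,000 times more memory" claim above.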


The iPhone also can handle a wide range of tasks, from complex computations to graphics rendering, surpassing the specialized functions of early mainframes, which produced text and numeric output only. Keep in mind that with batch processing there was no visual feedback on any specific job or task, nor any direct interface with the machine (everything was mediated by the computing center staff). 


Since that time we have added the ability to use a terminal or screen directly and give programs instructions on a command line. Then we got graphical user interfaces, which gave us windows, icons, menus, and pointers, as well as the mouse for navigation (before GUIs, we used the "up," "down," "left," and "right" arrow keys). It was far more intuitive. 


More recently, the web browser has become a major interface, as have “touch,” “speech,” and “gestures.”


The point is that user interfaces matter, and that the history of user interfaces is a story of increasing accessibility and intuitiveness. AI will provide the next advance in interfaces.


NBER Study Suggests Limited AI Chatbot Impact on Earnings, Productivity

A study of artificial intelligence chatbot impact on labor markets in Denmark suggests the economic impact is “minimal.” Indeed, the study a...