Where are the expected big winners from AI over the longer term, beyond the immediate winners such as Nvidia in the graphics processing unit market? To put it another way, when will some firm, in some industry (existing or emerging), have an "iPhone moment," when the value of AI crystallizes in a really big way?
In other words, even if most firms and people are eventually expected to profit from using AI, will there be new Googles, Facebooks, and Amazons, and, if so, where will we find them?
Here's the gist of the problem: the ultimate winners, the firms that fundamentally reimagined or created entire industries, were not obvious at the start. In the internet era, many experts thought their ideas were impractical or impossible.
Amazon initially was viewed as just an online bookstore.
When Larry Page and Sergey Brin first developed their search algorithm at Stanford, most people didn't understand how a search engine could become a multi-billion-dollar company, as even the revenue model was unclear.
Netflix was often mocked by traditional media and entertainment companies. Blockbuster (the video rental retailer) had an opportunity to buy Netflix for $50 million in 2000 and declined. Blockbuster is gone; Netflix leads the streaming video market.
When it first launched, Airbnb would have seemed to many a risky concept, especially for hosts renting out space in their own lived-in homes.
The idea of getting into a stranger's personal car as a transportation method might have seemed radical and unsafe when Uber first launched.
The concept of transferring money online seemed dangerous to many when PayPal launched.
The point is that big winners are often hard to discern. And even when a field is considered promising, eventual winners often look just like their competitors, at first.
Right now, most of us seem to agree that infrastructure (GPUs; GPU as a service; AI as a service; transport and data center capacity) is the place where significant gains are obvious.
Beyond that, there is much less certainty.
We might not experience anything “disruptive” or “revolutionary” for some time. Instead, we’ll see small improvements in most things we already do.
And then, at some point, we are likely to experience something really new, even if we cannot envision it yet.
Most of us know “quantum change” from experience: a sudden, significant, and often transformative shift in a system, process, or state. Think of a tea kettle on a heated stove. As the temperature of the water rises, the water remains liquid. But at a certain point, the water changes state and becomes steam.
Or think of water in an ice cube tray, chilling in a freezer. For a long time, the water remains a liquid. But at some definable point, it changes state and becomes a solid.
That is probably how artificial intelligence will unfold: hundreds of evolutionary changes in apps and consumer experiences that finally culminate in a qualitative change.
In the history of computing, that “quantity becomes quality” process has played out in part because new technologies reach a critical mass. Some might say these quantum-style changes result from “tipping points,” where the value of some innovation triggers widespread usage.
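As a purely illustrative sketch (the logistic model and its parameters below are assumptions for the demonstration, not measured adoption data), the familiar S-curve captures this dynamic: adoption creeps along for years, then steepens abruptly around the tipping point.

    import math

    def adoption_share(t, midpoint=10.0, steepness=0.8):
        # Logistic S-curve: fraction of a market that has adopted by time t.
        # "midpoint" is the hypothetical tipping-point year (50 percent adoption);
        # "steepness" controls how abruptly adoption accelerates around it.
        return 1.0 / (1.0 + math.exp(-steepness * (t - midpoint)))

    # A long, quiet build-up, then a sudden surge: quantity becomes quality.
    for year in range(0, 21, 2):
        share = adoption_share(year)
        print(f"year {year:2d}: {share:6.1%} {'#' * int(share * 30)}")

Nothing special happens inside the model at the midpoint; the same incremental process simply crosses a threshold where growth looks, from the outside, like a qualitative change.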
Early PCs in the 1970s and early 1980s were niche products, primarily for hobbyists, academics, and businesses. Not until user-friendly graphical interfaces were available did PCs seem to gain traction.
It might be hard to imagine now, but GUIs, which allow users to interact with devices using visual elements such as icons, buttons, windows, and menus, were a huge advance over command line interfaces. Pointing devices such as a mouse, touchpad, or touch screen are far more intuitive for consumers than CLIs that require users to memorize and type commands.
In the early 1990s, the internet was mostly used by academics and technologists and was a text-based medium. The advent of the World Wide Web, graphical web browsers (such as Netscape Navigator) and commercial internet service providers in the mid-1990s made the internet user-friendly and accessible to the general public.
Likewise, early smartphones and personal digital assistants (BlackBerry, PalmPilot) were primarily tools for business professionals, using keyboard or stylus interfaces and without easy internet access. The Apple iPhone, with a new “touch” interface and full internet access, changed all that.
The point is that AI implementations for mobile and other devices are likely to follow an evolutionary accumulation of features, until possibly one huge interface breakthrough or use case adds so much value that most consumers adopt it.
What is less clear are the tipping point triggers. In the past, a valuable use case sometimes was the driver. In other cases, the intuitive interface seems to have been key. For smartphones, it possibly was a combination: an elegant interface plus multiple functions (internet access in the purse or pocket; camera, watch, and PC replacement; plus voice and texting).
Indeed, it is hard to identify a single “tipping point” feature that made smartphones a mass market product. While no single app universally drove adoption, several categories of apps (social media, messaging, navigation, games, utilities, and productivity) combined with an intuitive user interface, app stores, and full internet access to make the smartphone a mass market product.
Regarding consumer AI integrations across apps and devices, we might see a similar process. AI will be integrated in an evolutionary way across most consumer experiences. But then one particular crystallization event (use case, interface, form factor, or something else) will be the trigger for mass adoption.
For a long time, we’ll be aware of incremental changes in how AI is applied to devices and apps. The changes will be useful but evolutionary.
But, eventually, some crystallization event will occur, producing a qualitative change, as all the various capabilities are combined in some new way.
“AI,” by itself, is not likely to spark a huge qualitative shift in consumer behavior or demand. Instead, a gradual accumulation of changes, including AI, will set the stage for something quite new to emerge.
Users and consumers are unlikely to see disruptive new possibilities for some time, until ecosystems are more fully built out and some unexpected innovation finally creates a tipping point: the “iPhone moment,” a transformative, game-changing event that disrupts an industry or fundamentally alters how people interact with technology, products, or services.
It might be worth noting that such "iPhone moments" often involve combining pre-existing technologies in a novel way. The Tesla Model S, ChatGPT, Netflix, social media, and search might be other examples.
We’ll just have to keep watching.