Retail investors are hearing a great deal about “artificial intelligence winners” these days, and much of the analysis is sound enough. There will be opportunities for firms and industries to benefit from AI growth.
Even if relatively few of us invest at the angel stage or work as venture capitalists, most of us would agree that AI seems a fruitful area for investment, from infrastructure (GPUs; GPU as a service; AI as a service; transport and data center capacity) to software.
Likewise, most of us are, or expect soon to be, users of AI features in our e-commerce; social media; messaging; search; smartphone; PC and entertainment experiences.
Most of those experiences will be quite incremental and evolutionary in terms of benefit. Personalization will become more intensive and precise, for example.
But we might not experience anything “disruptive” or “revolutionary” for some time. Instead, we’ll see small improvements in most things we already do. And then, at some point, we are likely to experience something really new, even if we cannot yet envision it.
Most of us are experientially used to the idea of “quantum change,” a sudden, significant, and often transformative shift in a system, process, or state. Think of a tea kettle on a heated stove. As the temperature rises, the water remains liquid. But at the boiling point, the water changes state and becomes steam.
Or think of water in an ice cube tray, being chilled in a freezer. For a long time, the water remains a liquid. But at the freezing point, it changes state and becomes a solid.
That is probably how artificial intelligence will unfold: hundreds of evolutionary changes in apps and consumer experiences that finally culminate in a qualitative change.
In the history of computing, that “quantity becomes quality” process has played out as new technologies reach critical mass. Some might say these quantum-style changes result from “tipping points,” where the value of some innovation triggers widespread usage.
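To make the critical-mass intuition concrete, here is a minimal sketch of the Bass diffusion model, a standard way of describing how adoption of a new technology crawls for years and then tips into rapid uptake. The parameter values (the market size and the innovation and imitation coefficients) are illustrative assumptions, not estimates for any real product.

```python
# Minimal Bass diffusion sketch: adoption stays slow while few people
# use a product, then accelerates sharply once enough peers have it.
# All parameter values are illustrative assumptions.

M = 100.0   # total addressable market (percent of consumers)
p = 0.01    # innovation coefficient: adoption independent of peers
q = 0.40    # imitation coefficient: adoption driven by existing users

adopters = 1.0  # assume 1% early adopters at the start
for year in range(1, 16):
    # new adoption this period scales with how many already use it
    new = (p + q * adopters / M) * (M - adopters)
    adopters += new
    print(f"year {year:2d}: {adopters:5.1f}% adopted (+{new:.1f})")
```

Run it and adoption inches along for the first few years, then jumps through the middle years before saturating: quantity becoming quality, in adoption-curve form.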
Early PCs in the 1970s and early 1980s were niche products, primarily for hobbyists, academics, and businesses. Not until user-friendly graphical interfaces were available did PCs seem to gain traction.
It might be hard to imagine now, but GUIs, which allow users to interact with devices using visual elements such as icons, buttons, windows, and menus, were a huge advance over command-line interfaces. Pointing devices such as a mouse, touchpad, or touch screen are far more intuitive for consumers than CLIs that require users to memorize and type commands.
In the early 1990s, the internet was mostly used by academics and technologists and was a text-based medium. The advent of the World Wide Web, graphical web browsers (such as Netscape Navigator) and commercial internet service providers in the mid-1990s made the internet user-friendly and accessible to the general public.
Likewise, early smartphones and handhelds (the BlackBerry, the PalmPilot) were primarily tools for business professionals, relying on keyboard interfaces and lacking easy internet access. The Apple iPhone, with its new “touch” interface and full internet access, changed all that.
The point is that AI implementations for mobile and other devices are likely to follow the same pattern: an evolutionary accumulation of features, with possibly one huge interface breakthrough or use case that adds so much value that most consumers will adopt it.
What is less clear are the tipping point triggers. In the past, a valuable use case sometimes was the driver. In other cases, it seems the intuitive interface was key. For smartphones it possibly was a combination: an elegant interface plus multiple functions (internet access in the purse or pocket; camera replacement; watch replacement; PC replacement; plus voice and texting).
In other words, it is hard to identify a single “tipping point” value that made smartphones a mass market product. While no single app universally drove adoption, several categories of apps (social media, messaging, navigation, games, utilities and productivity) combined with an intuitive user interface, app stores and full internet access to produce a mass market product.
Regarding consumer AI integrations across apps and devices, we might see a similar process. AI will be integrated in an evolutionary way across most consumer experiences. But then one particular crystallization event (a use case, an interface, a form factor or something else) will be the trigger for mass adoption.
The lesson is that underlying details of the infrastructure (operating systems, chipsets) do not drive end user adoption. What we tend to see is that some easy-to-use, valuable use case or value proposition suddenly emerges after a long period of gradual improvements.
For a long time, we’ll be aware of incremental changes in how AI is applied to devices and apps. The changes will be useful but evolutionary.
But, eventually, some crystallization event will occur, producing a qualitative change, as all the various capabilities are combined in some new way.
“AI,” by itself, is not likely to spark a huge qualitative shift in consumer behavior or demand. Instead, a gradual accumulation of changes, including AI, will set the stage for something quite new to emerge.
That is not to deny the important changes in the ways we find things, shop, communicate, learn or play. For suppliers, it will matter whether AI displaces some amount of search, shifts retail volumes or reshapes social media personalization.
But users and consumers are unlikely to see disruptive new possibilities for some time, until ecosystems are more fully built out and some unexpected innovation finally creates a tipping point: an “iPhone moment,” a transformative, game-changing event or innovation that disrupts an industry or fundamentally alters how people interact with technology, products or services.
It might be worth noting that such “iPhone moments” often involve combining pre-existing technologies in a novel way. The Tesla Model S, ChatGPT, Netflix, social media and search might be other examples.
We’ll just have to keep watching.