
Thursday, June 1, 2023

AI Will Bring Less Change Near Term Than We Think

There are good reasons why generative AI will get commercial traction faster than AR, VR or XR: cost, ease of use and scalability. 


Broadly speaking, it is far easier and cheaper to create a commercial use case, at scale, with generative AI. 


Generative AI is software-based and can be used with virtually any existing application to add content creation, support or code-writing capabilities. That means the time and cost to deploy--while far from insignificant--can build on existing app use cases and deployed instances. 


Any form of “Metaverse,” AR, VR or XR apps require new specialized hardware, generally are not “mobility enabled” and also require creation of new apps and ecosystems. That takes time and money. 


So generative AI is easier to create and deploy and easier to use. It requires no new hardware; no new behavioral changes; no new applications. It simply adds features to what already exists. 


Since generative AI is essentially a “bolt on” for existing use cases and apps, it can scale quickly. 


Still, some patience will be required: at-scale commercial use cases will develop more slowly than most expect, even if AI scales faster than XR, VR, AR or the metaverse. Interest in the metaverse will return eventually, though.


I learned early in my career making forecasts that it is better to be conservative in the early going. Humans nearly always overestimate the near-term impact of any technology and underestimate the long-term impact. 


“We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run” is one way of stating the principle. So is “We always overestimate the change that will occur in the short term and underestimate the change that will occur in the long term.”


Or, “People overestimate what can be done in one year, and underestimate what can be done in ten.” All three statements capture the wisdom of how significant new technologies create change. 


There is a bit of business wisdom that argues we overestimate what can be done near term, but underestimate the long-term impact of important technologies or trends. The reason is that so many trends follow an S curve, or sigmoid function.


Complex system learning curves are especially likely to be characterized by the sigmoid function, since complex systems require that many different processes, actions, habits, infrastructure and incentives be aligned before an innovation can provide clear benefit. 

source: Rocrastination 


Also, keep in mind that perhaps 70 percent of change efforts fail, the Journal of Change Management has estimated. We might then modify our rules of thumb further, along the lines of “even as 70 percent of innovations fail, we will see less change than we expect in one year and more change than we expect in 10 years.” 


At least in part, technological impact increases over time for reasons of diffusion (what percentage of people use the technology regularly) as well as enculturation (it takes time for people and organizations to figure out how to best use a new technology). 


Impact arguably also increases as the ecosystem grows more powerful, allowing many more things to be done with the core technology.


Sunday, April 16, 2023

We Will Overestimate what Generative AI can Accomplish Near Term

For most people, it seems as though artificial intelligence has suddenly emerged as an idea and set of possibilities. Consider the explosion of interest in large language models or generative AI.


In truth, AI has been gestating for many decades. And forms of AI already are used in consumer applications such as smart speakers, recommendation engines and search functions.


What seems to be happening now is some inflection point in adoption. But the next thing to happen is that people will vastly overestimate the degree of change over the near term, as large language models get adopted, even as they underestimate what will happen longer term.


That is an old--but apt--story.


“Most people overestimate what they can achieve in a year and underestimate what they can achieve in ten years” is a quote whose provenance is unknown, though some attribute it to Stanford computer scientist Roy Amara. Some people call it “Gates’ Law.”


The principle is useful for technology market forecasters, as it dovetails with other patterns, including the S curve of product adoption. Virtually all technology forecasts expect actual adoption to resemble an S curve: slow adoption at first, then rapid adoption by users, and finally market saturation.   


That sigmoid curve describes product life cycles, suggests how business strategy changes depending on where on any single S curve a product happens to be, and has implications for innovation and start-up strategy as well. 


source: Semantic Scholar 


Some say S curves explain overall market development, customer adoption, product usage by individual customers, sales productivity, developer productivity and sometimes investor interest. The model often is used to describe adoption rates of new services and technologies, including the notion of non-linear change rates and inflection points in the adoption of consumer products and technologies.


In mathematics, the S curve is a sigmoid function. Related sigmoid forms include the Gompertz function, which can be used to predict new technology adoption, and the Bass diffusion model.
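As an illustrative sketch of the two curve families named above (function and parameter names here are my own, not from the post), both the logistic and Gompertz functions start near zero, pass through an inflection, and saturate:

```python
import math

def logistic(t, carrying_capacity=1.0, growth_rate=1.0, midpoint=0.0):
    """Classic S curve: slow start, rapid middle, saturation."""
    return carrying_capacity / (1.0 + math.exp(-growth_rate * (t - midpoint)))

def gompertz(t, asymptote=1.0, displacement=2.0, growth_rate=1.0):
    """Gompertz curve: also sigmoid, but asymmetric -- it pulls away
    from zero faster than it approaches saturation."""
    return asymptote * math.exp(-displacement * math.exp(-growth_rate * t))

print(logistic(0.0))    # 0.5 -- the logistic inflection sits at the midpoint
print(gompertz(10.0))   # very close to the asymptote of 1.0
```

The asymmetry is why the Gompertz form is sometimes preferred for adoption forecasting: real markets often saturate more gradually than they take off.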


Another key observation is that some products or technologies can take decades to reach mass adoption.


It also can take decades before a successful innovation actually reaches commercialization. The next big thing will typically have first been talked about roughly 30 years earlier, says technologist Greg Satell. IBM coined the term machine learning in 1959, for example, and machine learning is only now coming into widespread use. 


Many times, reaping the full benefits of a major new technology can take 20 to 30 years. Alexander Fleming discovered penicillin in 1928, but it did not arrive on the market until 1945, nearly 20 years later.


Electricity did not have a measurable impact on the economy until the early 1920s, some 40 years after Edison’s first power plant, it can be argued.


It wasn’t until the late 1990s, roughly 30 years after 1968, that computers had a measurable effect on the U.S. economy, many would note.



source: Wikipedia


The S curve is related to the product life cycle, as well. 


Another key principle is that successive product S curves are the pattern. A firm or an industry has to begin work on the next generation of products while existing products are still near peak levels. 


source: Strategic Thinker


There are other useful predictions one can make when using S curves. Suppliers in new markets often want to know “when” an innovation will “cross the chasm” and be adopted by the mass market. The S curve helps there as well. 


Innovations reach an adoption inflection point at around 10 percent. For those of you familiar with the notion of “crossing the chasm,” the inflection point happens when “early adopters” drive the market. The chasm is crossed at perhaps 15 percent of persons, according to technology theorist Geoffrey Moore.



For most consumer technology products, the chasm gets crossed at about 10 percent household adoption. Professor Geoffrey Moore does not use a household definition, but focuses on individuals. 

source: Medium
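A small sketch of the prediction the chasm framework enables: given an assumed logistic adoption curve, you can solve for when a given adoption share is reached. The parameters below (growth rate, midpoint year) are illustrative assumptions, not figures from the post.

```python
import math

def time_to_reach(share, growth_rate=0.5, midpoint=10.0):
    """Invert a logistic adoption curve,
    share(t) = 1 / (1 + exp(-k * (t - t0))),
    to find the time t at which a given adoption share is reached."""
    return midpoint - math.log(1.0 / share - 1.0) / growth_rate

# With k = 0.5 per year and 50% adoption at year 10:
print(round(time_to_reach(0.10), 1))  # 5.6 -- roughly where the chasm zone begins
print(round(time_to_reach(0.16), 1))  # 6.7 -- near Moore's early-adopter boundary
```

The useful point is that the 10-percent and 15-to-16-percent thresholds arrive within about a year of each other on a typical curve, which is why crossing the chasm and hitting the inflection point feel like the same event in practice.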


And that is why the saying “most people overestimate what they can achieve in a year and underestimate what they can achieve in ten years” is so relevant for technology products. Linear demand is not the pattern. 


One has to assume some form of exponential or non-linear growth. And we tend to underestimate the gestation time required for some innovations, such as machine learning or artificial intelligence. 


Other processes, such as computing power, bandwidth prices or end user bandwidth consumption, are more linear. But the impact of those linear functions also tends to be non-linear. 


Each deployed use case, capability or function creates a greater surface for additional innovations. Futurist Ray Kurzweil called this the law of accelerating returns. Rates of change are not linear because positive feedback loops exist.


source: Ray Kurzweil  


Each innovation leads to further innovations and the cumulative effect is exponential. 
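The compounding logic above can be sketched in a few lines. The "enable rate" here is a made-up illustrative parameter, not anything Kurzweil specifies: if each existing capability enables some fixed fraction of new capabilities per period, growth looks modest in year one and dominant by year ten.

```python
def capabilities_after(periods, start=1.0, enable_rate=0.3):
    """Compound growth: each capability seeds new ones every period."""
    total = start
    for _ in range(periods):
        total += total * enable_rate
    return total

print(round(capabilities_after(1), 2))   # 1.3 -- barely noticeable after one period
print(round(capabilities_after(10), 2))  # 13.79 -- compounding dominates by period ten
```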


Think about ecosystems and network effects. Each new applied innovation becomes a new participant in an ecosystem. And as the number of participants grows, so do the possible interconnections between the discrete nodes.  

source: Linked Stars Blog 
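The combinatorics behind that claim are simple: among n participants there are n(n-1)/2 distinct pairwise interconnections, so possible links grow roughly with the square of the number of nodes (the Metcalfe-style argument). A minimal sketch:

```python
def possible_links(n):
    """Distinct pairwise interconnections among n participants: n * (n - 1) / 2."""
    return n * (n - 1) // 2

for nodes in (2, 5, 10, 100):
    print(nodes, possible_links(nodes))
# 2 -> 1, 5 -> 10, 10 -> 45, 100 -> 4950
```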


Think of that as analogous to the way people can use one particular innovation to create another adjacent innovation. When A exists, then B can be created. When A and B exist, then C and D and E and F are possible, as existing things become the basis for creating yet other new things. 


So we often find that progress is slower than we expect, at first. But later, change seems much faster. And that is because non-linear change is the norm for technology products.


Tuesday, March 7, 2023

"Down and to the Right" Versus "Up and to the Right"

Down and to the right is a reasonable depiction of internet service provider, capacity provider, mobility or telco legacy revenue per unit trends of the past several decades. That applies to wide area network capacity, internet transit prices, voice prices, long distance calling prices, text messaging rates, or mobile network minutes of use or data usage charges. 


That is distinct from the “up and to the right” depiction of the fortunes of growth industries, firms and products. At various points, internet access and home broadband, for example, have been “up and to the right” products. 


Which is another way of noting that business strategy is different for legacy and declining products compared to new and growing products. Think of “S” curve and its strategy implications. 


When a product is late in its life cycle, it is nearly pointless to invest too much, as no matter what one does, the product is destined to be replaced. So firms harvest revenues as long as they can. 


source: Strategic Thinker


The opposite has to be done for the newer replacement products: one has to invest in them. All that raises a question: is the move to try and monetize network functions using application programming interfaces (APIs) a move that helps connectivity providers extend the revenue production of declining products or propel the average revenue per unit of new products? 


To the extent it might represent both, how much does it create value, compared to how much it could destroy value? In other words, can monetizing API access to network features create big new revenue streams faster than it can commoditize the same?


In the former case, slowing the rate of revenue decline for some products might be described as “winning.” In the latter case, accelerating the rate of revenue growth constitutes winning. 


APIs might enable either outcome. If this all feels somehow reminiscent, think of the adoption of TCP/IP by global service providers as the “network of the future.” IP created layers of functions that are connected by APIs. 


So the network of the future necessarily separates application creation and ownership from network transport and access functions. 


Has that helped or hurt? And whom has it helped; whom has it hurt?


AI Will Improve Productivity, But That is Not the Biggest Possible Change

Many would note that the internet impact on content media has been profound, boosting social and online media at the expense of linear form...