Saturday, June 22, 2024

Moore's Law Slowing is Counterbalanced by Other Developments

It is possible to argue that Moore’s Law (which suggests a doubling of transistor density roughly every 18 months to two years) has only slowed, not stopped, since it was first articulated in the mid-1960s. 


Time Period    Doubling Time (Years)
1965-1975      1
1975-2000      2
2000-2010      2-3
2010-Present   3+


On the other hand, consider the transistor densities we now have to “double.” As we push the limits of how closely transistors and interconnects can be packed, the chips become more difficult to manufacture. 


Year   Processor Model     Transistor Count   Doubling Time (Years)
1971   Intel 4004          2,300              -
1974   Intel 8080          6,000              3
1978   Intel 8086          29,000             4
1982   Intel 80286         134,000            4
1985   Intel 80386         275,000            3
1989   Intel 80486         1,200,000          4
1993   Pentium             3,100,000          4
1997   Pentium II          7,500,000          4
1999   Pentium III         9,500,000          2
2000   Pentium 4           42,000,000         1
2006   Core 2 Duo          291,000,000        6
2010   Core i7             1,170,000,000      4
2014   Core i7 (Haswell)   1,400,000,000      4
2018   Core i9             2,000,000,000      4
2022   Apple M1            16,000,000,000     4
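
For readers who want to check the arithmetic, the sketch below (a minimal Python example, using a few of the counts listed above) shows one way to estimate an implied doubling time: years elapsed divided by the number of doublings in transistor count. The figures are the table's own and are approximate, so exact results depend on which entries are compared.

from math import log2

# Illustrative (year, transistor count) pairs taken from the table above.
counts = [
    (1971, 2_300),           # Intel 4004
    (1978, 29_000),          # Intel 8086
    (2000, 42_000_000),      # Pentium 4
    (2010, 1_170_000_000),   # Core i7
    (2022, 16_000_000_000),  # Apple M1 (as listed above)
]

for (y0, c0), (y1, c1) in zip(counts, counts[1:]):
    doublings = log2(c1 / c0)  # number of doublings implied by the count ratio
    print(f"{y0}-{y1}: about {(y1 - y0) / doublings:.1f} years per doubling")

On these figures, the early intervals work out to roughly two years per doubling, while the 2010-2022 interval stretches past three, which is the slowdown the first table describes.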


Similar slowing can be seen in accelerator and graphics processing chips. 


Year   GPU Model          Transistor Count   Performance Improvement   Doubling Time (Years)
2012   Kepler (GK110)     7.1 billion        Baseline                  -
2016   Pascal (GP100)     15.3 billion       2x                        4
2018   Turing (TU102)     18.6 billion       1.2x                      2
2020   Ampere (GA102)     28.3 billion       1.5x                      2
2024   Blackwell (B200)   208 billion        30x                       4


The other issue is that transistor count is not the only variable that matters. Parallel processing, for example, is an architectural shift that prioritizes throughput over raw clock speed.
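
A quick comparison of the GPU figures above illustrates the point. The minimal Python sketch below uses the table's own numbers, reading the performance column as a multiple over the prior listed generation (an assumption), and contrasts it with the growth in transistor count.

# Illustrative figures from the GPU table above; the performance column is
# treated here as a multiple over the prior listed generation (an assumption).
gpus = [
    (2012, 7.1e9,  None),   # Kepler GK110 (baseline)
    (2016, 15.3e9, 2.0),    # Pascal GP100
    (2018, 18.6e9, 1.2),    # Turing TU102
    (2020, 28.3e9, 1.5),    # Ampere GA102
    (2024, 208e9,  30.0),   # Blackwell B200
]

for (y0, t0, _), (y1, t1, perf) in zip(gpus, gpus[1:]):
    print(f"{y0}->{y1}: transistors x{t1 / t0:.1f}, quoted performance x{perf:g}")

The last row is the clearest case: transistor count grows roughly 7x while the quoted gain is 30x, so most of the improvement comes from architecture, packaging and precision choices rather than density alone.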


Accelerator chips are designed for specific tasks such as AI or video processing, and their task-specific metrics arguably matter more than raw clock speed.


Heterogeneous computing combines CPUs, GPUs, and accelerators for optimal performance across different workloads, meaning overall system performance is more relevant than individual component speeds.


AI Monetization? Look at 5G

Buyers of artificial intelligence infrastructure and services might be forgiven their angst about the payback or monetization of those investments. Sellers have few such qualms. 


Roughly the same argument happens around monetization of 5G services: executives complain that they have spent a lot on 5G and have perhaps not seen the financial returns they were expecting, in terms of new or higher revenues. 


That is not a new problem, and our experience with fiber-to-home and 5G provides instructive insight.


For some of us, the debate is an old one. In the mid-1990s, for example, it would not have been hard to find an argument about the payback from fiber-to-home networks, either. In the specific context of new competition between telcos and cable operators for voice, internet access and entertainment revenues, the argument was that FTTH would allow telcos to compete with cable in internet access and video, while cable operators took market share in voice. 


Then, as now, the issue was that the new investments would enable assaults on various markets. Assuming a rough split of new internet access share, telcos expected to take share from cable in video services, while cable operators’ two-way networks took some telco voice share. 


Financial analysts and operating executives might have hoped for higher returns, but essentially the rationale came down to an existential argument: “do you want to remain in business or not?” Without FTTH upgrades, few, if any, telcos could expect to survive against competitors able to supply hundreds of megabits to gigabits per second home broadband speeds. 


That argument applies to 5G investments and clearly will apply to AI investments as well. Though many expect new revenues, use cases, products and services to be possible, the bottom line is that the new investments essentially allow firms to “remain in business.”


“You get to keep your business” might not be highly appealing, in one sense. One would rather be able to claim that investments will produce high financial returns. 


But that is not really the choice. The choice is “keep your business or go out of business.” The new investments in 5G and AI are essentially strategic and existential, not fully driven by traditional “return on investment” criteria. 


All that noted, some segments of each value chain will have an easier time showing results. As always with a new technology, the initial investments are required to enable use of the technology, and that often means infrastructure suppliers are first to benefit. 


If one agrees, as analysts at UBS do, that the artificial intelligence market can be viewed as consisting of three layers (infrastructure, models and applications), then value creation and supplier revenue also come in layers. As generally is the case with software, each layer drives or dictates its own business, revenue and monetization models. 


The most direct monetization will happen at the infrastructure layer, involving direct purchases of hardware, software and capabilities as a service. Nvidia and other makers of graphics processing units and acceleration hardware, as well as server suppliers, are in this category. 


At the model layer, monetization possibilities are mostly direct, in the form of licenses and subscriptions, with some possible indirect monetization for open source models. Subscriptions to use OpenAI, Copilot or Gemini are in this category. 


At the applications layer, monetization will mostly be indirect, in the form of improved existing products and services. UBS estimates that “enabling” layer products and services, including semiconductor production, chip design, cloud and data centers, and companies involved in power supply, will generate at least $185 billion in 2027, with total segment revenues closer to $331 billion. 


Intelligence layer: the companies developing large language models and those that own data assets that can be turned into intelligence.


Application layer: the companies that embed the tools from the intelligence layer into specific use cases. This layer likely offers the largest monetization potential over time, yet the opportunity is difficult to quantify at this early stage. Presently, the UBS report expects a directly addressable market of USD 395 billion in revenue opportunities for the application layer by 2027.


In the 5G markets, one might note a similar trend. The clearest initial winners were the suppliers of 5G network infrastructure, such as Ericsson and Nokia, as well as construction firms and other suppliers. 


Typically, it takes longer for application success to be discovered. 


In that regard, the salient example of direct new 5G revenue is fixed wireless for home broadband. Since about 2022, virtually all net account additions in the U.S. home broadband market have been supplied by fixed wireless platforms. 


source: CTIA 


Other gains attributable to 5G are mostly indirect or hard to quantify, since in most markets where 5G is available, all the leading providers offer it. In some markets the quantities of various spectrum resources might provide an advantage to one or more providers, as in the U.S. market, where T-Mobile’s greater trove of mid-band spectrum arguably has allowed it to take market share from the other leading providers. 


Still, over time, most of the value of 5G or AI, for most applications, use cases and users, is likely to be realized in more-subtle and indirect ways. 


Friday, June 21, 2024

CFOs Say They are Investing to Replace Jobs with Automation and AI

With the caveat that intentions often are not matched by actions or outcomes, a survey of chief financial officers (enterprise and smaller businesses) suggests a majority of CFOs have implemented automation and artificial intelligence to replace workers, and plan to keep at it for the next year. 


source: Federal Reserve Bank of Richmond


Those intentions are mirrored by estimates of AI-caused job losses from research firms and analysts. 


Source                 Timeframe     Estimate/Forecast
Goldman Sachs          Next decade   300 million jobs affected globally
Forrester              By 2030       2.4 million US jobs lost (1.5%)
Forrester              By 2030       11.08 million US jobs influenced (6.9%)
ResumeBuilder          2023          37% of businesses reported AI-induced layoffs
ResumeBuilder          2024          44% of businesses anticipate AI-induced layoffs
World Economic Forum   By 2025       85 million jobs displaced globally
World Economic Forum   By 2025       97 million new roles created globally

No "One Size Fits All" for Generative AI

There is no “one size fits all” generative artificial intelligence strategy. Instead, successful innovations will build on existing supplier...