Sunday, April 16, 2023

We Will Overestimate what Generative AI can Accomplish Near Term

For most people, it seems as though artificial intelligence has suddenly emerged as an idea and set of possibilities. Consider the explosion of interest in large language models or generative AI.


In truth, AI has been gestating for many decades, and forms of AI already are used in consumer appliances such as smart speakers, recommendation engines and search functions.


What seems to be happening now is an inflection point in adoption. But the next thing to happen is that people will vastly overestimate the degree of change over the near term, as large language models get adopted, just as they underestimate what will happen longer term.


That is an old--but apt--story.


“Most people overestimate what they can achieve in a year and underestimate what they can achieve in ten years” is a quote whose provenance is unknown, though some attribute it to Stanford computer scientist Roy Amara. Some people call it “Gates’ Law.”


The principle is useful for technology market forecasters, as it complements other frameworks, including the S curve of product adoption. Virtually all technology forecasts expect actual adoption to resemble an S curve: slow adoption at first, then rapid adoption by users, and finally market saturation.


That sigmoid curve describes product life cycles, suggests how business strategy changes depending on where on any single S curve a product happens to be, and has implications for innovation and start-up strategy as well. 


source: Semantic Scholar 


Some say S curves explain overall market development, customer adoption, product usage by individual customers, sales productivity, developer productivity and sometimes investor interest. The S curve often is used to describe adoption rates of new services and technologies, including non-linear rates of change and inflection points in the adoption of consumer products and technologies.


In mathematics, the S curve is a sigmoid function. The Gompertz function, one such sigmoid, can be used to predict new technology adoption, as can the related Bass diffusion model.
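Both curves are easy to sketch in code. The parameter values below (coefficients of innovation and imitation, market potential) are illustrative assumptions, not estimates for any real market:

```python
import math

def logistic(t, k=1.0, t0=0.0, saturation=1.0):
    """Sigmoid S curve: slow start, rapid middle phase, then saturation."""
    return saturation / (1.0 + math.exp(-k * (t - t0)))

def bass_adoption(p, q, m, periods):
    """Bass diffusion model: cumulative adopters after each period.

    p = coefficient of innovation (external influence)
    q = coefficient of imitation (word of mouth)
    m = total market potential
    """
    cumulative, series = 0.0, []
    for _ in range(periods):
        new_adopters = (p + q * cumulative / m) * (m - cumulative)
        cumulative += new_adopters
        series.append(cumulative)
    return series

# Illustrative run: adoption approaches, but never exceeds, market potential
curve = bass_adoption(p=0.03, q=0.38, m=100.0, periods=30)
```

The output traces the familiar S shape: early periods add few adopters, middle periods add many, and later periods flatten as the market saturates.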


Another key observation is that some products or technologies can take decades to reach mass adoption.


It also can take decades before a successful innovation actually reaches commercialization. The next big thing will have first been talked about roughly 30 years ago, says technologist Greg Satell. Arthur Samuel of IBM coined the term machine learning in 1959, for example, and machine learning is only now seeing widespread commercial use.


Many times, reaping the full benefits of a major new technology can take 20 to 30 years. Alexander Fleming discovered penicillin in 1928, but it did not arrive on the market until 1945, nearly 20 years later.


Electricity did not have a measurable impact on the economy until the early 1920s, it can be argued, some 40 years after Edison’s first power plant opened.


It wasn’t until the late 1990s, about 30 years after 1968, that computers had a measurable effect on the U.S. economy, many would note.



source: Wikipedia


The S curve is related to the product life cycle, as well. 


Another key principle is that successive product S curves are the pattern. A firm or an industry has to begin work on the next generation of products while existing products are still near peak levels. 


source: Strategic Thinker


There are other useful predictions one can make when using S curves. Suppliers in new markets often want to know “when” an innovation will “cross the chasm” and be adopted by the mass market. The S curve helps there as well. 


Innovations reach an adoption inflection point at around 10 percent. For those of you familiar with the notion of “crossing the chasm,” the inflection point happens when “early adopters” drive the market. The chasm is crossed at perhaps 15 percent of persons, according to technology theorist Geoffrey Moore.



For most consumer technology products, the chasm gets crossed at about 10 percent household adoption. Professor Geoffrey Moore does not use a household definition, but focuses on individuals. 

source: Medium


And that is why the saying “most people overestimate what they can achieve in a year and underestimate what they can achieve in ten years” is so relevant for technology products. Linear demand is not the pattern. 


One has to assume some form of exponential or non-linear growth. And we tend to underestimate the gestation time required for some innovations, such as machine learning or artificial intelligence. 


Other processes, such as computing power, bandwidth prices or end user bandwidth consumption, are more linear. But the impact of those linear functions also tends to be non-linear. 


Each deployed use case, capability or function creates a greater surface for additional innovations. Futurist Ray Kurzweil called this the law of accelerating returns. Rates of change are not linear because positive feedback loops exist.


source: Ray Kurzweil  


Each innovation leads to further innovations and the cumulative effect is exponential. 


Think about ecosystems and network effects. Each new applied innovation becomes a new participant in an ecosystem. And as the number of participants grows, so do the possible interconnections between the discrete nodes.  

source: Linked Stars Blog 
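That combinatorial growth is simple to state: among n participants there are n(n-1)/2 possible pairwise connections, so the number of links grows roughly with the square of the number of nodes. A minimal sketch:

```python
def possible_links(n: int) -> int:
    """Distinct pairwise connections among n nodes: n choose 2."""
    return n * (n - 1) // 2
```

Doubling participants from 10 to 20 raises possible links from 45 to 190, more than a fourfold increase.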


Think of that as analogous to the way people can use one particular innovation to create another adjacent innovation. When A exists, then B can be created. When A and B exist, then C and D and E and F are possible, as existing things become the basis for creating yet other new things. 


So we often find that progress is slower than we expect, at first. But later, change seems much faster. And that is because non-linear change is the norm for technology products.


Saturday, April 15, 2023

5G Leaky Bucket Problems

What happens with legacy services is arguably more important, near term, than what happens with new services created by 5G networks. The reasons are obvious: the new services represent smallish revenues while the legacy services represent most of the total revenue.


Small percentage declines in core legacy services have more revenue and profit margin impact than all the new services put together. The image of a hamster running on a wheel might not be appetizing, but that is the situation connectivity providers face.


Or, if you like, a leaky water bucket where new water is poured into the bucket as water continues to leak from holes.


Most connectivity service providers serving well-served and nearly-saturated mass markets would be happy if annual revenue growth chugged along at about a two-percent rate. Service providers in some markets can expect higher growth rates, but the global average will probably be in the two-percent range. 


Given some deterioration in legacy lines of business (negative growth rates), growth rates in one or more new areas might have to happen at higher-than-two-percent rates to maintain an overall growth rate of two percent. 
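The arithmetic is easy to sketch. Assuming, purely for illustration, that legacy lines are 90 percent of revenue and shrink one percent a year, the new lines must grow very fast indeed to keep the total at two percent:

```python
def required_new_growth(legacy_share, legacy_growth, target_growth):
    """Growth rate the new lines must post for total revenue to hit target.

    legacy_share: legacy fraction of current revenue (0..1)
    legacy_growth, target_growth: annual rates as decimals (e.g. -0.01, 0.02)
    """
    new_share = 1.0 - legacy_share
    legacy_after = legacy_share * (1.0 + legacy_growth)
    return ((1.0 + target_growth) - legacy_after) / new_share - 1.0

# Legacy is 90% of revenue and shrinks 1%/year; target is 2% total growth
rate = required_new_growth(legacy_share=0.9, legacy_growth=-0.01, target_growth=0.02)
print(f"{rate:.0%}")  # new lines must grow about 29% in the year
```

The smaller the new-revenue base, the more punishing the required growth rate becomes, which is the leaky-bucket problem in a nutshell.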


And that is the problem for new 5G services in the edge computing, private networks or internet of things areas, for example. The new revenue streams will be small in magnitude, while even a modest decline in a legacy service can--because of the larger size of the existing revenue streams--pose big problems.


Many service providers, for example, expect big opportunities in business services, which underpins hopes for private networks, edge computing and IoT. But revenue magnitudes matter. 


Consumer revenue always drives the bulk of mobile operator service revenues. And revenue growth is the key issue.  


But it will be hard for new 5G services for enterprises and business to move the revenue needle. 


Edge computing possibly can grow to generate a minimum of $1 billion in annual new revenues for some tier-one service providers. The same might be said for service-provider-delivered and -operated private networks, internet of things services or virtual private networks.


But none of those services seem capable of driving the next big wave of revenue growth for connectivity providers, as their total revenue contribution does not seem capable of driving 80 percent of total revenue growth or representing half of the total installed base of revenue. 


In other words, it does not appear that edge computing, IoT, private networks or network slicing can rival the revenue magnitude of voice, texting, video subscriptions, home broadband or mobile subscription revenue. 


It is not clear whether any of those new revenue streams will be as important as MPLS or SD-WAN, dedicated internet access or Ethernet transport services, for example. All of those can be created by enterprises directly, on a do-it-yourself basis, from the network edge. 


source: STL, KBV Research 


In the forecast shown above, for example, “services” includes system integration and consulting, certain to be a bigger revenue opportunity than new sales of connectivity services.


And though it might seem far-fetched, the lead service sold by at least some connectivity providers might not yet have been invented.


At least so far, 5G fixed wireless is the only new 5G service that is meaningful and material as a revenue source for at least some mobile operators. Even if network slicing, edge computing, private networks and sensor network support generate some incremental revenues, the volume of incremental revenue will not be as large as many hope to gain.


It is conceivable that mobile operators globally will make more money providing home broadband using fixed wireless than they will earn from the flashier, trendy new revenue sources such as private networks, edge computing and internet of things. 

source: Ericsson 


Wells Fargo telecom and media analysts Eric Luebchow and Steven Cahall predict fixed wireless access will grow from 7.1 million total subscribers at the end of 2021 to 17.6 million in 2027, growth that largely will come at the expense of cable operators. 

source: Polaris Market Research 

If 5G fixed wireless accounts and revenue grow as fast as some envision, $14 billion to $24 billion in fixed wireless home broadband revenue would be created in 2025. 
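Those forecast numbers imply a steep but computable growth rate. A small sketch, using the Wells Fargo subscriber figures cited above:

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate implied by start and end values."""
    return (end / start) ** (1.0 / years) - 1.0

# 7.1 million FWA subscribers (end of 2021) -> 17.6 million forecast (2027)
fwa_growth = cagr(7.1, 17.6, 2027 - 2021)
print(f"{fwa_growth:.1%}")  # roughly 16% per year
```

A sustained 16-percent annual growth rate would make fixed wireless one of the fastest-growing lines of business most mobile operators have.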


The point is that the actual amount of new revenue mobile service providers can earn from new services sold to enterprises is more limited than many suspect.

Unknown "Homes Passed" Data Hampers Revenue Growth Estimates

Some important types of statistics and data are not collected because governments do not force firms or industries to collect them. Many governments do, for example, think it is important to track data on where home broadband exists, where it does not, how fast it operates, who buys and who does not.


Private firms often have important incentives to track and measure their own revenues, sales, profit margins and growth rates. Financial markets and accounting rules often require measurement of this sort. 


AT&T, for example, reports revenues for mobility, fixed network business revenues and consumer fixed network revenues from internet access, voice and other sources. But those are traditional financial metrics, not operating indices such as penetration or take rates, churn rates and new account gains. 

source: AT&T 


Nobody seemingly believes the same effort should be made to measure the number of home broadband provider locations or dwellings reached by various networks. Better mapping, yes. Metrics on locations passed? No. 


And yet “locations passed” is a basic and essential input to accurately determine take rates (percent of potential customers who actually buy). That input matters quite a lot to observers when evaluating the growth prospects of competitors, even if that figure does not matter much for policymakers, who mainly care about the total degree of home broadband take rates, on an aggregate basis. 


The U.S. Census Bureau, for example, reported some 140.5 million housing units as part of the 2020 census. The estimate for 2021 is 142.2 million units. Assume 1.5 million additional units are added each year, for a 2022 total of about 143.6 million dwelling units.


Assume vacancy rates of about six percent. That implies about 8.6 million unoccupied units that would not be assumed to be candidates for active home broadband subscriptions. The U.S. Census Bureau, though, estimates there are about 11 million unoccupied units when looking at full-time occupied status. That figure presumably includes vacation homes.


Deducting the unoccupied dwellings gives us a potential home broadband buyer base of about 132.6 million locations. 
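A quick check of that arithmetic, with all figures in millions of units (the 1.5 million annual additions are the assumption stated above):

```python
units_2020 = 140.5   # 2020 census count of housing units
units_2021 = 142.2   # Census Bureau estimate
units_2022 = 143.6   # assumes roughly 1.5 million units added per year
unoccupied = 11.0    # Census full-time-occupancy figure, including vacation homes

# Potential home broadband buyer base, in millions of locations
addressable = round(units_2022 - unoccupied, 1)  # 132.6
```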


More difficult is the degree to which access networks operated by any single contestant actually pass those locations, as firms generally do not report such numbers in quarterly financial or annual reports (they do not have to do so). 


And that is where estimations must be made. AT&T’s 2022 10-K report cites 14.2 million customer locations connected. Assume AT&T has about 20 percent take rates for its home broadband services where it operates. That implies a housing unit coverage of about 71 million dwellings. 


Assume AT&T has a higher take rate of about 39 percent where it operates fixed networks. That implies housing coverage of about 36 million dwellings. 
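Those implied-coverage numbers are just subscriber counts divided by assumed take rates. A sketch of the calculation, figures in millions, using the AT&T numbers from the paragraphs above:

```python
def implied_homes_passed(subscribers_m: float, take_rate: float) -> float:
    """Homes passed implied by a subscriber count and an assumed take rate."""
    return subscribers_m / take_rate

# AT&T: 14.2 million connected customer locations (2022 10-K)
low_take_estimate = implied_homes_passed(14.2, 0.20)   # ~71 million homes
high_take_estimate = implied_homes_passed(14.2, 0.39)  # ~36 million homes
```

The wide spread between the two estimates shows how sensitive the homes-passed figure is to the assumed take rate.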


The estimate of 71 million home passings strikes me as too high, but the estimate of 36 million seems too low. In the past I have used the figure of 62 million homes passed for AT&T. 


Assume Verizon has about 10 million home broadband accounts, with a take rate of 40 percent (a bit high, probably, if we include copper access). That implies housing coverage of some 25.3 million dwellings. 


Leichtman Research Group has estimates of home broadband accounts that vary from company reports. LRG estimates that AT&T has some 15.4 million internet access accounts. The variance might come from business accounts not enumerated. 


Verizon’s consumer accounts might be overstated, as LRG estimates Verizon has about 7.5 million home broadband accounts, not 10 million. Using the LRG account figures, we might estimate Verizon home coverage of about 18.8 million homes, on the high side. 


ISPs                           Subscribers at end of 2022    Net Adds in 2022

Cable Companies
Comcast                                32,151,000                 250,000
Charter                                30,433,000                 344,000
Cox*                                    5,560,000                  30,000
Altice                                  4,282,900                (103,300)
Mediacom*                               1,468,000                   5,000
Cable One**                             1,060,400                  14,400
Breezeline**                              693,781                 (22,997)
Total Top Cable                        75,649,081                 517,103

Wireline Phone Companies
AT&T                                   15,386,000                (118,000)
Verizon                                 7,484,000                 119,000
Lumen^                                  3,037,000                (253,000)
Frontier                                2,839,000                  40,000
Windstream*                             1,175,000                  10,300
TDS                                       510,000                  19,700
Consolidated**                            367,458                     724
Total Top Wireline Phone               30,798,458                (181,276)

Fixed Wireless Services
T-Mobile                                2,646,000               2,000,000
Verizon                                 1,452,000               1,171,000
Total Top Fixed Wireless                4,098,000               3,171,000

Total Top Broadband                   110,545,539               3,506,827

source: Leichtman Research Group 


Assume Comcast has 31.2 million accounts, with take rates for home broadband of about 52 percent. That implies something on the order of 60 million households. 


Assume Charter Communications has a take rate of about 45.5 percent where it operates fixed networks. Assume Charter has approximately 30.8 million home broadband accounts. That implies a homes-passed figure of about 67.7 million homes. 


If there are 132.6 million U.S. occupied home locations, then Comcast and Charter can reach about 127.7 million of those locations, or about 96 percent of total, as Comcast and Charter essentially have unduplicated networks, not competing in the same geographies. 
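That 96 percent figure follows directly from the assumed take rates and subscriber counts above:

```python
# Subscriber counts (millions) divided by assumed take rates
comcast_homes = 31.2 / 0.52    # ~60 million homes implied
charter_homes = 30.8 / 0.455   # ~67.7 million homes implied
occupied_homes = 132.6         # potential U.S. home broadband locations

# Combined unduplicated reach as a share of occupied locations
combined_share = (comcast_homes + charter_homes) / occupied_homes
print(round(combined_share, 2))  # about 0.96
```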


That strikes me as unlikely, on the high side. An older rule of thumb is that Comcast and Charter reach about a third of total U.S. locations, each, for a possible reach of up to 66 percent of total U.S. home locations. 


Using different methodologies, I have in the past estimated that Comcast has (can actually sell service to) about 57 million homes passed, while the Charter Communications network passes about 50 million homes, the number of potential customer locations it can sell to.


Verizon homes passed might number 18.6 to 20 million. To be generous, use the 20 million figure. 


AT&T’s fixed network represents perhaps 62 million U.S. homes passed. CenturyLink never reports its homes-passed figures, but likely has 20 million or so consumer locations it can market services to.


Ignoring the variance in potential customer locations passed, AT&T would seem to have the greatest opportunity in the home broadband space, if it can build optical access connections faster, as it has the biggest home footprint and low home broadband market share.


On the other hand, AT&T revenue is driven by mobility, not the consumer fixed network. So the question has to be posed as “how much to invest in the consumer fixed network?” compared to other opportunities. A rational person might argue the answer is “not so much.”


Capital availability--and financial returns--are always the issue. Even if it dramatically escalated fiber-to-home capital investment, it is not clear AT&T would gain as much new revenue, compared to investing in mobility or business services, for example.


The point of the wider exercise is that we are forced to guess about how many homes each of the major fixed network contestants actually can reach. That, in turn, affects our ability to estimate adoption rates and potential growth opportunities. 


The key point is that the estimates are imprecise. Pinning down the “homes passed” figure, essential as the denominator in any calculation of take rates, requires estimations with variable degrees of uncertainty, especially for the larger networks.


Product and Customer "Who, What, Where, When, Why" is Evolving

What you sell matters. How you sell it also often matters. Who you sell it to, and where you sell it, also matters. How much you sell always matters. What it costs you to sell those items also always matters.


The rise of digital infrastructure partly raises the issue of how best to organize the production and sale of internet access and other connectivity products, as cloud computing has changed the way we think about how to procure and supply computing and applications.


To a greater extent than ever, asset owners and analysts evaluate the merits of asset-light or asset-lighter approaches. That changes the answers to the questions of what, who, where, when, why products get sold, as well as how much and how profitably.


Access providers tend to trade at valuation multiples as much as three times lower than those of infrastructure-only providers such as tower companies.


Several aspects seem to account for the disparities. Tower companies sell to all competitors in a market, and therefore are viewed as representing less risk, as the tower companies can theoretically address nearly 100 percent of the market.


No single retail telco or internet service provider ever can claim to acquire as customers more than a fraction of the total market. Additionally, tower companies sell multi-year contracts, often with price escalator clauses to protect against inflation. 


That offers the sort of cash flow predictability that investors value in utility type businesses ranging from electrical and natural gas retailers to airports and toll roads. Also, cell tower assets offer some protection against unrestrained new competition. 


source: Deloitte 


Data center assets also are viewed as having similar characteristics, though perhaps with less moat protection, as, in principle, additional data centers can be built at the same locations. 


Still, there are but a handful of hyperscalers who are potential data center tenants, so there are some moats in that regard. But the total range of enterprise and business tenants is far broader. 


While additional cell towers can be built at similar locations, the number of potential tenants is more limited: often only a handful of mobile operators in any market.


For such reasons, data center assets might show a broader range of valuations, but still be much higher than EBITDA valuations for access providers. 


source: Oliver Wyman

Directv-Dish Merger Fails

Directv’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...