Friday, August 2, 2024

Many Consumers Will Always Buy "Good Enough Value" Home Broadband

Some question the long-term viability of 5G fixed wireless services, arguing that, eventually, they will prove unable to compete with the ever-higher capacities supplied by cabled networks, especially fiber-to-home (FTTH) platforms.


Supporters might make the case that “eventually” is the key phrase, as the market potential for fixed wireless between “today” and “tomorrow” is likely to be quite extended. At the moment, perhaps 51 percent or 52 percent of all U.S. homes or dwelling units have FTTH service available from at least one provider.


By 2030 that percentage might increase to 76 percent to 80 percent. 


At the moment, perhaps 10 percent to 15 percent of U.S. homes have FTTH service available from at least two providers, growing to possibly 30 percent to 40 percent by 2030. 


For starters, FTTH is expensive enough that no single service provider can afford to build new networks ubiquitously, even if the customer demand is present. By some estimates, the cost to pass one urban home might be just $1,000, but the cost to pass suburban locations might range up to $3,200, while rural passings can easily cost $7,000 or more.


Area Type               Density   Estimated Cost per Home/Passing
Metropolitan            High      $1,000
Suburb (Flat Terrain)   Medium    $2,700
Suburb (Hilly Terrain)  Medium    $3,240
Rural (Flat Terrain)    Low       $6,300
Rural (Hilly)           Low       $7,000


And that is construction cost only, not including the cost to activate an account, which can add between $300 and $500 for each install.


An equally important issue is the take rate for such networks. It has been common for a telco launching new FTTH service to reach take rates of up to 40 percent over a few years, with initial uptake often in the 20-percent range. Independent ISPs competing with both cable operators and a telco might expect take rates not exceeding 20 percent (where the cable operator can offer gigabit service and the telco does not offer FTTH).
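The interplay of per-passing cost and take rate can be sketched with a bit of arithmetic: construction cost is recovered only from the homes that actually subscribe. The per-passing figures below come from the estimates above; the 35 percent take rate and $400 install cost are assumed mid-range values for illustration only.

```python
# Sketch: effective network cost per *subscriber*, given cost per passing
# and take rate. Cost figures are the illustrative estimates from the text;
# the take rate and install cost are assumed mid-range values.

COST_PER_PASSING = {      # construction cost per home passed (USD)
    "metropolitan": 1_000,
    "suburban": 3_200,
    "rural": 7_000,
}
INSTALL_COST = 400        # per-connection activation, midpoint of $300-$500
TAKE_RATE = 0.35          # assumed mature telco FTTH take rate

def cost_per_subscriber(area: str, take_rate: float = TAKE_RATE) -> float:
    """Construction cost is spread over the homes that actually subscribe;
    the install cost is incurred only for those subscribers."""
    return COST_PER_PASSING[area] / take_rate + INSTALL_COST

for area in COST_PER_PASSING:
    print(f"{area:>12}: ${cost_per_subscriber(area):,.0f} per subscriber")
```

At a 35 percent take rate, a $7,000 rural passing implies more than $20,000 of network investment per actual rural customer, which is why subsidy debates focus on rural passings.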


So the longer-term issue is how big the market might be for wireless service offering speeds in the lower ranges (100 Mbps to 200 Mbps now; undoubtedly higher speeds in the future), as more fiber access is available. To the extent that fixed wireless is taking market share from cable operators (perhaps even operators able to sell gigabit-per-second connections), we can infer that a substantial portion of the market is happy to pay the prevailing rates for access at such speeds, especially when able to bundle home broadband with their mobile access services. 


When comparing fixed wireless to either cable modem or FTTH service, many consumers might not be especially interested in services operating in the 500-Mbps and faster ranges, much less gigabit ranges, when the slower speeds cost less.


But demand will continue to shift over time, with most consumers eventually buying services operating faster than 200 Mbps, and in many instances much faster than 200 Mbps (gigabit to multi-gigabit ranges, for example). To be sure, fixed wireless providers are likely to find ways to increase their speed tiers as well, beyond 200 Mbps in the future, even if virtually all observers suggest wireless will continue to lag cabled networks in terms of speed. 


Speed Tier Take Rates, in Percentage

Speed Tier              2023    2030    2040
Less than 100 Mbps      20-30   5-10    1-2
100 Mbps to 200 Mbps    30-40   10-20   5-10
Faster than 200 Mbps    30-40   70-80   85-90


Perhaps the best analogy is what cable operators have been able to do with their hybrid fiber coax networks, boosting speeds over time. 


Keep in mind that cable networks and FTTH networks back around 2000 were only offering top speeds in the 10-Mbps range. Fixed wireless networks also will be able to increase speeds over time, if never on the scale of cabled networks. 


Year          Typical Cable Operator Maximum Speed
1996          1.5 Mbps
Early 2000s   10 Mbps
Late 2000s    50 Mbps
2010          100 Mbps
2015          300 Mbps
2016          1 Gbps
2024          2 Gbps


But absolute ability to match cabled network speeds is not the question. The issue is what percentage of customers will, in the future, be willing to buy fixed wireless home broadband, at then-prevailing speeds, prices and offers. 


High-Cost Home Broadband Subsidies Work

Very few major social problems have clear and uncomplicated causal relationships, which makes it virtually impossible to determine whether public policies actually work.


For complex social problems like poverty, housing, crime, education, carbon reduction or traffic, it remains quite difficult to prove causal links between policies and outcomes. Basically, policies are tried without any real way of knowing whether they work. 


Contrast that with a few instances where the primary causation mechanisms are relatively clear. The causal link between smoking tobacco and various health issues like lung cancer, heart disease, and respiratory problems is well-established.


There is a direct causal relationship between alcohol consumption and impaired driving leading to accidents and fatalities.


Conditions such as scurvy (vitamin C deficiency) or rickets (vitamin D deficiency) have clear causation mechanisms related to lack of specific nutrients in the diet. Likewise, the danger of lead exposure, especially in children, is clear.


The overuse and misuse of antibiotics in healthcare and agriculture has a direct causal relationship with the development of antibiotic-resistant bacteria.


Home broadband supply now is likely one problem for which we know at least one causal relationship, namely that financial subsidies work. Since the cost of home broadband infrastructure is directly related to population density, financial subsidies are required in low-density rural areas. 


Urban households tend to have access to better home broadband than rural households, as rural residents might note, policymakers might agree and OpenSignal data shows.


And though there are correlations between income, education and age in any market, “income levels are less predictive of reliability than density,” OpenSignal notes.


It might be noteworthy that although sharing of network infrastructure often is touted as a way of reducing the cost of home broadband infrastructure, OpenSignal studies find there is no correlation between network infrastructure sharing and either high reliability or a narrow digital divide. “Countries with limited infrastructure sharing but targeted subsidies for private rural investment mostly perform better than those relying on widespread infrastructure sharing,” OpenSignal notes.


Topography and density are key factors in the size of the divide between urban and rural home broadband experience. 


Markets with highly-concentrated populations in urban areas show small gaps between urban and rural reliability, and spread-out middle-income countries with difficult terrain show big gaps. “But a few countries with lots of medium-density areas, like the U.S., and Spain, have relatively small digital divides,” the firm says. 


In fact, the U.S. market might better be characterized as having huge low-density areas. Population density has a huge impact on the cost of building new networks, mobile or fixed, but especially fixed networks.


U.S. population density is quite thin across most of its geography, which directly affects the cost of building broadband networks, as hefty subsidies are required to reach the last one percent or two percent of remote locations. 


And the United States has a huge percentage of its land mass that is thinly settled, if at all settled. In Canada, 14 percent of the people live in areas of density between five and 50 people per square kilometer. In Australia, 18 percent of people live in such rural areas.


In the United States, 37 percent of the population lives in rural areas with less than 50 people per square kilometer.


Estimates vary, but by some measures less than two percent of Canadians and four percent of Australians live in such rural areas, while in the United States as much as 48 percent of people do.


Put another way, about six percent of the U.S. land mass is “developed” and relatively highly populated. Those are the areas where it is easiest to build networks. But about 94 percent of the U.S. land surface is unsettled or lightly populated, including mountains, rangeland, cropland and forests. And that is where networks are hardest to build and sustain.


So it should not at all be surprising that “broadband reliability is, on average, 23 percent higher in urban areas than in rural areas across all markets we analyzed,” say analysts at OpenSignal. The firm uses a 100-to-1,000-point scale to measure broadband experience in a typical household where multiple devices are used simultaneously.


The metric is based on the ability to connect (uptime); the ability to complete tasks; and speed, latency and jitter performance.
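As a rough illustration of how such a composite score might be computed, the sketch below combines those three components onto a 100-to-1,000-point scale. The component weights and the normalization are hypothetical assumptions for illustration, not OpenSignal's actual methodology.

```python
# Sketch of a composite reliability score on a 100-to-1,000-point scale,
# in the spirit of the metric described above. The weights below are
# assumed values, not OpenSignal's published methodology.

WEIGHTS = {                  # hypothetical component weights
    "uptime": 0.4,           # ability to connect
    "task_completion": 0.3,  # ability to complete tasks
    "performance": 0.3,      # speed, latency and jitter combined
}

def reliability_score(components: dict) -> float:
    """Map component scores (each 0.0-1.0) onto a 100-1,000 point scale."""
    weighted = sum(WEIGHTS[k] * components[k] for k in WEIGHTS)
    return 100 + 900 * weighted   # 0.0 -> 100 points, 1.0 -> 1,000 points

# Illustrative urban vs. rural households (component values are made up):
urban = reliability_score({"uptime": 0.98, "task_completion": 0.95, "performance": 0.90})
rural = reliability_score({"uptime": 0.90, "task_completion": 0.85, "performance": 0.70})
print(f"urban {urban:.0f}, rural {rural:.0f}, gap {100 * (urban / rural - 1):.0f}%")
```

The point of a weighted composite like this is that a rural household can match urban raw speed yet still score much lower if uptime and task completion lag.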


source: OpenSignal 


Financial subsidies for service providers in rural areas are one way governments try to close digital divides, and arguably the most effective. Whether in the form of support for anchor institutions, per-passing subsidies or per-connection subsidies, direct funding is the clearest way to reduce the cost of rural infrastructure for suppliers.


Beyond that, policymakers often try to encourage competition and promote deployment of alternative platforms (satellite, fixed wireless, mobile access). 


Governments can help communities create cooperatives; reduce permitting and other regulatory costs or train people to use broadband. But perhaps nothing works so well as simple subsidies, for the simple reason that population density and network cost are inversely related. 


High population density leads to lower costs; low density leads to higher costs. So subsidies for home broadband in rural areas are a relatively clear example of cause-and-effect relationships. 


Who Wins AI Arms Race?

Perhaps Meta’s reported ad revenue growth, along with its planned AI investments, will provide some comfort to observers worried about the pace and payoff of such investments. 


Those developments, where the fundamentals of the legacy business are robust, while AI investments are modest enough to be supported by that revenue, arguably represent the best case so far for balancing AI capex and current earnings.


Where Meta now predicts full-year 2024 capital expenditures in the range of $37 billion to $40 billion, updated from its prior range of $35 billion to $40 billion, second-quarter revenues were up 22 percent (23 percent on a constant currency basis), year over year.


“At the end of the day, we are in the fortunate position where the strong results that we're seeing in our core products and business give us the opportunity to make deep investments for the future,” said Mark Zuckerberg, Meta chairman and CEO. 


Which might also illustrate another key principle: so-called “AI stocks” might have to be evaluated based primarily on how their core legacy businesses are situated, irrespective of any future benefit from AI operations and products, since such benefits will not be obvious for some time. 


“While we expect the returns from Generative AI to come in over a longer period of time, we’re mapping these investments against the significant monetization opportunities that we expect to be unlocked,” Zuckerberg noted.


In the particular case of Meta, at least some analysts and observers will be heartened by the apparent recognition on Meta’s part that AI is the more-immediate opportunity, compared to augmented reality, for example. 


“A few years ago, I would have predicted that holographic AR would be possible before Smart AI, but now it looks like those technologies will actually be ready in the opposite order,” said Zuckerberg. 


Commenting on “the AI platform shift,” Satya Nadella, Microsoft chairman and CEO, noted that, similar to the cloud, this transition involves capital-intensive investments.

In other words, investment has to be made. 

As was the case at Alphabet and Meta, Microsoft revenue was up 15 percent, year over year, in the second quarter of 2024.

AI capex also is up at Microsoft, but “roughly half of FY2024's total capital expense as well as half of fourth-quarter expense, it's really on land and build and finance leases, and those things really will be monetized over 15 years and beyond,” said Amy Hood, Microsoft CFO.

Over at Alphabet, second-quarter 2024 revenues were up 14 percent (15 percent in constant currency), year over year. But AI capex is expected to hit $50 billion in 2024.


Market watchers seem to see danger for Alphabet’s search revenue stream as rival AI suppliers seek to cut into Google’s search dominance, beyond the issue of AI capex magnitude. 


“The risk of under-investing is dramatically greater than the risk of over-investing for us here, even in scenarios where it turns out that we are over investing,” said Sundar Pichai, Alphabet CEO.


It also is worth noting that, because of regulatory scrutiny, it no longer is possible for Alphabet to “acquire” positions in markets or capabilities. Instead, it has to grow them organically. 


The implications of Meta’s positioning on AI capex might be about as good as it gets: robust core revenue drivers able to support AI capex. Alphabet’s ad growth was not as good as Meta’s in the second quarter, and beyond that there are the concerns of search market share dangers. 



Tuesday, July 30, 2024

How Much Damage Did the CrowdStrike Outage Do to Fortune 500 Firms?

The CrowdStrike outage of July 2024 had a significant impact across various industries. By some estimates, the total loss to Fortune 500 companies (excluding Microsoft) was between $5.4 billion and $10 billion.

So it might be considered “worth a shot” for Delta to ponder legal action against CrowdStrike.  


Some believe Fortune 500 healthcare firms suffered nearly $2 billion in losses from the outage. The financial sector might argue it lost around $1.15 billion. Others believe the losses (including hard-to-quantify “reputational damage”) across a range of industries could reach the higher $10 billion figure.


Industry                 Estimated Losses (USD Billion)
Healthcare               1.94
Banking & Finance        1.15
Transportation           1.02
Retail                   0.87
Manufacturing            0.78
Government               0.64
Hospitality & Tourism    0.51
Media & Entertainment    0.42
Other Industries         2.68
Total                    10.01


Monday, July 29, 2024

AI Capex Concerns are Legitimate, but Also Unrealistic

Concerns about the payback from AI capital investment by hyperscale cloud computing giants including Alphabet, Microsoft and Amazon already have been an issue for equity investors. The day before the Alphabet earnings call of July 23, 2024, the stock price was $183.60. The day after the call the price was $174.37.


Alphabet lost about $113.28 billion in equity value the day after its July 23, 2024 earnings call, and the total change in equity value for the following week was approximately $145.64 billion.
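Those figures are internally consistent, as a quick back-of-the-envelope check shows: dividing the reported one-day loss in equity value by the per-share price drop implies a share count of roughly 12.3 billion, in line with Alphabet's actual shares outstanding in 2024.

```python
# Back-of-the-envelope check on the equity-value figures cited above:
# a per-share price drop multiplied by shares outstanding equals the
# change in market capitalization. The share count here is *implied*
# from the text's own numbers, not independently sourced.

PRICE_BEFORE = 183.60   # close before the July 23, 2024 earnings call
PRICE_AFTER = 174.37    # close the day after
VALUE_LOST = 113.28e9   # reported one-day loss in equity value (USD)

# Implied shares outstanding consistent with those three numbers:
implied_shares = VALUE_LOST / (PRICE_BEFORE - PRICE_AFTER)
print(f"implied shares outstanding: {implied_shares / 1e9:.2f} billion")
```

A $9.23 per-share drop on roughly 12.3 billion shares is how a single earnings call can erase more than $100 billion of market value.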


Similar damage could occur to other hyperscalers in the “cloud computing as a service” space, if investors do not see material increases in revenue and also hear forecasts of continued high capex. 

source: Reuters, LSEG 


Some of us not in the financial analyst business might find such expectations unreasonable. 


In part, that is because expectations for providers of software services generally anticipate high profit margins and relatively quick payback from capex, compared to providers of other services with a more utility-like character.


Even within the cloud computing business, capex might be expected to break even in two to four years, but not produce a payback for three to five years. In other capital-intensive industries, breakeven periods routinely range from five to 15 years, with payback taking seven to 20 years.


Industry                  Expected Breakeven Period   Expected Payback Period
Software                  1-3 years                   2-4 years
Cloud Computing           2-4 years                   3-5 years
Communications Networks   5-7 years                   7-10 years
Airlines                  7-10 years                  10-15 years
Real Estate               5-10 years                  10-20 years
Utility Industries        10-15 years                 15-20 years
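The distinction between breakeven and payback can be sketched in a few lines: breakeven is taken here as the first year annual cash flow turns positive, while payback is the first year cumulative cash flow recovers the initial capex. The cash flows below are made-up illustrative numbers, not figures from any actual cloud provider.

```python
# Sketch distinguishing "breakeven" (annual cash flow turns positive)
# from "payback" (cumulative cash flow recovers the initial capex),
# using invented cash flows for a hypothetical capital project.

def breakeven_year(annual_cash_flows):
    """First year (1-indexed) in which the annual cash flow is positive."""
    for year, cf in enumerate(annual_cash_flows, start=1):
        if cf > 0:
            return year
    return None

def payback_year(capex, annual_cash_flows):
    """First year (1-indexed) in which cumulative cash flow covers capex."""
    cumulative = 0.0
    for year, cf in enumerate(annual_cash_flows, start=1):
        cumulative += cf
        if cumulative >= capex:
            return year
    return None

# Hypothetical cloud-style project: losses early, revenue ramping later.
capex = 100.0
cash_flows = [-20, -5, 10, 30, 45, 55, 60]

print("breakeven year:", breakeven_year(cash_flows))      # year 3
print("payback year:", payback_year(capex, cash_flows))
```

The gap between the two dates is exactly why the table shows payback periods consistently longer than breakeven periods: a project can be cash-flow positive for years before the original investment is recovered.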


Of course, financial analysts get paid to predict quarterly to annual results. Enterprise CEOs are judged on annual performance. But analysts and researchers often work with longer time frames. 


So firms will be punished for what is seen as “excessive” AI capex. What might not be immediately clear is the strategic impact half a decade to 20 years out. And that is the balance the cloud computing hyperscalers must now strike: investing in a prudent manner now while avoiding the risks of underinvesting. 


If AI winds up becoming a general-purpose technology, investing and adoption laggards might suffer to some degree. The problem is that nobody now knows what levels of investment are “too little” and which might be “too much.” 


Cloud computing provider revenues from customers are going to be the real test. But expectations about the timing and magnitude of financial returns have been unrealistic from the start.


Like it or not, many important capex investments take quite some time to show payback. So expectations of near-term financial gain seem quite unreasonable.

