
Friday, January 16, 2026

Which Future for Neoclouds: Rational Consolidation or Collapse?

Technology market structures tend to change as they age. Small upstart companies get acquired; bigger firms merge; a few dominant leaders emerge, taking a “winner takes most” structure. 


Any market researcher studying any particular capital-intensive market will tend to find that something like a Pareto distribution applies: up to 80 percent of results are produced by 20 percent of actors. Some might call that the rule of three.


Market share structures in computing, connectivity and software tend to be fairly similar: leadership by three firms, corresponding to the rule of three.


“A stable competitive market never has more than three significant competitors, the largest of which has no more than four times the market share of the smallest,” BCG founder Bruce Henderson said in 1976.  


Codified as the rule of three, the observation explains the stable competitive market structure that develops over time in many industries.


Others might call this winner-take-all economics. 


So a logical question is what happens in the high-performance computing market, including the market space for neocloud providers such as CoreWeave, Nebius and others that have changed their business models from bitcoin mining to focus on artificial intelligence model training and inference operations.


Some might argue we are shifting from a focus on training capabilities toward inference operations. It’s hard to argue with that observation, as models become routine apps used by businesses and consumers. 


So some might argue we could see less need for the highest-performance compute capabilities of the sort neocloud providers offer. Others might argue more of the computational load will be handled by edge devices, and there is some truth to that position as well. 


But inference operation ubiquity does not necessarily mean less power consumption, less powerful chips, fewer operations inside massive data center complexes, or less physical real estate and water consumption. 


Although pre-training growth is slowing and compute is shifting from training to inference, the compute demands from post-training scaling, test-time scaling and increased usage suggest the world likely needs many more AI-focused data centers; the ramp from US$300 billion to US$400 billion in 2025 to roughly US$1 trillion in 2028 is directionally realistic, according to one Deloitte estimate. 
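
As a quick sanity check, that ramp implies growth of roughly 40 percent per year. A minimal Python sketch, assuming the midpoint of the 2025 range (the US$350 billion figure and three-year horizon are my framing, not Deloitte’s):

```python
# Implied compound annual growth rate (CAGR) of AI data center spending.
# Assumption: use the midpoint of the US$300B-US$400B range for 2025.
start = 350e9   # assumed 2025 spending (midpoint of Deloitte's range)
end = 1e12      # roughly US$1 trillion in 2028
years = 3       # 2025 -> 2028

cagr = (end / start) ** (1 / years) - 1
print(f"Implied growth rate: {cagr:.0%} per year")  # about 42%
```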


So the future might not include less need for high-performance computing facilities. 


On the other hand, what technology market has not evolved over time to patterns with just a handful of market leaders?


So if the independent neocloud provider market follows the historic pattern, market consolidation will happen: a handful of major, scaled neocloud providers and traditional hyperscalers, plus a long tail of smaller, niche players.


Some argue the process has already begun.


But there are other possibilities as well.  The neocloud provider market might not consolidate but instead collapse.


The "Big Three" hyperscalers possess massive scale, deep financial resources, and comprehensive service portfolios, allowing them to engage in price wars and continuously innovate at a pace the smaller players cannot match. So some would argue this creates immense and unsustainable pressure on the neoclouds' margins and ability to compete effectively in the long term.


Without a genuinely unique value proposition or niche, the independent neocloud providers might struggle to retain customers who often prefer the security and breadth of services offered by the large providers.


The hyperscalers also will be better positioned to handle likely higher regulatory costs, to attract talent and to address the risk aversion of enterprise customers. 


A collapse scenario might happen for at least some providers if customers abandon the neoclouds because of longevity fears. The danger cannot be dismissed. 


That happened around the turn of the century to many would-be capacity providers and competitive local exchange carriers. 


In the late 1990s, driven by the Telecommunications Act of 1996, which opened markets to competition, hundreds of new companies rushed to build wide-area optical fiber networks and local access facilities.  


This resulted in a vast oversupply of "dark fiber" (unused capacity), with estimates suggesting 85 percent to 95 percent of constructed fiber went unused after the bust. 


The industry and investors widely believed demand for bandwidth would grow indefinitely, leading to an investment frenzy based on the mentality of "if you build it, they will come". Actual demand and revenue growth, however, did not keep pace with the rapid network construction, creating an unsustainable business model for many.


CLECs and fiber providers were able to secure massive amounts of funding through debt and speculative equity offerings. When the broader stock market began to decline in 2000, this financing dried up, immediately pushing heavily leveraged companies into bankruptcy.


Hypercompetition and price wars also took a toll: the presence of too many competitors in the same markets led to vicious price wars that drove down bandwidth prices (in some cases, by 60 percent per year), making it difficult for many new entrants to become profitable or even cover their costs.
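
To get a feel for how fast 60-percent annual declines compound, consider this illustrative sketch (the three-year horizon is an assumption for illustration, not from the original reporting):

```python
# Cumulative effect of bandwidth prices falling 60 percent per year.
annual_decline = 0.60
price = 1.0  # normalized starting price

for year in range(1, 4):
    price *= 1 - annual_decline
    print(f"After year {year}: price is {price:.1%} of the original")
# After three years, prices sit at 6.4 percent of where they started,
# collapsing revenue per unit far faster than demand could grow.
```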


In that case, rational merger activity did not drive the consolidation. Instead, the sectors mostly collapsed into bankruptcy. It’s impossible to tell, today, which of these outcomes will develop for neoclouds. Over-investment, over-capacity and inadequate demand have happened with many earlier technologies, including railroads in the nineteenth century and the telecom and internet bubbles of the late 1990s and early 2000s.


Tuesday, September 17, 2024

80/20 Rule for TV Ratings: Why Sports Matter

According to TV ratings firm Nielsen, in 2023, the National Football League accounted for 93 of the year’s 100 most-watched TV shows in the U.S. market.


That’s a good example of the 80/20 rule, formally known as the Pareto principle, which holds that roughly 80 percent of consequences come from 20 percent of causes. 
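
As a rough illustration of how that kind of concentration arises, the sketch below draws audience sizes from a heavy-tailed Pareto distribution and measures the viewing share captured by the top 20 percent of programs. The shape parameter is an assumption chosen to produce the classic 80/20 split, not measured ratings data:

```python
import random

# Simulate per-program audience sizes from a Pareto distribution.
# A shape of ~1.16 yields roughly an 80/20 split (an illustrative
# assumption, not actual Nielsen data).
random.seed(42)
audiences = [random.paretovariate(1.16) for _ in range(10_000)]
audiences.sort(reverse=True)

top_20_percent = audiences[: len(audiences) // 5]
share = sum(top_20_percent) / sum(audiences)
print(f"Top 20% of programs draw {share:.0%} of total viewing")  # roughly 80%
```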


More than a third of U.S. broadcaster NBC's viewing time in 2023, for example, was attributable to NFL games and related programming, according to analysts at MoffettNathanson. CBS was even more dependent, at 40 percent. And at Fox, NFL games and other "shoulder programming" accounted for 63 percent of the time viewers spent with the network.


Beyond the NFL, sports programming drives a disproportionate share of revenue. 


Broadcaster | Top 20% of Programs (Revenue) | Remaining 80% of Programs (Revenue)
ABC | Major sporting events (e.g., NFL, NBA Finals) | Other sports programming, general entertainment
NBC | NFL, Olympics, Premier League | Other sports programming, general entertainment
CBS | NFL, NCAA March Madness, PGA Championship | Other sports programming, general entertainment
FOX | NFL, NASCAR, MLB postseason | Other sports programming, general entertainment


More significantly, network profits from National Football League programming also show a Pareto distribution. In all cases, a relatively small percentage of programming time (five percent to 10 percent) is responsible for a disproportionately large percentage of profits (25 percent to 40 percent).


Network | Program Time % | Profit %
ABC | 5 | 25
NBC | 7 | 30
CBS | 8 | 35
Fox | 10 | 40


And that pattern has been in place for a while. 


source: Sportico 
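
One way to read the table above is as a leverage ratio: profit share divided by program-time share. A minimal sketch, using the figures from the table (the ratio framing is mine):

```python
# Profit share vs. program-time share per network, from the table above.
networks = {"ABC": (5, 25), "NBC": (7, 30), "CBS": (8, 35), "Fox": (10, 40)}

for name, (time_pct, profit_pct) in networks.items():
    print(f"{name}: {time_pct}% of program time -> {profit_pct}% of profit "
          f"({profit_pct / time_pct:.1f}x leverage)")
# Every network earns four to five times more profit share than the
# share of airtime the programming occupies.
```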



That heavy draw of NFL content also suggests potential churn issues for linear or video streaming firms carrying it: namely, the danger of annual churn cycles as the football season ends. 


There are, of course, other sports in other parts of the year, but they lack the drawing power of the NFL. Every couple of years an Olympics telecast draws huge audiences, but that event does not provide an annual or season-long upsurge in viewing. 



Friday, August 18, 2023

Cell Network Physics

Physics plays quite a large role when designing and operating a mobile network, considering the effects of radio signal frequency, reach, data capacity, modulation technique, cell size and cell use patterns. 


Since the time of first-generation analog cellular services, the amount of spectrum devoted to mobile service has both increased and moved higher in frequency, as this chart illustrates. All of us are familiar with at least some elements of higher-frequency spectrum. 


The 800-MHz radio waves used to support 1G networks were pretty good at penetrating walls. By the time we get to 4G, using spectrum up around 2 GHz, signals have limited ability to get through walls. In the 5G era, using resources in the roughly 4-GHz to 6-GHz range, signals are far more directional and can be blocked by something as slight as a plant leaf. 

source: Keysight Technologies 


So all sorts of signal processing techniques have to be used to create multiple paths for those signals, to get around the line-of-sight propagation pattern. 


Another issue is that radio signals at higher frequencies, using power levels common for cell networks, will not travel as far as they do when launched at similar power levels but using lower frequencies. 
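
The standard free-space path loss formula makes the point concrete: at a fixed distance, loss grows 20 dB per decade of frequency. A minimal sketch, with illustrative frequencies standing in for each mobile generation:

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Loss over 1 km at representative frequencies for each mobile generation.
for label, mhz in [("800 MHz (1G)", 800), ("2 GHz (4G)", 2_000),
                   ("3.5 GHz (5G)", 3_500), ("28 GHz (mmWave)", 28_000)]:
    print(f"{label}: {fspl_db(1.0, mhz):.1f} dB")
# 800 MHz loses about 90.5 dB over 1 km; 28 GHz loses about 121.4 dB.
# At the same transmit power, that extra loss shrinks the usable cell radius.
```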


That means smaller cells must be used, and that accounts for the increasing importance of optical fiber distribution networks, a trend that will continue as we continue to move up in frequency for subsequent mobile generations. 


The other beneficial aspect of higher-frequency radio waves, however, is their capacity. Radio signal frequency and capacity are directly related: the higher the frequency, the greater the capacity. 


So both smaller cells and higher frequencies mean the amount of bandwidth a mobile network can supply will increase dramatically. Looking only at frequency, signals in the 24-GHz range have two orders of magnitude more physical capacity than 800-MHz signals. 


source: IEEE 


Modulation schemes add more capacity. Looking only at quadrature amplitude modulation, increasing the number of bits we can encode per symbol increases capacity, all other things (channel size, for example) being equal. 


source: Microwave Link 
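
In round numbers, capacity is symbol rate times bits per symbol, and bits per symbol is the base-2 logarithm of the QAM constellation size. A minimal sketch of that relationship (the 50-Msymbol/s rate is an arbitrary assumption standing in for a fixed channel):

```python
import math

symbol_rate = 50e6  # assumed: 50 Msymbols/s in a fixed-size channel

for qam_order in (4, 16, 64, 256, 1024):
    bits_per_symbol = math.log2(qam_order)   # e.g., 256-QAM -> 8 bits/symbol
    capacity_mbps = symbol_rate * bits_per_symbol / 1e6
    print(f"{qam_order}-QAM: {bits_per_symbol:.0f} bits/symbol, "
          f"{capacity_mbps:.0f} Mbps")
# Moving from 4-QAM to 256-QAM quadruples throughput in the same channel,
# all other things (channel size, noise) being equal.
```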


Smaller cells also help, by intensifying the degree of frequency reuse possible in any geographic area. All other things remaining equal, shrinking cell radius by 50 percent quadruples the total number of cells. 


source: Slideshare 
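
The cell-count arithmetic follows from coverage area: each cell covers roughly pi times the radius squared, so the number of cells needed to blanket a fixed area scales as one over the radius squared. A minimal sketch (the 1,000-square-kilometer service area is an arbitrary assumption):

```python
import math

AREA_KM2 = 1_000.0  # assumed service area, for illustration

def cells_needed(radius_km: float) -> float:
    """Approximate cell count: service area / per-cell coverage area."""
    return AREA_KM2 / (math.pi * radius_km ** 2)

for radius in (2.0, 1.0, 0.5):
    print(f"cell radius {radius} km -> about {cells_needed(radius):.0f} cells")
# Halving the radius quadruples the cell count, and with it the number of
# times the same spectrum can be reused across the service area.
```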


But those are techniques used on mobile networks alone. These days, mobile data traffic also can be offloaded to fixed networks when users connect their mobile devices to Wi-Fi. In the 5G era, perhaps as much as 70 percent of mobile data traffic actually is offloaded to fixed networks in the form of Wi-Fi access. 

source: Spectrum Futures


The other interesting angle is that mobile network traffic shows a Pareto distribution: about 75 percent of total traffic occurs on about 30 percent of cell locations. 

source: Spectrum Matters  


Wednesday, March 22, 2023

Practical Implications of Pareto, Rule of Three, Winner Take All

Any market researcher studying any particular market will tend to find that something like a Pareto distribution applies: up to 80 percent of results are produced by 20 percent of actors. Some might call that the rule of three.


Market share structures in computing, connectivity and software tend to be fairly similar: leadership by three firms, corresponding to the rule of three.


“A stable competitive market never has more than three significant competitors, the largest of which has no more than four times the market share of the smallest,” BCG founder Bruce Henderson said in 1976.  


Codified as the rule of three, the observation explains the stable competitive market structure that develops over time in many industries.


Others might call this winner-take-all economics.  


Consider market shares and installed base in the U.S. home broadband market (including small business accounts). Of a possible total installed base of 122 million locations, 90 percent is held by just 15 companies. 


Just two firms have 52 percent of the installed base of accounts (see the arithmetic sketch after the table). 


Broadband Providers | Subscribers at End of 2022 | Net Adds in 2022
Cable Companies
Comcast | 32,151,000 | 250,000
Charter | 30,433,000 | 344,000
Cox* | 5,560,000 | 30,000
Altice | 4,282,900 | -103,300
Mediacom* | 1,468,000 | 5,000
Cable One** | 1,060,400 | 14,400
Breezeline** | 693,781 | -22,997
Total Top Cable | 75,649,081 | 517,103
Wireline Phone Companies
AT&T | 15,386,000 | -118,000
Verizon | 7,484,000 | 119,000
Lumen^ | 3,037,000 | -253,000
Frontier | 2,839,000 | 40,000
Windstream* | 1,175,000 | 10,300
TDS | 510,000 | 19,700
Consolidated | 367,458 | 724
Total Top Wireline Phone | 30,798,458 | -181,276
Fixed Wireless Services
T-Mobile | 2,646,000 | 2,000,000
Verizon | 1,452,000 | 1,171,000
Total Top Fixed Wireless | 4,098,000 | 3,171,000
Total Top Broadband | 110,545,539 | 3,506,827

source: Leichtman Research Group
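
The two-firm concentration claim is easy to verify from the table. A minimal sketch, using the Comcast and Charter subscriber counts above and the 122 million-location installed base cited in the text:

```python
# Share of the U.S. home broadband installed base held by the top two firms.
comcast, charter = 32_151_000, 30_433_000
installed_base = 122_000_000  # total installed base cited in the text

top_two_share = (comcast + charter) / installed_base
print(f"Top two firms: {top_two_share:.0%} of the installed base")
# About 51 percent, consistent with the roughly 52 percent cited above.
```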




The point is that when tracking market developments, the big broad trends are discernible from understanding the actions, strategies and results of a mere handful of firms. And while the full range of “big company” strategies, opportunities and actions can vary substantially from those of perhaps hundreds to thousands of small firms, the trends that move the needle financially typically can be gleaned from following just a relative handful of firms. 


In other words, the business “laws of motion” are dictated by a relative handful of actors, even in markets with thousands of contestants. 


That might seem unimportant. For market analysts, it is a foundational assumption.


"Lean Back" and "Lean Forward" Differences Might Always Condition VR or Metaverse Adoption

By now, it is hard to argue against the idea that the commercial adoption of “ metaverse ” and “ virtual reality ” for consumer media was in...