Thursday, August 21, 2025

Younger Home Broadband Buyers are Less Loyal, Generate Lower ARPU

Sometimes researchers have to do studies with an expected outcome, even if the new research simply confirms an expected pattern. So it is with consumer home broadband loyalty and average revenue per account.


I don’t think anyone should be at all surprised by the recent findings of a study suggesting younger home broadband buyers spend less and churn more than older customers. In fact, that is the expected pattern for most subscription or repeat-purchase products. 


| Age Group | Loyalty (Likelihood to Stay with One Brand/Service) | Average Spend |
| --- | --- | --- |
| Gen Z (under ~25) | Low loyalty – experimenters, switch easily for price or novelty | Low spend – budget constraints, high trial behavior |
| Young Millennials (25–34) | Moderate loyalty – will stay if product aligns with values (sustainability, brand ethos) | Medium spend – starting careers, increasing discretionary income |
| Older Millennials / Early Gen X (35–49) | Higher loyalty – convenience and habit take hold | Higher spend – peak earning years, subscriptions accumulate |
| Older Gen X / Boomers (50–64) | High loyalty – less brand switching, seek reliability | High spend – stable income, value convenience and trust |
| Seniors (65+) | Very high loyalty – tend to stick with familiar brands | Variable spend – can be high (healthcare, legacy services) or constrained (fixed income) |


Many studies of consumer home broadband churn behavior show the pattern. 


| Study / Source (year) | Geography & Sample | Findings – Churn / Switching by Age | Findings – Spend / Price by Age | Notes |
| --- | --- | --- | --- | --- |
| Ofcom & Choose.co.uk (2019) | UK, broadband customers | Only 9% switched in the last 12 months overall; >50% of customers aged 65+ have never switched (Choose) | Older customers less empowered to negotiate: 62% confident to talk to their provider, vs. 87% of under-65s; older customers more likely to be out of contract and overpaying (Choose) | Clear age gap in switching behavior: older customers stick around and pay more. |
| Broadband.co.uk Switching Study (2025) | UK broadband users, 18–24 vs. older | Only 29% of 18–24s "like my provider" vs. 45% of 65+; 37% of 18–24s had "no particular reason" not to switch, vs. much lower shares for older groups (Broadband Genie) | Indicates younger consumers are less satisfied and more inclined to question loyalty; older consumers show higher contentment. | Suggests younger users are less loyal, older users more settled. |
| Pew Research (US, 2017) | US adults by age group | Millennials (18–24) more likely to use mobile-only internet; 55% of 18–24s use smartphones only, vs. 69% of 55–64s who prefer broadband (THE Journal) | Younger users more mobile-dependent; older users more likely to maintain home broadband subscriptions, and so spend more. | Non-UK data reinforcing that older users rely more on home broadband. |
| Reddit – subscription churn analysis | General subscription users (not broadband-specific) | "Customers between 18–24 y/o have the highest churn. Older age correlates to higher tenure." (Reddit) | Implies older users tend to stay longer (hence likely higher cumulative spend). | While not broadband-specific, supports the general age-churn correlation. |

Earth from 180 Million Miles Away

Picture of Earth and the moon from Psyche spacecraft at 180 million miles distance. Not something we've seen before. 

Wednesday, August 20, 2025

Does Apple Have to "Lead" in Language Models? Maybe Not

Some observers might argue Apple is “behind” in the artificial intelligence chatbot “race,” suggesting this is a problem. It might be, if Apple were really trying to claim leadership of the frontier language model business, and if Apple needed to do so. 


Other observers might note that Apple has not, as a company, been “first” with an innovation very often, instead emphasizing products whose key attribute is ease of use. In other words, the strategy is “make it better,” rather than “be first.” 


And although some of the hyperscalers act as though first mover advantage in general purpose language models does matter for future market leadership, it remains a historical fact that, in the computing industry, first movers (companies that pioneer a new product category or market) rarely retain market share leadership as the market matures. 


A study by Golder and Tellis (2004), analyzing more than 500 brands in 50 product categories (many tech-related), found that first movers had a 47 percent failure rate and an average market share of just 10 percent. 


| Market/Category | First Mover | Key Details | Why They Lost Leadership | Current Leader(s) |
| --- | --- | --- | --- | --- |
| Web Browsers | Netscape (Navigator, 1994) | Launched the first commercial web browser, capturing 75% market share initially. | Bundled competition (e.g., Microsoft's Internet Explorer) and failure to innovate led to decline; share fell to near zero by 2003. | Google (Chrome) |
| Social Networking | Friendster (2002) / MySpace (2003) | Friendster was the first modern social network; MySpace quickly overtook it but peaked at 115M users. | Poor user experience, spam, and slower adaptation to mobile/privacy needs; both faded as the market matured. | Meta (Facebook) |
| Search Engines | WebCrawler / AltaVista (1994–1995) | Early search engines indexing the web; AltaVista handled 500M queries/day at peak. | Inferior algorithms and ad-heavy interfaces; acquired and mismanaged as the market exploded. | Google |
| Hard Drives | IBM (1950s) | Invented the first commercial hard disk drive (RAMAC, 1956). | Failed to scale for PCs/laptops; sold the business to Hitachi in 2002 amid commoditization. | Seagate, Western Digital |
| Spreadsheets | VisiCalc (1979) | First electronic spreadsheet for personal computers. | Limited to the Apple II; outcompeted by multi-platform alternatives like Lotus 1-2-3. | Microsoft (Excel) |
| PDAs / Smartphones | Palm (Pilot, 1996) / IBM (Simon, 1994) | IBM Simon was the first smartphone; Palm popularized PDAs. | Slow to integrate phones/internet; acquired and declined as touchscreens rose. | Apple (iPhone), Samsung/Google (Android) |


The caveat might be that the general-purpose language model business will move so fast that the first mover will create and sustain permanent advantage, to the extent that can happen. 


In the computing and technology businesses, though, moving "early" does seem to offer long-term advantages. The Golder and Tellis study suggests that early market leaders (not necessarily the absolute first movers) had just an eight percent failure rate and a 28 percent average market share. 


And, in some cases, first movers do sustain market share leadership for quite some time. 


| Market/Category | First Mover | Key Details | How They Retained Leadership | Current Status |
| --- | --- | --- | --- | --- |
| Semiconductors (DRAM) | Samsung Electronics (early 1990s) | First to mass-produce 16M DRAM chips (1991); has led in memory tech since 1992. | Fast parallel development, cross-functional teams, and aggressive R&D investment; shaped industry standards. | ~40–50% global DRAM share; leader in a mature market. |
| Microprocessors (x86) | Intel (1971) | Invented the first commercial microprocessor (4004); dominated PC/server CPUs. | Patented architecture, massive scale economies, and ecosystem lock-in (e.g., Windows compatibility). | ~70–80% PC CPU share; still leads despite AMD/ARM competition. |
| Online Retail Platforms | Amazon (1995) | First major online bookstore; expanded to an e-commerce platform. | Built a logistics network, customer data advantages, and the AWS cloud; preempted rivals on scale. | ~40% U.S. e-commerce share; dominant in matured digital retail. |


In markets characterized by rapid innovation and low barriers to entry, followers can learn from pioneers' mistakes, improve offerings, and capture share. But the general-purpose language model business is quite capital intensive, creating significant entry barriers. 


Still, the Golder and Tellis study might suggest fairly high odds that early movers in the language model space will be among the mature market leaders. 


For Apple, the issue is whether it needs to be a leader in that market. For some of the leading contenders, their core revenue streams come from advertising, commerce, software or hosted computing services that might well be disrupted or enhanced if the firms also lead in language model share. 


Apple’s business is centered on devices. It might not need to lead in language models. It might only need to incorporate such features in its core products.


Tuesday, August 19, 2025

How Much Can AI Boost Economic Growth?

As always, assumptions are crucial when attempting to assess the impact artificial intelligence might have on economic growth.


According to electrical energy industry estimates, which we can probably assume are on the high side, the electric power industry underpins about five percent of U.S. gross domestic product. That might not be so helpful when it comes to understanding the potential contribution to boosting economic growth.


The largest GDP growth rate differences attributable to electrification occurred during the 1950s to 1960s, as manufacturing and industry were electricity intensive. During this period the U.S. saw high GDP growth (annual rates often four percent to five percent).


That doesn’t mean electricity accounted for most of that growth, but it did underpin that growth. 


The widespread adoption of computers perhaps contributed 0.1 to 0.4 percentage points to U.S. GDP growth per year during their major expansion phase (late 1980s to 1990s), according to the Bureau of Economic Analysis.


Generative AI forecasts estimate a 0.4-percentage-point annual boost to U.S. GDP growth over the coming decade as AI adoption spreads, according to a study.

Estimated GDP Boost from Electricity, Computing, and AI GPTs

| Technology | Estimated Percentage GDP Boost |
| --- | --- |
| Electricity | ~5% of total U.S. GDP |
| Computing | 0.1–0.4 percentage points per year to GDP growth |
| AI (Generative/Advanced) | 0.4 percentage points per year to GDP growth (projected) |
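To put the projected AI figure in perspective, a quick compounding sketch shows what an extra 0.4 percentage points per year adds up to over a decade. The 2.0 percent baseline growth rate below is an illustrative assumption, not a figure from the cited studies.

```python
# Illustrative arithmetic: a decade of GDP growth with and without a
# 0.4-percentage-point annual AI boost. The 2.0% baseline is assumed.
baseline_growth = 0.020   # assumed annual baseline GDP growth
ai_boost = 0.004          # 0.4 percentage points per year (projected)
years = 10

gdp_baseline = (1 + baseline_growth) ** years
gdp_with_ai = (1 + baseline_growth + ai_boost) ** years

extra_output = (gdp_with_ai / gdp_baseline - 1) * 100
print(f"GDP after {years} years, baseline: {gdp_baseline:.3f}x")
print(f"GDP after {years} years, with AI:  {gdp_with_ai:.3f}x")
print(f"Additional output from the AI boost: {extra_output:.1f}%")
```

Under these assumptions, the boost compounds to roughly four percent of additional output after ten years: meaningful, but smaller than the level-shift figures often quoted for electricity.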

Electrical Infrastructure Upgrades for AI are Going into the Consumer Rate Base

It has been a couple of decades since the “rate base” was a key driver of revenues for U.S. telcos, in large part because the services affected by the rate base have declined so much (voice services). 


But the rate base is going to continue to affect consumer electricity prices for the foreseeable future, as additional power generation and transmission capacity is built to support higher demand for power to support data centers and artificial intelligence operations. 


In some states, such as Virginia, rates could rise substantially, by as much as 70 percent from current levels. California and Texas are other states where price hikes could be steep.


But those sorts of shocks are virtually certain to raise calls for reform of the rate base rules, as it is going to be said, with good reason, that consumers are subsidizing the operations of data center owners and operators. 


But such reform has been a rare approach. Past rate base reforms have rarely targeted large customer-driven infrastructure costs directly when allocating costs among customer classes. 


Traditionally, the costs of new infrastructure investments (generation, transmission, or distribution) have been spread across all ratepayers through general rates, regardless of which customer category caused the need for the investment.


Many observers and electricity customers will not be aware of such precedents, and are certain to be shocked by the growing cost of electricity. 


The rate base is the total value of a utility’s assets (power plants, transmission lines, and distribution infrastructure) used to provide service to customers, and forms the basis for setting the rates that utilities can charge their customers.


So the basic formula for setting consumer prices is the rate base times the allowed rate of return, plus operating expenses, which as a practical matter builds in a return for investors. 


Data centers matter because the common costs of new generation capacity and transmission can be charged, and will be charged, to all customers. 


If data centers currently account for four percent to five percent of U.S. electricity consumption, but grow to 12 percent by 2028, and current rate base rules do not change, that cost will be borne by all ratepayers. 
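The revenue-requirement mechanics described above can be sketched in a few lines. All of the dollar figures and percentages below are hypothetical, chosen only to show how growth in the rate base flows through to rates paid by all customer classes.

```python
# Minimal sketch of the traditional utility revenue-requirement formula:
#   revenue requirement = rate base * allowed rate of return + operating expenses
# All numbers are hypothetical, for illustration only.
rate_base = 10_000_000_000          # utility asset value, $
allowed_return = 0.095              # allowed rate of return (assumed 9.5%)
operating_expenses = 2_000_000_000  # annual operating expenses, $

revenue_requirement = rate_base * allowed_return + operating_expenses

# If data-center-driven generation and transmission push the rate base up
# 30%, and costs are still spread across all ratepayers, every customer
# class pays a share of the increase.
new_rate_base = rate_base * 1.30
new_requirement = new_rate_base * allowed_return + operating_expenses

increase_pct = (new_requirement / revenue_requirement - 1) * 100
print(f"Revenue requirement: ${revenue_requirement:,.0f}")
print(f"After 30% rate-base growth: ${new_requirement:,.0f} (+{increase_pct:.1f}%)")
```

The point of the sketch is structural: because the return is earned on the whole rate base, capacity built for one customer class raises the revenue requirement recovered from all of them unless cost-allocation rules change.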


Sometimes "Protecting Legacy Industries" is What Citizens Want

Economic disruption always is a tough problem for regulators and policymakers who must balance "protecting" important legacy industries while still encouraging or at least tolerating the emergence of new industries that pose the threat of attacking those important legacy interests.


And, as a growing backlash against lodging sharing shows, consumers might themselves constitute a new political force hampering unrestrained lodging sharing operations.


The point is that "protecting legacy" industries sometimes accords with citizen and voter preferences, while promoting innovation might actually be opposed by those voters.


A European Union document on the “collaborative economy” that underpins ride sharing and room sharing shows an effort to allow innovation that might harm the interests of established economic interests, at least in principle.


“The success of collaborative platforms are at times challenging for existing market operators and practices,” the EU policy document says, acknowledging the potential for economic damage to existing businesses. 


At the same time, there is growing resistance to sharing platforms in the housing market, as the platforms are viewed as restricting the availability of housing for local residents, as well as driving up prices. So in this case, the interests of innovating suppliers might run counter to the interests of residents and voters, as well as counter to the challenged legacy providers.


Significantly, the suggested EU framework calls for what some would call a relative “light touch” to regulations. The sharing platforms obviously would prefer that. It is not so clear citizens will be so supportive.


"Light touch regulation" that allows innovators to grow their businesses has not always been the approach in Europe to new developments in the economy that are potentially disruptive. 


All too often, regulators have applied legacy rules to new technologies and business models that have the effect of protecting incumbents and harming challengers. 


Application of legacy common carrier rules to over-the-top voice or messaging services provides an example. 


But technology-led innovation sometimes is hard to stop. 

“The collaborative economy is part of the digital economy but also overlaps with other economic sectors, mainly those providing services,” a supporting document says. The point is that the new businesses are significant enough in potential size that banning the new business models is deemed unwise.


Collaborative platforms operating in five key sectors of the collaborative economy generated revenues of EUR 3.6 billion in 2015 in the EU, in terms of gross revenues flowing to providers and platforms, the EU says. It is estimated that collaborative platforms facilitated EUR 28 billion of transactions in 2015 in the EU.


The largest collaborative economy sector by revenue is the peer-to-peer transportation sector, which includes ridesharing and carsharing. 


The peer-to-peer accommodation sector is the largest on the basis of commerce generated. 



Whether the proposals will be adopted is the issue. Indeed, property owners in many localities already face political pressure from residents who argue that lodging sharing directly reduces resident access to affordable housing. 

Monday, August 18, 2025

Content Industries will be Disrupted Early by AI

A new report produced by the Massachusetts Institute of Technology’s Project NANDA will probably be known for its estimates of AI project failures, in particular the claim that “95 percent of organizations are getting zero return.” 


To be sure, such evaluations must be seen in context. Tools such as ChatGPT and Copilot have been explored or piloted by 80 percent of organizations, the report notes. By definition, such use cases are not explicitly intended to affect profit and loss performance in a direct sense.


Of the 60 percent of entities that have evaluated such tools, about 20 percent have reached pilot stage and five percent are in production.


One might therefore note that since adoption remains low and experimental, actual bottom line results also are hard to quantify. 


I would not make too much of such observations, at this point. First of all, up to 70 percent of all information technology projects seem to “fail,” including inability to reach predicted outcomes. That is par for the course. 


But most of these AI efforts have not yet even reached deployment stage, so outcomes cannot yet be evaluated. 


Aside from that, the one tidbit in the report that did stand out for me was the degree of disruption we already are seeing across industries, even given the low state of significant deployment. 


As was the case for the internet, content industries already seem to be among those seeing early disruption. 


 source: MIT NANDA


Scarcity, and threats to scarcity, were likely the main reason content industries were so disrupted by the internet, but the economics of digital content distribution also played a key role. 


Content industries arguably produced tangible products in the past. The information (words, images, audio, video) was still distributed in forms that were physical (places like movie theaters, devices such as TVs and radios, media such as videotapes, audio tapes and DVDs).


But digitized content has a marginal cost of reproduction near zero. What once was scarce and physical became abundant and virtual. Distribution channels multiplied, reducing gatekeeper power. For newspapers, the shift of classified advertising to Craigslist, Google, and Facebook destroyed newspaper revenue.


The same happened to the “bundled” product we once referred to as a newspaper, magazine, album or channel. 


Albums became single tracks (iTunes, Spotify), newspapers and magazines became articles (Google News, social media), channels and networks became videos. 


So disruption happened because the economics of scarcity collapsed once digital distribution made content abundant, free, and shareable.


AI might have similar effects: reducing scarcity; multiplying distribution options; reducing traditional gatekeeper power but substituting platform effects. 


Audiences aggregate on platforms that offer the most content and best user experience. This creates winner-take-all dynamics where a few platforms capture most of the value while creators compete for attention and revenue shares.


Attention also becomes a scarce resource. Unlike physical goods, content competes for limited human attention. When new technologies enable infinite content creation, individual pieces become less valuable even if quality remains high. And platforms tend to create the largest audiences. 


That noted, the internet primarily disrupted distribution and access. AI disrupts content creation itself. And it looks like the disruption already is being felt. 


Has AI Use Reached an Inflection Point, or Not?

As always, we might well disagree about the latest statistics on AI usage. The proportion of U.S. employees who report using artificial inte...