Tuesday, July 29, 2025

Maybe AI is Not a "Dot Com Bubble"

"Build it and they will come" is among the fundamental assumptions of dot-com era startups. The belief was that getting big fast, creating a large audience of users, was the foundation for monetization. So, many venture-backed firms essentially spent more money to acquire a user than the lifetime value of that user was reasonable to anticipate, if in fact entrepreneurs even had an idea what that number might be. 


The goal was to build market share first and figure out sustainable economics later. So startups tracked non-financial metrics such as page views, unique visitors, and user growth, rather than revenue. 


To be sure, there is still some element of “user engagement” that observers note today. 


But artificial intelligence represents a fundamentally different paradigm. Rather than hoping to monetize attention, AI tools aim to address concrete business problems with measurable returns from day one: productivity gains, cost savings, or revenue improvements. 


That doesn’t eliminate the danger of overinvestment. But the focus on financial returns does ground AI investments in financial reality. Instead of “build it and they will come,” AI developers operate much closer to “build it because they're already asking for it and willing to pay.”


Also, the funding of AI firms is not led by venture capital, but by profitable, revenue-generating hyperscalers. 


Also, where many dot-com companies were essentially trying to become digital magazines or shopping malls, competing for finite consumer attention, AI companies are building tools that enhance work. So the focus is on productivity rather than entertainment spending. 


| Key Dimension | Dot-Com Era Startups | AI Firms (Current Era) |
| --- | --- | --- |
| Core Offering | Web-based services (e-commerce, portals, online marketplaces, content) | AI products/services, AI-as-a-service, automation, predictive analytics, platform APIs |
| Value Driver | Website traffic, user acquisition, and “eyeballs”; hope for future monetization | Data-driven solutions that improve efficiency, automate tasks, and deliver predictive/personalized value |
| Monetization | Advertising, IPO capital, early focus on growth over profits | Subscriptions (SaaS, AIaaS), API usage, licensing, and value-based enterprise sales |
| Profit Timeline | Delayed, speculative; often prioritizing fast scaling “at any cost” over profitability | Earlier monetization; many AI companies achieve profitability within 2-3 years of launch |
| Main Users | Primarily B2C (consumers); focus on internet adoption and convenience | Mix of B2B and B2C; solutions for enterprises (automation, analytics) and consumers (personalization) |
| Technology Focus | Utilized the internet for delivery; basic web technologies and e-commerce platforms | Core focus on AI/ML, deep learning, neural networks, data pipelines, and process automation |
| Data Dependence | Limited data collection/analysis; basic web analytics | Highly data-centric; model development, product improvement, and scaling depend on high-quality data |
| Scaling | Slow initial scaling, limited by server costs, bandwidth, and infrastructure | Rapid scaling via cloud computing; platform and ecosystem approaches facilitate global reach |
| Investment Focus | Aggressive, speculative VC inflows; little diligence; focus on market “potential” rather than fundamentals | Cautious, more rigorous VC due diligence (product, financials, market fit, defensibility) |
| Entry Barriers | Comparatively low: basic website, small tech team | High: requires AI/ML expertise, large datasets, significant computing resources, regulatory compliance |
| Sustainability | Many ventures failed after burning through capital; often lacked clear viable business models | More robust, industry-adaptable models, often with recurring revenue and adaptable offerings |
| Market Focus | Novelty-driven; unproven digital business models targeting new online audiences | Problem-solving, industry-agnostic, targeting specific verticals to deliver tangible value |


Some amount of overinvestment will likely happen. That tends to be the case for many new general-purpose technologies. Not all the efforts will succeed, but the assets will be rationalized, as was the case for railroads, for example. 


| Phase | Description of Overinvestment | Typical Examples (Historical) | Rationalization & Consolidation Phase | Results of Consolidation |
| --- | --- | --- | --- | --- |
| Technological Breakthrough | Emergence of a new general-purpose technology (GPT) triggers widespread excitement and investment | Railroads (19th c.), Electricity, Telephony, Internet (1990s), AI (2020s) | Initial indications of value mismatch and inefficiencies | Early signs: maintenance costs rise, market saturation |
| Investment Bubble/Boom | Massive capital flows seeking profit, often outpacing initial real returns or productive deployment | “Canal mania,” “Railway mania,” Dot-com bubble, early AI startups | Asset values diverge from fundamentals | Overcapacity, fragmented assets, irrational valuations |
| Overinvestment | Realization that not all investments are viable; bust phase, financial losses, bankruptcies, job cuts | Great Depression (1929), Dot-com bust (2000), other sector crashes | Rationalization push by firms, investors, and regulators | Asset sell-offs, bankruptcies, sector exits |
| Rationalization & Reallocation | Surviving firms acquire distressed/undervalued assets; asset consolidation reduces duplication | Telecom mergers post-dot-com, cloud provider consolidation, AI sector | Consolidation into larger, more viable entities | More efficient use of capital, streamlined offerings |
| Deployment & Productivity Growth | Consolidated sector harnesses the matured technology; rational investment leads to true productivity gains | Sustainable “deployment period” post-crash (see Perez, 2025) | Stable sector, less speculation, measured investment | Stable profits, innovation resumes sustainable growth |


The point is that AI might not be analogous to the “dot com bubble.” It might be more akin to the investment pattern for lots of general-purpose technologies, where some amount of overinvestment eventually happens, but the assets are rationalized over time. 


Friday, July 25, 2025

Why Agentic AI "Saves" Google Search

One reason Alphabet’s equity valuation has been muted recently, compared to some other “Magnificent 7” firms, is the overhang from potential antitrust action. The other problem is the concern that the search business model could be disrupted by language model chatbots. 


We might not know much about the antitrust remedies for some months. But we might already be seeing signs that Alphabet’s innovations around integrating chatbot functionality with search are paying off. 

“We know how popular AI Overviews are because they are now driving over 10 percent more queries globally for the types of queries that show them, and this growth continues to increase over time,” said Sundar Pichai, Alphabet CEO. 


The point is that we still do not know the longer-term changes in either search or chatbot business models, especially since agent capabilities are coming. That will blur the functions of “learning, research or finding” with “taking action.”


| Feature | Classic Google Search | Chatbots | Google with Agentic AI |
| --- | --- | --- | --- |
| Research (Info Retrieval) | Yes | Yes | Yes |
| Personalized Recommendations | Yes | Yes | Yes |
| Multi-step Task Execution | Limited | Limited | Yes |
| Real-world Action Capability | No | Rarely | Yes (book, buy, schedule, etc.) |
| Trust & Security | High | Varies | High (Google brand) |
| Ecosystem Integration | Extensive | Siloed/fragmented | Extensive |


Both chatbots and search engines will be able to retrieve information and act on it. Perhaps the classic example is trip planning, which blends task-oriented activities and research.
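
A minimal sketch of what that blending might look like in practice follows. The function names and services here are hypothetical placeholders, not any actual Google or chatbot API; the point is simply the sequence of research, reasoning, and action:

```python
# Hypothetical agentic trip-planning flow: research first, then act.
# None of these functions correspond to a real API; they stand in for the
# "find" step (search/retrieval) and the "do" step (transactions).

def search_flights(origin, destination, date):
    # Placeholder for the retrieval step a search engine or chatbot already does well.
    return [{"flight": "XY123", "price": 420}, {"flight": "XY456", "price": 380}]

def book_flight(flight_id, traveler):
    # Placeholder for the real-world action step that agentic AI adds.
    return {"confirmation": f"{flight_id}-{traveler}-CONFIRMED"}

def plan_trip(origin, destination, date, traveler, budget):
    options = search_flights(origin, destination, date)        # research
    affordable = [f for f in options if f["price"] <= budget]  # reasoning
    if not affordable:
        return "No options within budget; report back to the user."
    cheapest = min(affordable, key=lambda f: f["price"])
    return book_flight(cheapest["flight"], traveler)            # action

print(plan_trip("DEN", "LIS", "2025-09-01", "traveler01", budget=400))
```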

 

But add agent capabilities and the “search” platform becomes the “take action” platform. For a firm such as Alphabet, that means the value of Google search as an advertising revenue generator is augmented by e-commerce revenue, something Google already does to an extent with product searches.

 

The point is that Google search has a plausible path to surviving, and perhaps even outshining, chatbots also outfitted with agent capabilities.


Wednesday, July 23, 2025

"Speeds and Feeds" for Home Broadband: It's the PC Story

Though definitions of “broadband” matter for regulators, advocates and suppliers, “broadband capability” matters rather little to most users of internet access services. Internet access, in comparison, matters quite a lot. 


The analogy perhaps is what happened with personal computers. Suppliers used to compete on clock speed and other performance metrics. Then we got to a point where performance across a broad range was "good enough" that it stopped making as much sense to keep touting performance. 

We are getting to that point with home broadband services. 

Use of a Chromebook, for example, absolutely requires internet access. But whether use of a Chromebook requires “broadband,” defined as 100 Mbps downstream and 20 Mbps upstream, is highly questionable. 


I’ve used a Chromebook on a symmetrical gigabit-per-second connection and on Wi-Fi connections of varying quality but with downstream speeds not exceeding 100 Mbps and upstream in the mid-single digits. 


Was the user experience on Wi-Fi as good as on my symmetrical gigabit connection? No. But was it a major issue? Also, no. Keep in mind, I do no gaming, do not upload or download large files routinely, have no other users on my connection and might watch 4K but never 8K video. 



Though we often use the terms interchangeably, “internet access” is not the same as “broadband.” 


Internet access is the ability to connect to the internet, regardless of the speed or platform used. 


"Broadband" is a moving target describing internet access at defined minimum speeds. The U.S. Federal Communications Commission defines “broadband” as 100 Mbps download and 20 Mbps upload. 


So, strictly speaking, many access services do not operate at “broadband” speeds, as the definition requires. That does not mean the access is deficient, simply that it might not meet the minimums. Wi-Fi access on airplanes or in public locations is a typical example: internet access is available, but not at “broadband” speeds in both directions.


In fact, even cable modem services I have used can fail to meet the definition, even when offering gigabit-per-second downstream speeds, as upstream speeds did not hit 20 Mbps. 
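
A connection's status against that definition is a simple two-sided check, as the short sketch below shows. The thresholds are the FCC figures cited above; the example speed values are hypothetical:

```python
# Does a connection meet the current U.S. FCC "broadband" definition
# (at least 100 Mbps downstream AND 20 Mbps upstream)?
# Example connection speeds are illustrative, not measurements.

FCC_DOWN_MBPS = 100
FCC_UP_MBPS = 20

def meets_broadband_definition(down_mbps, up_mbps):
    return down_mbps >= FCC_DOWN_MBPS and up_mbps >= FCC_UP_MBPS

connections = {
    "symmetrical gigabit fiber": (1000, 1000),
    "gigabit cable, slow upstream": (1000, 15),   # fails on upstream alone
    "in-flight Wi-Fi": (25, 3),
}

for name, (down, up) in connections.items():
    verdict = "broadband" if meets_broadband_definition(down, up) else "internet access, not 'broadband'"
    print(f"{name}: {down}/{up} Mbps -> {verdict}")
```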


For most of us, the issue is whether such failures matter. Often, they do not matter much, if at all. People can do all the things they want to on many connections that fail to meet the broadband definitions.


As a practical matter, past a certain point, “broadband” matters relatively little in terms of user experience. 


Monday, July 21, 2025

Verizon Fixed Wireless Keeps Growing Subscriber Base: What That Suggests About Demand

Most of us "dumb end users" of home broadband have probably realized the value-price proposition for broadband outweighs the "raw bandwidth" claims we are urged by internet service suppliers to consider. The success of relatively bandwidth-constrained fixed wireless is a case in point.


In a market where headline speeds are pushing past 2 Gbps and up to 5 Gbps, it might be easy to dismiss claims that far lower speeds are adequate for many households and use scenarios. In fact, for most end users, the number of users in a household, and the number of their devices, have more to do with suitability than the actual applications those people engage with on a routine basis.
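
One way to see why user and device counts dominate is to add up concurrent per-device demand. The per-application figures below are rough, commonly cited planning numbers, not measurements:

```python
# Rough household bandwidth sizing: concurrent demand is driven by how many
# users and devices are active at once, not by any single application.
# Per-stream figures are approximate planning estimates.

APP_MBPS = {
    "4K video stream": 25,
    "HD video stream": 5,
    "video call": 4,
    "web browsing / email": 2,
}

def concurrent_demand(active_sessions):
    """active_sessions: list of (application, count) pairs running at the same time."""
    return sum(APP_MBPS[app] * count for app, count in active_sessions)

single_user = [("HD video stream", 1), ("web browsing / email", 1)]
busy_household = [("4K video stream", 2), ("video call", 2), ("web browsing / email", 3)]

print("Single user:", concurrent_demand(single_user), "Mbps")        # about 7 Mbps
print("Busy household:", concurrent_demand(busy_household), "Mbps")  # about 64 Mbps
```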


Verizon, for example, reported fixed wireless net additions of 278,000 in the second quarter of 2025, growing the base to over 5.1 million fixed wireless access subscribers. Keep in mind that those connections tend not to operate faster than about 300 Mbps in most areas where Verizon has not activated its millimeter wave spectrum assets.


Though nobody outside of Verizon actually knows, it is possible that 30 percent to 50 percent of those fixed wireless connections operate at less than 100 Mbps delivered bandwidth. It is possible that 30 percent to 40 percent of accounts can use bandwidth up to about 300 Mbps.


And possibly 10 percent to 20 percent of customers have access to speeds faster than 300 Mbps.
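
Applying those purely speculative shares to the reported 5.1 million subscriber base gives a sense of the possible magnitudes. The percentages are the guesses above, not Verizon disclosures:

```python
# Applying the speculative speed-tier shares discussed above to Verizon's
# reported 5.1 million fixed wireless base. Shares are guesses, not company data.

base_subscribers = 5_100_000

tier_share_ranges = {            # (low estimate, high estimate)
    "under 100 Mbps": (0.30, 0.50),
    "100-300 Mbps":   (0.30, 0.40),
    "over 300 Mbps":  (0.10, 0.20),
}

for tier, (low, high) in tier_share_ranges.items():
    lo_subs = int(base_subscribers * low)
    hi_subs = int(base_subscribers * high)
    print(f"{tier}: roughly {lo_subs:,} to {hi_subs:,} subscribers")
```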


The point is that a growing number of households find those speed ranges to be adequate for their needs and budgets.


The company says it is positioned to achieve the next milestone of eight million to nine million fixed wireless access subscribers by 2028.




Manufacturing Might be Growing Where We Do Not Expect

Manufacturing employment in the United States has surpassed its pre-Covid pandemic levels, the first time since the 1970s that the sector ha...