Saturday, April 5, 2025

Have AI Chatbots Actually Affected Search Volume?

There’s a good reason Alphabet in late March 2025 is the “Magnificent 7” stock with the lowest price-earnings ratio. Investors and analysts are worried about artificial intelligence’s impact on the search business, which drives about 57 percent of Alphabet’s total revenue, given the advertising revenue generated by Alphabet’s roughly 90-percent share of the search market. 


Company | Ticker | Approximate P/E Ratio
Nvidia | NVDA | 72
Tesla | TSLA | 41
Microsoft | MSFT | 37
Meta Platforms | META | 31
Apple | AAPL | 27
Alphabet | GOOG | 25
Amazon | AMZN | 69


In its fourth-quarter 2024 financial report, Alphabet reported Google Search revenue of $54 billion (up 12.5 percent); YouTube ads at $10.5 billion (up 13.8 percent); Google Network at a bit under $8 billion (down 4.1 percent); Google subscriptions, platforms and devices at $11.6 billion (up 7.8 percent); and Google Cloud at about $12 billion (up 30 percent).


The issue is not so much that Google Search is failing to grow: it is growing. The fear is that AI could halt or reverse that growth. And because equity markets are forward looking, growth expectations matter greatly. 


source: Seeking Alpha


And virtually nobody disputes the notion that AI is going to cannibalize some amount of search volume. Just how much is the issue. 



source: Seeking Alpha


So investors will be looking for growth in other areas, ranging from YouTube ad revenue to Google Cloud services to robotaxis. 


Thursday, April 3, 2025

AI Assistant Revenue Upside Mostly Will be Measured Indirectly

Amazon expects Rufus, its AI shopping assistant, to indirectly contribute more than $700 million in operating profits this year, Business Insider reports. 


The expected upside would come in the form of "downstream impact," a metric Amazon uses to measure a product’s or service's potential to generate additional consumer spending across Amazon's vast offerings. Rufus itself, of course, generates no direct revenue. 


Rufus product recommendations might lead to more purchases on Amazon's marketplace, for example. The value of advertising embedded in Rufus content is another way indirect revenue upside is measured. 


By 2027, however, Rufus is expected to reach $1.2 billion in downstream-impact profit contributions, according to Amazon. 


“From broad research at the start of a shopping journey such as ‘what to consider when buying running shoes?’ to comparisons such as ‘what are the differences between trail and road running shoes?’ to more specific questions such as ‘are these durable?’, Rufus meaningfully improves how easy it is for customers to find and discover the best products to meet their needs,” Amazon says. 


Indirect measurement of this kind is likely how most firms will have to quantify revenue gains from their LLM assistants. 
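A "downstream impact" style estimate can be sketched as a cohort comparison: the incremental spending of users who engaged with the assistant, versus a comparable control group, converted to operating profit. All figures and names below are hypothetical illustrations, not Amazon's actual methodology.

```python
# Illustrative sketch of an indirect, "downstream impact" style estimate
# for a product that earns no direct revenue. Hypothetical numbers only.

def downstream_impact(engaged_spend_per_user, control_spend_per_user,
                      engaged_users, operating_margin):
    """Estimate operating-profit contribution from incremental spend."""
    incremental_spend = (engaged_spend_per_user - control_spend_per_user) * engaged_users
    return incremental_spend * operating_margin

# Assumed: 50M engaged shoppers spending $280 each vs. $250 for the
# control cohort, at a 5% operating margin.
profit = downstream_impact(280.0, 250.0, 50_000_000, 0.05)
print(f"~${profit / 1e6:.0f}M estimated operating-profit contribution")
```

The point of the sketch is that the entire "revenue" figure is inferred from behavioral differences, which is why such estimates are sensitive to how the control group is chosen.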


Use Case | Description | Revenue Impact
Customer Support Automation | AI chatbots handle FAQs and troubleshooting, reducing customer service costs. | Lowers operational costs and improves customer retention.
Lead Generation, Qualification | AI assistants engage website visitors, collect data, and qualify leads. | Increases conversion rates and enhances sales pipeline efficiency.
E-commerce Upselling, Cross-Selling | AI recommends relevant products based on user behavior and preferences. | Boosts average order value and sales.
Content & SEO Optimization | AI generates blog posts, product descriptions, and metadata for SEO. | Increases organic traffic, improving brand visibility and sales.
Personalized Marketing, Retargeting | AI-driven chatbots deliver personalized offers and recommendations. | Enhances engagement, conversion rates, and repeat purchases.
Employee Productivity Enhancement | AI automates repetitive tasks (e.g., email drafting, summarization, scheduling). | Saves time, allowing employees to focus on high-value tasks.
Market Research, Insights | AI collects and analyzes customer feedback for business insights. | Improves decision-making and product-market fit.
Training, Onboarding | AI-based interactive training modules for new employees. | Reduces onboarding time and training costs.
Subscription, Membership Services | AI chatbots engage users to promote premium subscriptions. | Increases subscription revenue and customer lifetime value.
Reducing Churn, Customer Retention | AI proactively engages users before they disengage or cancel services. | Lowers customer acquisition costs by improving retention rates.


Are Large Language Models Really "10 Times" More Energy Intensive than Search?

Most of us have heard claims that a single chatbot (Large Language Model or generative AI system) query is significantly more energy-intensive (often cited as roughly 10 times more) than a traditional search query.


Most of us could agree that the statement about energy intensity is directionally correct for most systems at present, though it is perhaps not as big a long-term issue, since energy intensity is virtually certain to be reduced over time. 


Computational complexity obviously is an issue. Traditional search uses pre-computed indexes: much of the “heavy lifting” (indexing the web) is done beforehand.


Large language models run a generative process through a massive neural network (often with billions or trillions of parameters). Each query requires significant computations to understand a prompt and generate a novel response. This "inference" process is inherently more computationally demanding per query than retrieving indexed information.  


Early energy estimates suggested the "10x" figure. These estimates looked at the computational operations (FLOPs, or floating-point operations) required for each type of task and translated that into potential energy use based on typical hardware efficiency.
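The FLOPs-to-energy conversion behind such estimates is simple arithmetic. The sketch below uses illustrative assumptions only: roughly 100 trillion FLOPs per LLM query and an assumed system-level efficiency of 1e-10 joules per FLOP, compared against the widely cited ~0.3 Wh per traditional search. Real figures vary widely with model, hardware, and batching.

```python
# Back-of-envelope FLOPs-to-energy conversion of the kind the early
# "10x" estimates used. All input numbers are illustrative assumptions.

def query_energy_wh(flops_per_query, joules_per_flop):
    """Convert compute per query into watt-hours (1 Wh = 3600 J)."""
    return flops_per_query * joules_per_flop / 3600.0

J_PER_FLOP = 1e-10      # assumed system-level accelerator efficiency
llm_wh = query_energy_wh(1e14, J_PER_FLOP)   # ~100 trillion FLOPs per LLM query (assumed)
search_wh = 0.3                              # ~0.3 Wh per traditional search (widely cited)

print(f"LLM query:    ~{llm_wh:.2f} Wh")
print(f"Search query: ~{search_wh:.2f} Wh")
print(f"Ratio:        ~{llm_wh / search_wh:.0f}x")
```

Under these assumptions the LLM query lands at roughly 2.8 Wh, inside the 0.1 Wh to 10 Wh range the studies below report, and the ratio to search comes out near the popular "10x" claim.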


But that probably already is an out-of-date way to make the comparisons. As search engines increasingly integrate generative AI into search, the difference between an LLM query and a search query is likely narrowing quite substantially, in terms of energy consumption. 


Study/Source: Luccioni, Viguier & Ligozat (NeurIPS 2023; originally arXiv 2022)
Year: 2022/2023
Model(s) analyzed: BLOOM (176B parameters)
Key finding / estimate per query: Estimated inference energy consumption for BLOOM, varying significantly based on hardware (e.g., A100 vs. T4 GPUs). Provided a methodology for carbon-footprint calculation.
Context / notes: Focused on BLOOM, an open model. Emphasized the impact of hardware and location (electricity grid mix) on the carbon footprint. Did not give a single universal Wh/query figure.

Study/Source: Patterson et al. (Google Research) (arXiv 2021)
Year: 2021
Model(s) analyzed: LaMDA, MUM (conceptual / internal Google models)
Key finding / estimate per query: Not a direct per-query energy figure, but stated that "some models used by Search are already large," and newer AI features (like MUM) are more compute-intensive.
Context / notes: Context was a broader discussion of model efficiency and training costs. Confirms Google's internal view that advanced AI features increase computational demands over basic search.

Study/Source: De Vries (Digiconomist) (Joule, 2023, and ongoing analysis)
Year: 2023
Model(s) analyzed: General LLMs (e.g., based on ChatGPT/GPT-3 scale)
Key finding / estimate per query: Estimated a single ChatGPT query could consume roughly 0.001 to 0.01 kWh (1 to 10 Wh) on average, potentially much higher depending on complexity and hardware. Compared this to a Google search at roughly 0.0003 kWh (0.3 Wh).
Context / notes: Estimates based on assumed hardware (such as Nvidia A100 GPUs), server power usage and query processing time. Acknowledges high uncertainty. Helped popularize the roughly 10x comparison with search.

Study/Source: Gupta et al. (Stanford HAI) (working paper / estimates)
Year: 2023
Model(s) analyzed: Conceptual LLM (e.g., GPT-3 scale)
Key finding / estimate per query: Estimated that generating a single image with a diffusion model might consume as much energy as charging a smartphone. Extrapolated that text generation is also energy-intensive.
Context / notes: Focused partly on image-generation AI but discussed text-AI costs. Used comparisons to relatable actions (phone charging) to illustrate magnitude. Emphasized that inference costs add up globally.

Study/Source: Google public statements and reports (various)
Year: Ongoing
Model(s) analyzed: Google's AI services (including Search Generative Experience)
Key finding / estimate per query: Repeatedly stated that generative AI queries are more computationally intensive, and thus consume more energy, than traditional search queries. No specific public Wh/query figure released.
Context / notes: Confirms the general premise from the provider's side. Focuses on efforts to improve efficiency via hardware (TPUs) and software optimization.

Study/Source: University research (various; e.g., studies citing FLOPs)
Year: Ongoing
Model(s) analyzed: Various (BERT, GPT variants, etc.)
Key finding / estimate per query: Often estimates FLOPs (floating-point operations) per query or token. A query might require trillions of FLOPs, which can be converted to energy using hardware efficiency (joules per FLOP), leading to estimates often in the 0.1 Wh to 10 Wh range depending on assumptions.
Context / notes: These are often theoretical calculations based on model architecture and assumed hardware specs (e.g., joules per FLOP for a specific GPU). Highly variable.


Also, models are becoming more energy efficient, as tends to happen as computing processes mature. 


So at this point, we really do not know much about energy consumption, except that, on today’s hardware, using today’s algorithms and compute intensity, it is logical enough to believe that more computation requires more energy. 


Still, logic also suggests that simple queries will require less computation, and therefore less energy. 

A simple classification task, retrieving a cached answer, or generating a very short response using a smaller, specialized model might have an energy cost that isn't dramatically higher than a complex search operation.


But actual consumption is certain to vary by model, model architecture, data center and hardware platform. And since no at-scale “AI as a service” supplier seems to have released an actual study on the subject, we might assume they already know the energy consumption increase is significant.


Wednesday, April 2, 2025

AI Might Affect the Whole Economy, But Chip Ecosystem Not So Much

The ramifications of artificial intelligence, should it emerge as a genuine general-purpose technology, obviously include huge potential implications for the computing industry as well, from chip design and capabilities to fabrication, to the relative importance of processing functions, to possible changes in the value chain related to hardware versus software and types of software. 


On the other hand, markets change all the time. It seems less clear that AI-driven changes are qualitative, at the chip end of the business, compared to the software part of the value chain. 


Taiwan’s chip fabrication dominance, largely driven by TSMC, has been tied to the Intel ecosystem for decades, for example. Intel’s x86 architecture powered the PC and server markets. 


But AI arguably is not driven by the Intel ecosystem. As computing pivots toward AI, GPUs, and accelerators like TPUs, the ecosystem arguably is liable to shift. 


Looking only at the “digital infrastructure” value chain (chips, servers, models, training and then the AI impact on software value), chip manufacturing and design likely will continue to represent 55 percent to 65 percent of value within the infrastructure part of the value chain.


Value Chain Segment | Estimated % of Value (Revenue Share) | Key Players & Examples
AI Chip Manufacturing | 35-40% | TSMC, Samsung, Intel Foundry
AI Chip Design | 20-25% | NVIDIA, AMD, Google, Apple, Amazon (AWS Trainium & Inferentia)
Cloud & AI Infrastructure | 15-20% | AWS, Microsoft Azure, Google Cloud, Oracle
AI Model Development & Training | 5-10% | OpenAI, Anthropic, Meta, Google DeepMind
Enterprise AI Software & Applications | 10-15% | Microsoft (Copilot), OpenAI (ChatGPT API), Salesforce, Adobe, ServiceNow
Edge AI & AI-Powered Devices | 5-10% | Tesla (Autopilot AI), Apple (Neural Engine), Qualcomm (Snapdragon AI)
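The share estimates above can be checked with quick arithmetic: the two chip segments together span the 55 percent to 65 percent range cited earlier.

```python
# Summing the estimated revenue-share ranges for the two chip segments,
# using the (low, high) percentages from the table above.

segments = {
    "AI Chip Manufacturing": (35, 40),
    "AI Chip Design": (20, 25),
}

chip_low = sum(low for low, _ in segments.values())
chip_high = sum(high for _, high in segments.values())
print(f"Chips (manufacturing + design): {chip_low}-{chip_high}%")  # 55-65%
```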


Obviously a “full” value chain would have to include the contribution to value of all markets for products used by people and businesses that include AI as part of the solution, but that ultimately will be virtually every part of an economy. 


If we might argue that the x86 ecosystem was driven by standardization, AI, so far, seems less so. AI workloads use, and perhaps require, specialized silicon, including Nvidia graphics processing units and Google’s Tensor Processing Units.


That doesn’t change some fundamental roles. Chip designers might still be separate from chip manufacturers. Value still will exist in intellectual property and manufacturing efficiency. Some chip run volumes might be smaller, and manufacturing venues could shift away from Taiwan. 


Markets evolve over time, so this might be more a quantitative than qualitative shift. Nobody seems to believe the roles of chip design and manufacturing will fuse or that the need for chip fabs will go away as priorities shift to accelerators and parallel processing. 


Sure, the focus might shift to AI products rather than x86 processors. So the business is reframed rather than revamped. 


We probably cannot say the same about consumer and business software. In the realm of software, AI might indeed be poised to “change everything.” “AI features” are not simply being added to existing software. 


AI might conceivably disrupt entire value propositions, change user expectations and alter the economics of software. AI should make it easier for non-technical people to produce apps, as the internet enabled many content creators to flourish outside the established media firms. 


The cost of creating content or code should drop. The way people pay for software could keep evolving toward consumption-based pricing rather than flat-fee licenses. And advertising might become a new “pricing” tool, allowing the cost of software use to be defrayed by advertising exposure. 


For consumers, AI arguably leads to more dynamic, adaptive experiences, shifting focus from manual input to automation and personalization. For business software, the ability to make decisions is probably more important. 


In either case, there might be an argument to be made that software now begins to be experienced more as a “service.” 


Beyond that, software becomes more adaptive, learning from user behavior. Software also becomes less of a tool and more of an “assistant.” 


And it always is possible that whole new categories of apps are created, as once was the case for search and social media; ride-hailing and food delivery.

