Wednesday, February 14, 2024

For Most Firms, AI Impact on Revenue Will Be Indirect, Hence Difficult to Quantify

How much value large language models specifically, or artificial intelligence in general, can bring to any existing business process remains unclear. In principle, one might see merit. In practice, the amount of incremental value generated will remain an open question for a while. 


Online shopping already uses forms of machine learning, so the incremental addition of LLMs is the issue: how much additional value is added, and at what cost? 


Past studies of consumers suggest website functionality, product information, trust, convenience and personalization are top shopper concerns. 


Factor | Description
Website Usability | User-friendly interface, fast loading times, easy navigation, mobile-friendliness; simple checkout process, clear product information, intuitive search function.
Product Information | Comprehensive details, high-quality images, customer reviews, size guides; 360° product views, personalized recommendations, AR/VR try-on features.
Trust and Security | Secure payment options, data privacy protection, clear return policies; two-factor authentication, verified seller badges, money-back guarantees.
Competitive Prices | Transparency and fairness, promotions and discounts, personalized offers; price comparisons, dynamic pricing based on demand, loyalty programs.
Fast & Convenient Delivery | Multiple options, accurate tracking, timely updates, hassle-free returns; same-day delivery, express shipping, click-and-collect points, self-service returns.
Personalized Experience | Relevant product suggestions, targeted promotions, customer support; AI-powered chatbots, wish list recommendations, purchase history analysis.

And there is an argument that AI might improve each of these dimensions, as it is used to enhance the website experience and product information, increase consumer trust, optimize prices, shorten delivery times or further customize the user experience. 

Factor | AI Capabilities | Potential Benefits
Website Usability | A/B testing for optimal layouts, personalized search results, voice-activated navigation | Increased efficiency, reduced frustration, improved conversion rates
Product Information | Automated content generation, personalized product recommendations, AI-powered chatbots for product inquiries | Richer product descriptions, reduced information overload, enhanced customer engagement
Trust and Security | Fraud detection, anomaly analysis, risk assessment for secure transactions | Increased customer confidence, reduced fraud, stronger brand reputation
Competitive Prices | Dynamic pricing based on market trends and customer behavior, personalized coupons and discounts | More competitive offers, optimized pricing strategies, increased customer satisfaction
Fast & Convenient Delivery | Predictive logistics, route optimization, automated delivery notifications | Faster deliveries, reduced shipping costs, enhanced transparency and control
Personalized Experience | Customer segmentation, behavioral analysis, real-time recommendations, targeted marketing campaigns | More relevant product suggestions, improved customer lifetime value, stronger brand loyalty
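
To make one of those capabilities concrete, consider purchase history analysis feeding recommendations. Below is a minimal sketch, assuming nothing more than a list of past orders and a naive co-occurrence count; the sample data and the recommend function are hypothetical, purely for illustration, not any retailer's actual system.

```python
from collections import Counter
from itertools import combinations

# Hypothetical order history: each order is the set of items one shopper bought.
orders = [
    {"running shoes", "socks"},
    {"running shoes", "water bottle"},
    {"socks", "water bottle", "running shoes"},
    {"yoga mat", "water bottle"},
]

# Count how often each pair of items is bought together.
co_counts = Counter()
for order in orders:
    for a, b in combinations(sorted(order), 2):
        co_counts[(a, b)] += 1

def recommend(item, top_n=3):
    """Return items most often bought alongside `item` (naive co-occurrence)."""
    scores = Counter()
    for (a, b), n in co_counts.items():
        if a == item:
            scores[b] += n
        elif b == item:
            scores[a] += n
    return [other for other, _ in scores.most_common(top_n)]

print(recommend("running shoes"))  # e.g. ['socks', 'water bottle']
```

A production recommender would use far richer signals and models, but the point is that the "purchase history analysis" row of the table reduces, at bottom, to counting what shoppers buy together.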


The issue is perhaps not so much whether AI can help: it should. The issue is more value versus cost; value versus other efforts to improve the business model; value versus time. 


Study | Key Findings
PwC Global Consumer Insights Survey 2023 | 55% of consumers say personalized recommendations influence their purchase decisions. Personalization is key: customers value tailored experiences and relevant product suggestions.
McKinsey & Company's State of Retail Banking 2023 | 71% of customers expect retailers to anticipate their needs and offer relevant support. Proactive assistance matters: customers appreciate efficient solutions and timely interventions.
Accenture's Why We Buy Report 2023 | 75% of shoppers research products online before buying in-store. Seamless integration is crucial: online and offline experiences should be complementary and consistent.

Moore's Law Explains a Lot

We will likely never know for certain how large a role an understanding of Moore’s Law has played in the fortunes of firms whose business models rely on internet access, but there are tantalizing examples. 


At a time when Netflix was still mailing out DVDs to its customers, internet access generally ran at about 56 kbps, not fast enough to support video streaming. 


The problem, Hastings said in an interview at the Wired business conference, was that back then Netflix could not stream movies over 56 kbps modems.


But there was Moore’s Law and improvements in bandwidth which could be plotted, and that is exactly what Hastings did. “We took out our spreadsheets and we figured we’d get 14 megabits per second to the home by 2012, which turns out is about what we will get.”
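
That sort of spreadsheet exercise is easy to reconstruct in rough form. The sketch below assumes a Nielsen's Law-style improvement of about 50 percent a year, starting from a 56 kbps modem; the starting year and growth rate are assumptions chosen for illustration, not Netflix's actual figures.

```python
# Back-of-the-envelope projection of home bandwidth, in the spirit of the
# spreadsheet exercise Hastings describes. The starting point and the ~50%
# annual growth rate are assumptions for illustration, not Netflix's numbers.

start_year = 1999        # roughly the dial-up era Hastings refers to
kbps = 56.0              # typical modem speed at the time
annual_growth = 1.5      # assume bandwidth improves ~50% per year
target_kbps = 14_000.0   # 14 Mbps, the figure Hastings cites for 2012

year = start_year
while kbps < target_kbps:
    year += 1
    kbps *= annual_growth

print(f"Reaches ~{kbps / 1000:.0f} Mbps around {year}")
# Under these assumptions, 14 Mbps to the home arrives in the early 2010s,
# roughly the conclusion Hastings describes reaching.
```

Under those assumptions, the crossover lands in the early 2010s, which is about what the Netflix team concluded.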


And Hastings arguably is not the only person whose knowledge of Moore's Law has led to surprising business conclusions. 


Perhaps the most startling strategic assumption ever made by Bill Gates was his belief that horrendously expensive computing hardware would eventually be so low cost that he could build his own business on software for ubiquitous devices. 


How startling was the assumption? Consider that, in constant dollars, the computing power of an Apple iPad 2 would have cost between US$100 million and $10 billion in 1975, when Microsoft was founded.


Optical fiber bandwidth improves at roughly the pace of Moore's Law, even if deployment of fiber in the local access network does not necessarily progress at that rate. 


In other words, Hastings and his team understood there would come a moment when video streaming was feasible, based in large part on internet access trends propelled by Moore’s Law improvements in semiconductor technology. 


A perhaps-related insight might be inferred. Moore’s Law contributes to a trend of ever-lower costs for computation and communications. 


Over time, what that means, as a practical matter, is that applications and use cases can be created that assume the cost of computing and communications is no barrier to widespread use. Some of us might point to the development of high-definition TV as an example. 


At a time when analog versions of HDTV required 40 Mbps per channel, some believed HDTV could be done in 6 Mbps per channel. As it turns out, we can do so using less bandwidth than that. 


We might argue that a wide range of businesses, use cases and applications now are possible precisely because of Moore’s Law’s impact on the costs of computation and communication: financial technology ranging from mobile payments to fraud detection; cloud computing; social media; e-commerce; the sharing economy; affordable artificial intelligence; the internet of things. 


Navigation apps, all forms of on-demand services, video streaming and every form of recommendation and personalization feature, plus speech-to-text and text-to-speech, are enabled by radically lower costs of computation.


Each Next-Generation Mobile Network Since 2G Has Reduced Latency

As 5G core networks have shifted to a decomposed and virtual architecture, latency can become an issue, since functions can be performed remotely. But SKT and Intel say they have a way to reduce latency in virtualized 5G core networks substantially, by as much as 70 percent for transactions between the session management function (SMF) gateway and protocol data unit (PDU) session microservices. 


The approach also enables a 33 percent reduction in gateway CPU usage, the firms say in a white paper. 


They believe the architecture will be useful for 6G, but the approach also works for 5G, illustrating the ways one mobile generation preps the way for the next, as key features and principles evolve. 


Mobile service providers would like nothing so much as a graceful evolution to “6G” performance, without disruptive changes to platform elements. Obviously, collaboration with device manufacturers, chip suppliers and other stakeholders will happen, to ensure device compatibility, standards alignment, and smooth integration of 6G technologies. 


But we should expect to see many other ways mobile operators will pursue an evolutionary 6G transition. As we have seen with 5G, existing spectrum will be leveraged, even if new spectrum allocations are made. 


Software-defined networks will facilitate network upgrades that avoid hardware replacements. 


Network slicing might also be used to enable the coexistence of diverse 5G and 6G services on the same infrastructure.


We might also see incremental upgrades, where 6G features and functionalities are introduced in stages, in much the way that 4G voice services relied on 3G and 5G relies on 4G for voice. More advanced features, such as network slicing, might be introduced later than basic functions such as new frequency bands for capacity boosts, as happened with 5G. 


Monday, February 12, 2024

Value of Open Source AI for Meta

The debate over proprietary versus open source approaches to building artificial intelligence models is as contested for large language models as it has been in other areas of coding and software development. Meta recently defended its “open source” approach. 


Meta benefits from open source because it “improves our models, and because there's still significant work to turn our models into products, because there will be other open source models available anyway, we find that there are mostly advantages to being the open source leader and it doesn't remove differentiation from our products much anyway,” said Mark Zuckerberg, Meta CEO. 


“First, open source software is typically safer and more secure, as well as more compute efficient to operate due to all the ongoing feedback, scrutiny and development from the community,” he said. “Efficiency improvements and lowering the compute costs also benefit everyone including us.”


“Second, open source software often becomes an industry standard, and when companies standardize on building with our stack, that then becomes easier to integrate new innovations into our products,” Zuckerberg added. “Third, open source is hugely popular with developers and researchers.”


That “helps us recruit the best people at Meta, which is a very big deal for leading in any new technology area,” he argued. 


Beyond all that, Meta, as a builder of end user applications, can well take a “layered” approach to business functions. Meta benefits from universal and quality internet access, for example, but it does not have to build all that infrastructure itself. 


And if open source allows Meta to stimulate the creation of digital infrastructure, that helps its own business model, much as universal internet access allows Meta to build its own core businesses.


Enterprise Software Business Helps Microsoft Monetize AI

Compared to Amazon Web Services or Google Cloud, Microsoft benefits from an extensive enterprise software business where AI features can be added rapidly. At least for the moment, that seems to have provided a revenue spark for Microsoft overall, and for Microsoft Cloud operations in particular. 


“Strong demand for our Microsoft cloud offerings, including AI services” contributed to the fact that “Azure and other cloud services revenue grew 30 percent and 28 percent in constant currency, including six points of growth from AI services,” said Amy Hood, Microsoft CFO. 


Satya Nadella, Microsoft CEO, pointed out that Microsoft Cloud surpassed $33 billion in revenue and was “up 24 percent,” while Azure AI customers surpassed 53,000. “Over one third are new to Azure over the past 12 months,” perhaps based on the ability to use LLMs from Cohere, Meta and Mistral on Azure without having to manage underlying infrastructure, Nadella suggested. 


Also, Microsoft noted 1.3 million paid GitHub Copilot subscribers, up 30 percent quarter over quarter, and more than 50,000 organizations using GitHub Copilot. 


Execs Believe in AI Business Model Reinvention, But Often Can Cite Little in the Way of ROI

I’m always a bit circumspect about surveys of C-suite executives on their expectations for business challenges and opportunities, more so regarding perceptions of opportunities than of challenges. For example, a recent PwC survey of C-suite execs shows a high degree of belief that generative AI (large language models) will enable business model reinvention. 

 

source: PwC 


At the same time, those same executives report--as virtually always is the case--that return on investment from technology investments remains “elusive.” 


source: PwC 


That is basically in keeping with a general rule of thumb that up to 70 percent of information technology initiatives and projects fail to produce the desired business upside. 


In other words, C suite executives typically have high hopes for what new technology can do, but 70 percent of the time find that the expected payoffs do not materialize. 


So it would be rational indeed to expect that generative AI will mostly fail to enable “new business models” for most companies and projects. 
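
The expected-value arithmetic behind that caution is simple to sketch. The figures below are hypothetical placeholders, not drawn from the PwC survey, and only illustrate why a roughly 30 percent success rate forces either a very large payoff or better odds before a project pencils out.

```python
# Hypothetical expected-value check on an AI initiative, illustrating why a
# ~30% success rate tempers enthusiasm. All figures are made-up placeholders.

p_success = 0.30          # rule-of-thumb share of IT projects that deliver
payoff_if_success = 5.0   # hypothetical gain, in $ millions
payoff_if_failure = 0.0   # assume no upside when the project falls short
cost = 1.5                # hypothetical project cost, in $ millions

expected_value = (p_success * payoff_if_success
                  + (1 - p_success) * payoff_if_failure
                  - cost)

print(f"Expected net value: ${expected_value:.2f}M")
# 0.3 * 5.0 - 1.5 = 0.0: at these assumed numbers the project only breaks even,
# so the payoff has to be large, or the odds better, to justify the investment.
```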


So the findings should not come as a surprise. Many studies suggest that such transformations are relatively rare. 


McKinsey & Company (2020), survey of 1,200 executives: only 8% of digital transformations deliver more than 10% improvement in financial performance. (Limitation: focuses on broader digital transformations, not solely new business models.)

BCG (2020), analysis of 1,000 transformation efforts: only 30% of transformations met or exceeded their target value and resulted in sustainable change. (Limitation: does not differentiate between types of transformations.)

Standish Group Chaos Report (2020), survey of IT professionals: only 29% of projects are successful, meaning they meet scope, schedule and budget. (Limitation: does not distinguish between types of IT projects.)

MIT Sloan Management Review (2019), survey of 448 executives: 70% of respondents reported challenges in scaling and measuring the impact of their new business models. (Limitation: limited sample size, self-reported data.)

McKinsey & Company (2021), survey of 1,500 executives: only 20% of digital transformations achieve their full potential, while 70% fall short.

Boston Consulting Group (2020), analysis of 1,200 transformation initiatives: 30% of transformations met or exceeded their target value and resulted in sustainable change, while 44% created some value but did not meet targets.

Standish Group Chaos Report (2020), analysis of 50,000 IT projects: 64% of projects were considered "successful" based on meeting scope, schedule and budget, but only 29% were considered "highly successful" with significant business value.

MIT Sloan Management Review (2018), survey of 1,500 executives: 33% of digital transformation initiatives met or exceeded expectations.

Capgemini Invent (2022), survey of 1,000 global executives: 59% of respondents reported success with IT projects aimed at business model innovation, but only 22% considered it "highly successful."

PMI Pulse of the Profession (2023), analysis of data from over 34,000 project managers: 75% of projects met their original goals or were considered successful, but only 37% met all predefined success criteria.



And even within the AI category, it seems likely that “predictive” use cases--where past and present data are analyzed to make predictions about future behavior--are going to generate most of the identifiable returns. 


As a tool for writing code or generating content, generative AI solves only some problems. Data mining of actual customer behavior is likely to find more substantial application to the revenue and cost functions of most companies. 
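
As an illustration of the “predictive” pattern, a minimal sketch might fit a simple model to past purchase behavior and score the likelihood of a repeat purchase. The synthetic data and features below are invented for the example; a real deployment would use actual customer records from a data warehouse.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic customer history: [orders in last year, days since last order].
# In practice these would come from real transaction data, not random numbers.
n = 500
orders_last_year = rng.poisson(3, n)
days_since_last = rng.integers(1, 365, n)
X = np.column_stack([orders_last_year, days_since_last])

# Synthetic label: did the customer buy again within 90 days?
# Frequent, recent buyers are made more likely to repurchase.
logit = 0.6 * orders_last_year - 0.01 * days_since_last - 0.5
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)

# Score a hypothetical customer: 5 orders last year, last order 20 days ago.
prob = model.predict_proba([[5, 20]])[0, 1]
print(f"Estimated repurchase probability: {prob:.2f}")
```

The technique here is nothing exotic: it is the kind of prediction-from-history work that drives churn scores, propensity models and demand forecasts, which is where identifiable returns tend to show up.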


Saturday, February 10, 2024

What Defines an "Era of Computing" Anymore?

In the past, it has been possible to describe computing eras strictly in terms of hardware: mainframe, mini-computer, personal computer. Sometime in the 2000s that framework began to fray, as computing moved from “business or enterprise” use cases to consumer content, commerce, education, information gathering and entertainment. 


These days, computing leadership often includes large app providers as well as hardware or software suppliers. 


Decade | Leaders | Area of Impact | Transition to Content/Social Media/Commerce
1960s to 70s | IBM | Enterprise computing | Not yet
Early 1980s | DEC, Wang Labs, HP | Mid-market computing | Not yet
Later 1980s | Microsoft, Intel | Personal computers, operating systems, semiconductor technology | Laid the foundation for widespread adoption of computing, enabling future digital content and communication.
1990s | Netscape, Google, Amazon | Web browsers, search engines, e-commerce platforms | Facilitated access to information and commerce online, paving the way for content sharing and social interaction.
2000s | Social media: Twitter, Facebook, YouTube, Instagram, WordPress | Social networking platforms, microblogging platforms | Created platforms for user-generated content, real-time communication and community building.
2010s | Apple, Samsung, Google, AWS, Azure, Google Cloud, Netflix, Google Maps | Smartphones, mobile operating systems, cloud computing, mobile apps | Expanded access to the internet and digital services through mobile devices, driving content consumption and social interaction on the go.
2020s | Nvidia, Microsoft, AWS, Google, AMD (and others coming) | Artificial intelligence, cloud infrastructure, edge computing | Enabling personalized content experiences, intelligent applications and seamless integration of computing into various aspects of life.


Among other implications, it is likely the older models of “computing eras” will have to be redefined, as leadership now can come from software or content firms as easily as from hardware firms. Apple redefined its role when it switched from being a niche provider of personal computers to a major supplier of mobile devices and now services and content. 


Microsoft evolved from an operating system provider to a leader in cloud computing, enterprise and consumer applications, gaming, communications and so forth.


And today’s computing leaders mostly are dominant as providers of computing-enabled applications and services, ranging from search to social media to commerce and cloud computing “as a service.”


Hardware still matters, of course (Nvidia, for example). But eras of computing are unlikely to continue to be defined by hardware. Instead, platforms, devices and use cases seem to be what matters.


Directv-Dish Merger Fails

Directv’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...