Friday, April 26, 2024
AI Impact on Data Centers
Lenovo CIO Study Finds a "To be Expected" Assessment of AI
According to Lenovo's third annual global CIO study, which surveyed 750 leaders across 10 global markets, CIOs do not expect to see a clear and positive return on investment from their artificial intelligence investments for two to three years.
We should not find this surprising. Consider the last generally recognized general-purpose technology--the internet--and the lag in its perceived benefits.
Early internet technologies (circa 1995) were less mature and reliable than today's: connection speeds were slow (dial-up was the consumer standard), functionality was limited (the shift to the multimedia web had only just begun), and enterprises still had to allay their security concerns.
The internet disrupted traditional business models, so companies needed time to develop new strategies for marketing, sales, and customer service in the digital space. That took time.
Also, though it seems clear enough now, the potential applications of the internet for businesses weren't fully understood at first. Experimentation was required.
Additionally, assessing the return on investment for early internet initiatives was difficult, as firms lacked the analytics tools to quantify the impact of online marketing, e-commerce, or other internet-based activities.
Complicating matters was the widespread failure of many e-commerce startups in the dotcom bust around 2000. Since whole firms failed, benefits were zero or negative.
Study | Publication Venue, Year | Key Findings |
"Why E-Business Fails" by Andrew McAfee | Harvard Business Review, 2002 | Analyzed early e-commerce ventures and found many failed to deliver on promises, highlighting the need for a strategic shift beyond simply setting up a website. |
"The Productivity Paradox in Information Technology" by Erik Brynjolfsson and Lorin M. Hitt | Journal of Economic Perspectives, 1997 | Examined the early years of IT adoption and the difficulty in measuring clear productivity gains initially, suggesting a time lag for realizing benefits. |
"Diffusion of Internet Commerce: A Study of Knowledge Acquisition" by Sang-Pil Han, Young-Gul Kim, and Yoonkyung Kim | Journal of Electronic Commerce Research, 2003 | Focused on small businesses and found that knowledge acquisition and overcoming technical challenges were crucial for successful internet adoption. |
"Diffusing the Dot-Com Revolution: The State of Business Transformation in the New Millennium" by James C. Brancheau, Richard B. Clark, and Thomas G. Rowan | 2001 | This study found that many companies struggled to transform their businesses for the internet in the late 1990s, and that the early benefits were primarily cost reductions rather than significant revenue growth. |
"Understanding Digital Marketing ROI: A Literature Review and Synthesis" by Magali Ferro, Pauline Pinheiro, and David Thomas | 2014 | This review of research on digital marketing ROI highlights the challenges of measuring the impact of online marketing efforts, particularly in the early days when attribution models were less sophisticated. |
Other studies--looking at IT in general, e-commerce in particular, or productivity--have found that this tends to be the case with most information technology innovations.
Study Title | Publication Venue | Date | Key Conclusions |
The Elusive ROI of IT Investments | Strategic Management Journal | 1997 | Examined IT investments in large firms and found difficulty in directly measuring ROI (Return on Investment) due to factors like long-term strategic benefits and integration challenges. |
From Bricks to Clicks: Does IT Pay Off? | Information Systems Research | 2002 | Analyzed data from over 200 firms and found a delayed effect of e-commerce initiatives on profitability. Early adopters often faced challenges like website development costs and changing consumer behavior. |
The Productivity Paradox in Information Technology | The Review of Economic Studies | 2003 | Investigated the impact of IT on US productivity growth in the 1990s and found a "productivity paradox" where benefits weren't immediately apparent. The study suggests a "learning period" was needed for firms to leverage the internet effectively. |
A Longitudinal Analysis of Web Site Traffic and Sales | Marketing Science | 2004 | Analyzed website traffic and sales data for multiple firms and found a positive correlation, but it took time for website traffic to translate into significant sales growth. |
The Productivity Paradox in a Service Economy | Quarterly Journal of Economics | 1998 | Robert J. Gordon analyzed data from the US economy and found a productivity slowdown despite the rise of computers and the internet in the 1980s and 1990s. The study suggests a lag between technology adoption and measurable economic impact. |
Diffusing the Dot-Com Revolution: An Organizational Perspective | Academy of Management Journal | 2000 | Andrew S. Melville, Thomas Durand, and Nina G. Guyader explored how established firms adopted e-commerce in the late 1990s. They found challenges in integrating new technologies with existing processes, leading to slow initial returns. |
From Bricks to Clicks: Determinants of Success in Online Retailing | Journal of Retailing | 2002 | Kenneth C. Lichtenstein, James A. Lumpkin, and Elizabeth Van Wijnbergen analyzed early online retailers. They identified the need for significant investments in infrastructure and marketing before online channels became profitable. |
Why E-Business Fails | Harvard Business Review | 1999 | Dorothy Leonard-Barton argued that many early e-commerce ventures failed due to a lack of strategic planning and a focus on technology alone, neglecting organizational change and customer experience. |
The point is that, of course, it will take some time for CIOs to demonstrate meaningful outcomes from applied AI. That is always the case when an important new technology--to say nothing of a general-purpose technology--is introduced.
Whole business processes have to be redesigned, generally speaking, before the innovations can work their magic and produce measurable outcomes.
Thursday, April 25, 2024
AI Physical Interfaces Not as Important as Virtual
Microsoft’s dedicated AI key on some keyboards--which opens up access to Microsoft’s Copilot--now is joined by Logitech’s Signature AI mouse, with a button to open up the Logi AI Prompt Builder software and ChatGPT.
Both might be viewed as equivalents of shortcut keys that simplify access to a chosen AI engine or feature on a device, and both likely will be pitched as an easier way for some users to use a specific AI engine. In principle, more-evolved versions from these or other suppliers might offer access to a user's chosen engine or engines, or offer context-aware AI functions.
The issue there is the eternal balance between the values of curation (walled gardens that simplify or unify experience, providing greater security and consistency) and the values of openness (flexibility, power and choice, at the cost of complexity and security risk).
Some might view such interfaces as gimmicks of a sort, and they also represent walled garden approaches to use of generative AI.
Proponents might argue such buttons or keys provide value by making it easier for users to use one AI engine.
Instead of navigating menus or opening separate applications, a single click on the AI button brings up Logitech's AI Prompt Builder software, for example. Users can write prompts, customize the desired response tone and complexity, and receive results directly within the Prompt Builder window.
Using the AI button, users also might be able to quickly access ChatGPT's “summarize” function for documents or emails, as well.
AI keys or buttons might be useful for beginners or those unfamiliar with navigating menus, some might argue.
Skeptics might argue this is reminiscent of the early days of the multimedia web, when AOL offered a walled-garden internet experience that was helpful for new users but limiting for experienced ones. The idea then was to simplify the user experience, and perhaps AI buttons and keys will do that for some new AI users.
One might envision such keys or buttons launching built-in AI assistants enabling voice commands or dictation. In some cases functions might become context aware.
Ideally, users might be able to program the buttons or keys to perform their preferred AI action, such as providing image editing suggestions or content creation prompts.
So AI buttons and keys are an experiment in making AI features more accessible and user-friendly.
In that sense, they resemble voice interfaces for smart speakers, and speech-to-text functions, which aim to make interacting with technology more natural and efficient. But, as with earlier efforts to simplify access, users might quickly outgrow the interfaces, opting instead for the more-flexible and powerful use of menus or other open-ended interfaces.
Using the AOL analogy, users rather quickly outgrew the walled garden interface and opted instead to rely on open, general-purpose browsers.
And some innovations simply do not catch on. Smart speakers have failed to become a dominant interface, though voice-to-text functions on smartphones are routinely used.
It remains to be seen whether walled garden keys and buttons actually provide the intended value, and if so, for how long before users become more AI-proficient and outgrow the interfaces.
The graphical user interface (GUI) and touchscreens, on the other hand, provide classic examples of successful interfaces. And there always are exceptions: Apple's iOS is a walled garden that works, while Windows Phone is an example of a failed walled garden.
In other words, there always is a tension between curation of experience and choice, customization, broad access to features and an open approach.
Some of us might not be so convinced AI keys or buttons will be enduring interfaces.
Meta Warns Significant AI Profits are "Several Years Away"
Meta CEO Mark Zuckerberg sets a tone of realism about investments in artificial intelligence, suggesting meaningful AI revenue is still a few years away. “Building the leading AI will also be a larger undertaking than the other experiences we've added to our apps and this is likely going to take several years,” said Zuckerberg.
Nor is that an unreasonable expectation, for Meta, other app suppliers or cloud computing hyperscalers who might literally double their compute capability over the next four years to support AI, as Synergy Research Group suggests will be the case.
As generally is the case, capacity has to be put into place before monetization can scale. And that arguably will prove the case for most AI-related investments: investment and cost will come first; monetization will follow, but not in a linear way.
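The "capacity first, monetization later, and not in a linear way" pattern is often described with a logistic (S-curve) adoption model. Here is a minimal sketch of that idea; the function name and all figures are hypothetical illustrations, not values from Meta or Synergy:

```python
import math

def logistic_revenue(t, ceiling=100.0, midpoint=4.0, steepness=1.2):
    """Hypothetical S-curve: annual monetization (arbitrary units)
    t years after capacity investment begins.
    ceiling: eventual peak revenue; midpoint: year of fastest growth."""
    return ceiling / (1.0 + math.exp(-steepness * (t - midpoint)))

# Early years: capex is already sunk, but revenue is still
# far below its eventual ceiling; growth accelerates later.
for year in range(0, 9):
    print(year, round(logistic_revenue(year), 1))
```

The point of the sketch is simply that revenue lags investment: at year zero output is near nothing, the steepest gains arrive around the midpoint year, and returns flatten near the ceiling.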
"Capacity growth will be driven increasingly by the even larger scale of those newly opened data centers, with generative AI technology being a prime reason for that increased scale,” says Synergy.
Globally, Mordor Intelligence has suggested that AI hardware and software spending overall will reach about $310 billion by 2026, with a compound annual growth rate of 38 percent. Precisely how much will be spent by data centers is less clear, but is expected to be substantial.
Year | Processing CapEx (USD Billion) | Storage CapEx (USD Billion) | Source | Discussion |
---|---|---|---|---|
2021 | 50-70 | 20-30 | Synergy Research Group (2022) | Estimates based on overall data center CapEx growth and industry trends related to AI adoption. |
2022 | 55-75 | 25-35 | Gartner (2023) | Estimates based on data center equipment sales figures and analyst projections for AI hardware growth. |
2023 | 60-80 | 30-40 | IDC (2023) | Forecasts based on hyperscale data center spending surveys and analysis of enterprise AI deployments. |
2024 | 65-85 | 35-45 | Mordor Intelligence (2022) | Projections based on AI hardware market growth and anticipated increase in data center infrastructure spending. |
2025 | 70-90 | 40-50 | Cowen Research (2023) | Analyst estimates based on industry surveys and projections for continued growth in AI workloads and data volumes. |
Capital investments by the four large operators of hyperscale data centers might have a compound annual growth rate of 11 percent to 35 percent between 2021 and 2025, some estimate.
Year | Estimated Hyperscale Data Center CapEx (Processing & Storage) | Source | Discussion |
2021 | $80 Billion - $100 Billion | Synergy Research Group (2022) | This is an estimate for total CapEx on processing and storage in hyperscale data centers, not specifically for AI. |
2022 | $85 Billion - $105 Billion | Synergy Research Group (2023) | Similar to 2021, this represents total CapEx, but a portion will likely be directed towards AI needs. |
2023 | $90 Billion - $115 Billion | Gartner (2023) | Gartner predicts a 6.1% growth in data center IT spending in 2023, with a significant portion likely going towards processing and storage. |
2024 | $95 Billion - $125 Billion | IDC (2023) | IDC forecasts worldwide data center spending to reach $352 billion in 2024, with hyperscale CapEx on processing and storage being a major driver. |
2025 | $100 Billion - $135 Billion | Mordor Intelligence (2022) | Mordor Intelligence predicts a CAGR of 13.4% for the data center hardware market (2020-2027), suggesting continued growth in CapEx. |
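Compound-annual-growth-rate claims like those above rest on simple arithmetic, and a short sketch shows how a CAGR maps between base-year and end-year figures. The sample numbers below are illustrative only, not drawn from any of the cited estimates:

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by growing from
    `start` to `end` over `years` years."""
    return (end / start) ** (1.0 / years) - 1.0

def project(start, rate, years):
    """End-year value implied by a starting value and a CAGR."""
    return start * (1.0 + rate) ** years

# Illustrative: $80B growing at an 11% CAGR for four years.
print(round(project(80, 0.11, 4), 1))   # ≈ 121.4
# Implied CAGR if spending went from $80B to $135B over four years.
print(round(cagr(80, 135, 4), 3))       # ≈ 0.14
```

One design note: CAGR is a geometric mean, so it smooths over year-to-year variation; two capex paths with very different annual spikes can imply the same CAGR.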
Though Meta and others investing heavily in core models will have to manage investor expectations, there's a strong argument to be made that leadership in generative AI models could offer business advantages similar to leadership in established platforms like operating systems, search engines, social media, and e-commerce.
Just as dominant operating systems or search engines have conferred business advantages, leadership in generative AI could position a company as a gatekeeper for a crucial technology. Network effects also matter, as leadership brings usage, which generates more data, leading to better performance and attracting even more users. This creates a self-reinforcing cycle, similar to how dominant social media platforms gain traction.
Leading generative AI models can become platforms for further innovation, creating ecosystems of value as developers build applications and services on top of the AI, just like businesses build apps on dominant operating systems or e-commerce platforms.
Wednesday, April 24, 2024
Whatever the Eventual Impact, Telecom Execs Say They are Investing in AI
With the caveat that early reported interest, tests, trials and investments in a new technology such as artificial intelligence--especially one deemed important--will overstate the degree of actual deployment, telecommunications professionals say they are investing in machine learning, deep learning, generative AI, high-performance computing and digital twins or metaverse at rates that might surprise some observers, according to a survey sponsored by Nvidia.
Of course, technology investment is pursued in order to obtain some business advantage. According to surveyed professionals, those desired outcomes include better customer experience, productivity enhancements, network operations cost savings and revenue growth.
Many respondents report non-zero changes (as would be expected). In terms of inputs, the report says “43 percent of respondents reported an investment of over $1 million in AI.” About seven percent of respondents claimed AI investments in excess of $50 million.
None of that should come as a surprise, given the attention executives now place on reassuring stakeholders that they are "doing something" about AI. In 2023, AI was mentioned in 394 earnings calls, representing nearly 80 percent of all Fortune 500 companies, according to Stanford University's Human-Centered AI Institute.