Saturday, May 31, 2025

AI as Industrial Policy has Precedents

A partnership between the U.S. federal government and firms developing artificial intelligence does appear to be forming. One example is the executive branch championing the $500 billion "Stargate" infrastructure initiative led by OpenAI, Oracle, Japan's SoftBank, and the UAE's MGX.


More recently, there have been big deals to bring cutting-edge chips and data centers to Saudi Arabia and the UAE.


| Country | Deal Size / Commitment | Key U.S. Firms Involved | Main Components | Timeline / Notable Details |
| --- | --- | --- | --- | --- |
| Saudi Arabia | $600 billion+ (total package) | Oracle, Google, Salesforce, AMD, Uber, DataVolt, AWS, Qualcomm | Oracle: $14B, 10-year commitment; DataVolt: $20B in U.S. data centers; $80B in joint tech/AI projects; HUMAIN: up to $10B AMD, $10B Nvidia AI hardware; $5B AWS "AI Zone"; $300B in immediate deals; $1 trillion corridor ambition by 2030 | Announced May 2025; multi-year rollout |
| UAE | $1.4 trillion (10-year UAE investment in U.S.); $200B in tech/AI package | G42, Microsoft, Nvidia, Oracle, Cisco, OpenAI, SoftBank, BlackRock, Global Infrastructure Partners | 500,000 Nvidia AI chips/year to UAE through 2027; 5GW AI data center campus in Abu Dhabi (G42); Stargate UAE: 1GW OpenAI-led facility; reciprocal investment in U.S. data centers; $100B Global AI Infrastructure Investment Partnership (GAIIP) | Announced May 2025; phased through 2030 |


President Trump also signed a series of executive orders to hasten the deployment of new nuclear power reactors, with the goal of quadrupling total U.S. nuclear capacity by 2050.


Energy Secretary Chris Wright told Congress that AI is "the next Manhattan Project," warning that losing to China is "not an option."


Such an effort would closely resemble several past government initiatives to spur economic growth by supporting targeted industries.


| Industry/Initiative | Era/Date | Government Role/Support | Outcomes/Impact |
| --- | --- | --- | --- |
| US Arsenals & Manufacturing | 1816–1861 | Collaborative R&D, standardization | Advanced machine tools, standardized parts, global manufacturing leadership |
| Erie Canal | 1825 | State-funded infrastructure | Opened Midwest markets, spurred trade and local development |
| Pacific Railway Act (Railroads) | 1862 | Land grants, federal loans | Built transcontinental railroad, linked coasts, boosted commerce |
| Federal-Aid Highway Act | 1956 | Direct federal funding, infrastructure | Created interstate highway system, expanded commerce, advanced construction tech |
| Shipbuilding (WWII, renewed focus) | 1940s, 2020s | Direct contracts, subsidies, tax incentives | Rapid industrial expansion, military and commercial shipbuilding |
| Cybersecurity (NIST Framework) | 2013–present | Voluntary standards, collaborative development | Improved critical infrastructure security, public-private trust |
| CHIPS and Science Act (Semiconductors) | 2022 | Large-scale subsidies, R&D funding | Domestic chip manufacturing, supply chain resilience, tech leadership |


All that might suggest something for technology investors. In addition to the many startups sure to emerge, some of which will have the opportunity to become quite important, the government will also lean on existing technology leaders.


That suggests the normal computing market mechanism, whereby a new era of computing creates new leaders, might be modified in the early AI era. Just as legacy firms Microsoft and Apple survived the transition from the PC era to the present, other firms such as Alphabet, Meta, and Amazon might survive the coming transition to the AI era as well.


That would be fairly unprecedented, but it would be assisted by the collaboration and federal government support needed to create a big new industry.


Friday, May 30, 2025

Sign of the Times: Mary Meeker Report Focuses on AI

It’s a sign of the times: Mary Meeker, who used to publish annual internet reports, is now out with a new report on artificial intelligence. As you might expect, her statistics show faster growth for ChatGPT than for the internet.


source: Bond


She also shows that ChatGPT searches are growing faster than Google search. 


source: Bond


And, as you might also expect, the cost of AI inference is dropping even faster than the cost of computer memory did.

source: Bond 


And ChatGPT is being adopted faster than virtually all other internet applications as well. 



As you might also suspect, ChatGPT might reach 50 percent household adoption in record time.


Will AI Boost Productivity Enough to Pay for Universal Basic Income?

One important issue national policymakers will have to confront is what has to be done if artificial intelligence does lead to widespread job losses because machines really are able to do work that today is done by humans.


At a high enough level of displacement, social dislocations are inevitable. And much hinges on the unknown of how much productivity can be lifted to pay for social programs offsetting the loss of jobs. But it seems fairly obvious that a huge increase in productivity would be required.


One study suggests AI would need to deliver productivity gains about five to six times greater than what we have tended to see from earlier technology innovations.


Study/Author(s): Krohn, J. (2025)
Key Contributions / AI Productivity Insights: Analyzes AI advancements, focusing on the reduction in model size and cost of running LLMs, and the increasing capabilities of AI agents. Notes that in short time-horizon settings, top AI systems can outperform human experts.
Degree of AI Productivity Required for Robust UBI: While not providing a specific "degree" of productivity, suggests that the increasing efficiency and decreasing cost of AI models will lead to significant corporate AI adoption and productivity gains, implying a future where AI's contribution to overall economic output is substantial enough to support broad social programs.
Considerations/Caveats for UBI Implementation: The Krohn (2025) "State of AI" report focuses more on current trends and capabilities than on directly calculating a precise productivity threshold for UBI. However, the reported exponential improvements in model efficiency, cost reduction, and agent performance strongly imply that AI is on a trajectory to generate the necessary wealth. The challenge then shifts from whether AI can generate enough to how that wealth is distributed.

Study/Author(s): Nayebi, A. (2025)
Key Contributions / AI Productivity Insights: Develops a closed-form condition under which AI capital profits can sustainably finance UBI without additional taxes or new job creation in a Solow-Zeira economy. Examines how the AI capability threshold (productivity relative to pre-AI automation) varies.
Degree of AI Productivity Required for Robust UBI: AI systems must achieve approximately 5-6 times existing automation productivity to finance an 11%-of-GDP UBI in a worst-case scenario where no new jobs are created. The threshold falls to about 3 times existing automation productivity if the public revenue share of AI capital is raised to 33% (from the current 15%).
Considerations/Caveats for UBI Implementation: Provides a concrete, model-based estimate. Highlights the importance of the public revenue share of AI profits and of market structure (monopolistic markets could make UBI easier to fund due to higher rents). Assumes a worst-case scenario of no new job creation.

Study/Author(s): Goldman Sachs Report (2023, cited in various 2024/2025 discussions)
Key Contributions / AI Productivity Insights: Projects that generative AI alone could automate 300 million full-time jobs globally while raising global GDP by an estimated 7%.
Degree of AI Productivity Required for Robust UBI: Implicitly, the productivity gains from automating 300 million jobs and a 7% rise in global GDP are the wellspring from which UBI could be funded; the sheer scale of the potential productivity increase is the key factor.
Considerations/Caveats for UBI Implementation: Focuses on AI's potential to create immense wealth, but the challenge remains its distribution. The report does not state the "degree of productivity" needed to fund UBI, only the overall economic impact.

Study/Author(s): Brynjolfsson, Li, and Raymond (2023, cited by Brookings)
Key Contributions / AI Productivity Insights: Showed that call center operators became 14% more productive when using generative AI, with gains over 30% for the least experienced workers. Also noted improved customer sentiment and lower employee attrition.
Degree of AI Productivity Required for Robust UBI: Reports specific productivity gains for certain tasks and occupations. While not an economy-wide figure, these micro-level gains contribute to the larger picture of AI's productivity potential.
Considerations/Caveats for UBI Implementation: These are early, task-specific examples. Scaling such individual gains to a robust, economy-wide UBI requires aggregation and consideration of broader economic impacts.

Study/Author(s): Korinek (2023, cited by Brookings)
Key Contributions / AI Productivity Insights: Estimates that economists can be 10-20% more productive using large language models.
Degree of AI Productivity Required for Robust UBI: Like Brynjolfsson et al., provides a specific example of AI-driven productivity enhancement in a knowledge-based profession.
Considerations/Caveats for UBI Implementation: As with other micro-level studies, these gains must be weighed against broader economic shifts and potential job displacement.

Study/Author(s): Santens, S. (2025, various articles/discussions)
Key Contributions / AI Productivity Insights: Argues that AI-driven automation is fundamentally different from previous shifts, automating cognitive labor, decision-making, and creativity, and producing productivity without widespread prosperity (i.e., increased inequality).
Degree of AI Productivity Required for Robust UBI: Argues that AI-driven productivity has already decoupled from wage growth, producing massive inequality; the "degree of productivity" is already sufficient to address poverty, but the distribution mechanism is flawed.
Considerations/Caveats for UBI Implementation: Highlights that the issue may not be a lack of productivity from AI but how the economic benefits are concentrated. Advocates UBI as a tool to redistribute that wealth.

Study/Author(s): Bertello, G. P., & Almeida, T. (2025)
Key Contributions / AI Productivity Insights: Analyze the history of UBI and argue for its necessity as a new social contract in the age of AI and automation, to address wage inequality, job insecurity, and widespread job losses.
Degree of AI Productivity Required for Robust UBI: Does not quantify a specific productivity degree, but implicitly argues that AI's disruptive potential (job displacement plus enhanced productivity) necessitates UBI.
Considerations/Caveats for UBI Implementation: Emphasizes the need for sustainable funding, investment in education, and attention to social and psychological aspects, not just economic and labor-market outcomes.
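Nayebi's two data points suggest the required productivity multiple scales roughly inversely with the public revenue share captured from AI capital. The back-of-envelope below uses that inverse-scaling assumption as a simplification; it is not the paper's actual closed-form condition.

```python
# Back-of-envelope sketch (a simplification, not Nayebi's closed form):
# assume the productivity multiple needed to fund a fixed UBI scales
# inversely with the public revenue share of AI capital.

def required_multiple(base_multiple, base_share, new_share):
    """Scale the required AI productivity multiple inversely with
    the public revenue share captured from AI capital."""
    return base_multiple * (base_share / new_share)

# ~6x is needed at the current 15% revenue share (worst case, no new jobs);
# raising the share to 33% cuts the requirement roughly in half.
print(round(required_multiple(6.0, 0.15, 0.33), 1))  # about 2.7, near the ~3x cited
```

The rough agreement with the paper's stated ~3x figure suggests the two numbers are internally consistent.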


The core challenge is to ensure that the wealth generated by AI is sufficient to cover the UBI, without stifling innovation or leading to undesirable economic side effects such as inflation or capital flight. This implies that the rate of productivity growth must outpace the cost of UBI relative to the existing economy.


Even with significant productivity gains, the sheer cost of a meaningful UBI for an entire population is immense. For example, providing $12,000 per adult annually in the US would cost trillions of dollars.
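The arithmetic behind that "trillions" figure is simple. A minimal sketch, assuming roughly 258 million U.S. adults and a GDP of about $28 trillion (both approximations, not figures from the studies above):

```python
# Back-of-envelope cost of a $12,000-per-adult annual UBI in the U.S.
# Population and GDP figures are rough approximations.
adults = 258_000_000
annual_benefit = 12_000

total_cost = adults * annual_benefit        # dollars per year
print(total_cost / 1e12)                    # ~3.1 trillion dollars

gdp = 28_000_000_000_000                    # ~$28 trillion U.S. GDP
print(100 * total_cost / gdp)               # ~11 percent of GDP
```

Note that this lands near the 11%-of-GDP UBI assumed in the Nayebi (2025) model discussed above.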


Some observers will recall past expectations about automation or the cost of inputs ("energy too cheap to meter," in the case of nuclear energy). Such optimism has been a recurring theme for forecasters and futurists.


Era/Technology: 1960s-1970s: Early Mainframes & Minicomputers
Key Expectations & Predictions (at the time): Automation of administrative tasks, complex calculations, and inventory management, leading to significant efficiency gains. Early visions of "paperless offices."
"Too Cheap to Meter" Parallel for Computing/IT Goods: The idea that computation itself would become so affordable it would be integrated into every aspect of business and decision-making, with the cost of individual calculations becoming negligible.
Underlying Advances/Reasons for Optimism: Moore's Law (1965): Gordon Moore's observation that the number of transistors on an integrated circuit doubles approximately every two years, yielding exponential increases in computing power at decreasing costs. This principle fueled optimism for decades.
Realization & Caveats: While costs decreased, the scale of the problems being tackled also grew. Early computers were expensive and complex, requiring specialized personnel and significant infrastructure, which limited the "too cheap" aspect for many goods. The focus was on efficiency rather than outright cost elimination for end products.

Era/Technology: 1980s-1990s: Personal Computers & Early Internet
Key Expectations & Predictions (at the time): Democratization of computing, empowering individuals and small businesses. Increased office productivity through word processing, spreadsheets, and desktop publishing. The rise of email and digital communication reduced communication costs.
"Too Cheap to Meter" Parallel for Computing/IT Goods: The cost of information creation, storage, and transmission would become effectively free, allowing near-instantaneous, borderless exchange of data and ideas. The "Information Superhighway" promised a connected world.
Underlying Advances/Reasons for Optimism: Further advances in microprocessors, memory, and storage made components smaller, more powerful, and cheaper, enabling widespread PC adoption. Development of the Internet and World Wide Web made global digital communication possible and increasingly affordable.
Realization & Caveats: While costs plummeted for personal computing and basic internet access, the "free" framing often overlooked the underlying infrastructure, software development, and human capital required to leverage these tools effectively. The "productivity paradox" emerged in the 1990s, when significant IT investment did not always translate into immediate, measurable economy-wide productivity gains.

Era/Technology: 2000s-2010s: Broadband, Cloud Computing, Mobile Devices
Key Expectations & Predictions (at the time): Ubiquitous connectivity, on-demand computing resources, and access to vast amounts of information from anywhere. Transformation of industries through e-commerce, digital media, and mobile applications.
"Too Cheap to Meter" Parallel for Computing/IT Goods: The expectation that access to immense computing power and data storage through the cloud would be a utility-like service, with marginal costs approaching zero. Data itself would become "too cheap to meter" in terms of accessibility and use.
Underlying Advances/Reasons for Optimism: Cloud computing shifted firms from owning infrastructure to renting it on demand, drastically reducing upfront costs and improving scalability. Smartphones and tablets made computing and information access portable and pervasive. Big data made it possible to collect, store, and analyze massive datasets, yielding new insights and efficiencies.
Realization & Caveats: While cloud services significantly reduced the variable cost of computing for many, the total cost of complex IT systems, cybersecurity, and data management remained substantial. Growing data volumes also presented new challenges and costs. The shift was more toward "cost-effective on demand" than truly "too cheap to meter."

Era/Technology: 2020s (Current & Near Future): Generative AI
Key Expectations & Predictions (at the time): Automation of cognitive tasks, content generation, and sophisticated problem-solving. Significant productivity boosts across knowledge work, design, and even scientific research. Potential for AI to optimize production processes in unprecedented ways.
"Too Cheap to Meter" Parallel for Computing/IT Goods: OpenAI CEO Sam Altman has said AI is heading toward being "too cheap to meter," particularly the cost of "intelligence" (i.e., AI inferences or "tokens"). Google CEO Sundar Pichai has echoed the sentiment, describing "intelligence, just like air, too cheap to meter," given the rapid fall in the cost of generating AI outputs.
Underlying Advances/Reasons for Optimism: Breakthroughs in AI architecture and training data have enabled highly capable large language models. As models become more efficient and specialized hardware develops, the computational cost per AI operation is falling rapidly.
Realization & Caveats: While the marginal cost of a single AI query or generated output is indeed plummeting, the initial development and training of these massive models are incredibly expensive. Ethical, societal, and regulatory costs of widespread AI adoption are also emerging. The focus now is on how this "cheap intelligence" can be integrated to reduce the cost of final goods and services.
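The Moore's Law observation cited in the table implies a simple compounding rule: a doubling every two years grows transistor counts by a factor of 2^(years/2). A minimal illustration:

```python
# Moore's Law as cited in the table: transistor counts double roughly
# every two years, compounding to 2 ** (years / doubling_period).

def moores_law_factor(years, doubling_period=2.0):
    """Multiplicative growth in transistor count over `years`."""
    return 2 ** (years / doubling_period)

print(moores_law_factor(20))  # 1024x over two decades
```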


In all likelihood, AI will not reduce the net number of jobs as some fear, but it also probably will not improve productivity enough to easily support new UBI policies. It might take some combination of productivity growth, elimination of existing welfare programs, new taxes, or other government funding to compensate for extensive AI-driven changes in an economy’s need for labor.


How to Avoid or Reduce the Danger of Model Collapse

Recursive training on synthetic data, often referred to as "model collapse," is a significant challenge for artificial intelligenc...