Monday, September 22, 2025

"All of the Above" Helps AI Data Centers Reduce Energy Demand

Just as electrical utilities use rate differentials to shift consumer demand to off-peak hours, data centers supporting artificial intelligence workloads can use workload shaping to move non-urgent jobs to off-peak hours.


"Demand-side" management options are just as important as “supply side” increases in energy infrastructure. These demand management measures include:

  • optimizing hardware

  • improving software and workload management

  • enhancing physical infrastructure.


By identifying which AI tasks are time-sensitive (such as real-time inference for a search engine) versus which are not (training a new large language model), data centers can dynamically shift computational loads. 


Non-critical tasks can be paused or slowed down during peak grid demand hours and resumed when electricity is cheaper, cleaner, or more abundant. 


In a related way, workloads can be scheduled to run when the local grid's energy mix is dominated by renewable sources like solar or wind, easing demand on the grid during its most constrained and carbon-intensive hours.
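To make the idea concrete, here is a minimal scheduling sketch in Python. The price and carbon-intensity functions are hypothetical stand-ins for a utility tariff feed or a grid carbon-intensity service; the point is simply that deferrable jobs wait for favorable conditions while latency-sensitive jobs run immediately.

    from dataclasses import dataclass

    @dataclass
    class Job:
        name: str
        deferrable: bool  # True for batch/training work, False for real-time inference

    # Hypothetical grid signals; a real deployment would pull these from a
    # utility tariff API or a carbon-intensity service.
    def grid_price_per_kwh(hour: int) -> float:
        return 0.32 if 17 <= hour <= 21 else 0.11   # evening peak vs. off-peak

    def carbon_intensity(hour: int) -> float:
        return 250 if 10 <= hour <= 16 else 480     # gCO2/kWh, assuming midday solar

    def should_run(job: Job, hour: int,
                   max_price: float = 0.15, max_carbon: float = 300) -> bool:
        """Run latency-sensitive work immediately; defer batch work until
        electricity is cheap or the grid mix is relatively clean."""
        if not job.deferrable:
            return True
        return grid_price_per_kwh(hour) <= max_price or carbon_intensity(hour) <= max_carbon

    jobs = [Job("search-inference", deferrable=False), Job("llm-training", deferrable=True)]
    for hour in (13, 19):
        for job in jobs:
            print(hour, job.name, "run" if should_run(job, hour) else "defer")

In practice the same decision logic sits inside the cluster scheduler rather than a standalone script, but the principle is the same: only the jobs that can wait are asked to wait.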


The AI models themselves can be designed to be more efficient.


Techniques such as quantization and pruning reduce a model's size and the number of calculations required without significantly compromising accuracy. For example, by converting a model's parameters from 32-bit to 8-bit, its energy needs can be drastically reduced.
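As a rough illustration (assuming PyTorch is available), dynamic quantization converts the weights of selected layers from 32-bit floats to 8-bit integers, roughly a four-fold reduction in weight memory, while the model keeps the same interface:

    import torch
    import torch.nn as nn

    # Toy model standing in for a much larger network.
    model = nn.Sequential(nn.Linear(4096, 4096), nn.ReLU(), nn.Linear(4096, 1000))

    # Dynamic quantization: Linear-layer weights are stored as 8-bit integers
    # instead of 32-bit floats; activations are quantized on the fly at inference.
    quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

    x = torch.randn(1, 4096)
    with torch.no_grad():
        print(quantized(x).shape)  # same output shape, smaller and cheaper weights

Whether the smaller weights translate into proportionate energy savings depends on the serving hardware, but reduced memory traffic and cheaper integer arithmetic are the mechanisms being relied on.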


For certain tasks, an AI model can be designed to "exit" early and provide an answer if it reaches a high degree of confidence, avoiding unnecessary processing.
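A small sketch of that early-exit pattern, using a hypothetical two-stage classifier: a cheap first stage answers the confident cases, and only uncertain inputs reach the expensive model.

    # Hypothetical two-stage classifier illustrating early exit.
    def cheap_head(x: float):
        """Fast, low-cost stage; returns (label, confidence)."""
        confidence = 0.99 if abs(x) > 1.0 else 0.60  # easy inputs sit far from the boundary
        return ("positive" if x > 0 else "negative", confidence)

    def expensive_model(x: float) -> str:
        """Stand-in for the full network; far more compute per input."""
        return "positive" if x > 0 else "negative"

    def classify(x: float, threshold: float = 0.95):
        label, confidence = cheap_head(x)
        if confidence >= threshold:
            return label, "early-exit"       # skip the expensive stage entirely
        return expensive_model(x), "full-model"

    for x in (2.3, 0.1, -1.7):
        print(x, classify(x))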


Several recent studies and industry reports document the potential of these demand-side measures:

  • Duke University, Nicholas Institute (2025): Found U.S. power grids could add nearly 100 GW of new flexible load if data centers curtailed their usage an average of 0.5% of the time annually, with an average curtailment time of about two hours. This demonstrates a significant, untapped potential for integrating large loads without costly grid upgrades.

  • Rocky Mountain Institute (RMI) (2025): States that if new data centers in the U.S. were to meet an annual load curtailment rate of 0.5%, it could make nearly 100 GW of new load available. The study emphasizes that temporal flexibility (demand response) offers benefits for both data centers (lower energy bills) and utilities (avoiding costly infrastructure upgrades).

  • Google Cloud Blog (2023): Describes a pilot program where Google used its "carbon-intelligent computing platform" for demand response. By shifting non-urgent tasks, Google reduced power consumption at its data centers during peak hours, supporting grid reliability in regions like Taiwan and Europe.

  • Emerald AI (Boston University) (2025): A technology field test in Phoenix, Arizona, showed that Emerald AI's software could reduce a data center's power usage by 25% during a period of peak electricity demand while maintaining service level agreements. The study highlights the potential of AI-driven strategies to dynamically adjust power usage and transform data centers into "virtual power plants."

  • 174 Power Global (2025): Discusses how smart grid integration allows data centers to participate in demand response programs. It notes that facilities can shift computational workloads based on energy availability and cost, for example by increasing processing for non-time-sensitive tasks during periods of high renewable energy generation.


As racks become denser with high-performance GPUs, liquid cooling systems can be up to 30 percent more energy-efficient than air cooling.


Separating the hot air exhausted by servers from the cold air intake also helps. By using "hot aisle/cold aisle" layouts with containment panels or curtains, data centers can prevent the air from mixing, allowing cooling systems to run less frequently.


“Free Cooling” in colder climates takes advantage of favorable outdoor temperatures. A data center can use outside air or water to cool the facility, bypassing mechanical chillers and significantly reducing energy consumption. 
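The control logic behind an air-side economizer is simple in outline. The setpoints below are illustrative only; real facilities also track humidity, particulates, and partial (mixed) operation.

    def cooling_mode(outdoor_temp_c: float, supply_setpoint_c: float = 24.0,
                     margin_c: float = 3.0) -> str:
        """Pick a cooling mode from outdoor temperature (illustrative setpoints)."""
        if outdoor_temp_c <= supply_setpoint_c - margin_c:
            return "free-cooling"        # outside air alone can hold the setpoint
        if outdoor_temp_c <= supply_setpoint_c:
            return "partial-economizer"  # blend outside air with mechanical cooling
        return "mechanical-chillers"

    for t in (8.0, 22.5, 33.0):
        print(t, cooling_mode(t))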


Optimized Uninterruptible Power Supply (UPS) systems can reduce electrical losses by bypassing certain components when utility power is stable.


Data centers additionally can generate their own power using on-site solar, fuel cells, or battery storage. This keeps the data centers “off the grid” during peak demand on electrical utility networks. 


Server power capping limits the maximum power a server can draw, preventing over-provisioning of power and cooling capacity.
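Power capping is usually enforced in firmware or through the baseboard management controller, but the policy itself is easy to sketch. Here set_power_cap() is a hypothetical stand-in for whatever vendor management API a facility actually exposes:

    # Illustrative rack-level power-capping policy; set_power_cap() is a
    # hypothetical stand-in for a vendor BMC or management API call.
    RACK_BUDGET_W = 12_000

    def set_power_cap(server_id: str, cap_watts: int) -> None:
        print(f"{server_id}: cap set to {cap_watts} W")

    def apply_caps(servers: dict[str, int]) -> None:
        """Split the rack budget across servers in proportion to nameplate draw,
        so the rack never has to be provisioned for worst-case simultaneous peaks."""
        nameplate_total = sum(servers.values())
        for server_id, nameplate_w in servers.items():
            cap = int(RACK_BUDGET_W * nameplate_w / nameplate_total)
            set_power_cap(server_id, min(cap, nameplate_w))

    apply_caps({"gpu-node-01": 6_500, "gpu-node-02": 6_500, "cpu-node-01": 1_200})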


The point is that data centers always have multiple ways to optimize their power usage and reduce demand on electrical utility networks.


Sunday, September 21, 2025

When is AI Not a Threat to Critical Thinking?

All of us frequently see or hear warnings that using artificial intelligence might reduce a person’s critical thinking skills (creativity, problem-solving, and independent thinking). We might simply point out that, in any case, such skills are unevenly distributed in any human population.


That noted, there is some research indicating that AI tools are creating new behavioral patterns through cognitive offloading mechanisms, rather than simply amplifying pre-existing critical thinking tendencies.


In other words, using AI can reduce critical thinking skills as people offload such activities to the AI apps. But much of that offloading might be situational. For example, some studies suggest that heavy use of AI leads to diminished critical thinking skills.


But light to moderate use might not have such effects. Individuals who already engage in balanced, thoughtful technology use may indeed maintain their critical thinking skills, while those prone to over-reliance experience more significant impacts.


So the emerging picture might be nuanced. AI appears to both reflect existing individual differences (some people are more prone to over-reliance) and actively create new cognitive patterns through offloading mechanisms.


As always, the impact of any tool or technology depends on the circumstances of its use. It is how individual humans use the tools that determines outcomes.


Ethical and moral frameworks might be embodied in law, for example, but how tools get used also hinges on the actual behavior of the humans who use the tools and technology. 


And, obviously, any technology will have costs or externalities and unintended consequences.


  • AI (General). Good use: diagnosing diseases[4]. Bad use: creating deepfakes[6]. Unintended consequences: privacy loss, bias propagation[6][2].

  • Smartphones. Good use: emergency alerts[3]. Bad use: cyberbullying[3]. Unintended consequences: addiction, attention erosion[3][8].

  • Social Media. Good use: connecting support groups[3]. Bad use: spreading hate speech[3]. Unintended consequences: misinformation, polarization[3].

  • Surveillance Cameras. Good use: enhancing public safety[7]. Bad use: invasive monitoring[7]. Unintended consequences: trust erosion, discrimination[7][2].

  • Water/Carbon-Intense Manufacturing. Good use: renewable energy storage[11][5]. Bad use: high-pollution processes[5]. Unintended consequences: resource depletion, water stress[1][5].

  • Genetic Engineering. Good use: disease resistance in crops[4]. Bad use: bioweapon development[4]. Unintended consequences: ecological imbalance[4][8].

  • Autonomous Vehicles. Good use: traffic safety[3]. Bad use: algorithmic favoritism[3]. Unintended consequences: job displacement, new liability models[3][2].

  • Online Learning. Good use: global education access[4]. Bad use: cheating facilitation[4]. Unintended consequences: educational gaps, critical thinking erosion[2].


It might also be fair to point out that most studies of the impact of AI on critical thinking skills are recent (2024-2025), with limited longitudinal data. So we really do not know what the long-term effects might be. 


Also, there is the issue of how we define "critical thinking." Most definitions revolve around the ability to reflect on the use of sources. Critical thinking is an intellectually-disciplined process of actively and skillfully analyzing, evaluating, and synthesizing information. 


So the issue includes an AI engine user's inability to analyze information sources the user has no access to, as the AI engines have done the assembling. That does not mean the user cannot think about the logic and reasonableness of conclusions or statements, or apply context and existing knowledge to them.


But the ability to critique and choose information sources picked by the AI engine is more difficult, unless the queries include some requirement for annotation and reference to the websites from which conclusions were developed. 


If critical thinking is the ability to make informed, reasoned judgments rather than simply accepting information at face value, lack of access to sources does matter. 


There are other critical thinking elements that lack of access to original sources might compromise:

  • Analysis and Interpretation: the ability to break down complex information into smaller parts, identify underlying assumptions, and understand the relationships between different ideas. It includes discerning the main idea, recognizing patterns, and clarifying the meaning of data or arguments. 

  • Evaluation and Inference: assessing the quality of information, arguments, and evidence. Critical thinkers evaluate the credibility of sources, recognize biases, and determine whether conclusions are logically supported. Inference is the skill of drawing reasoned conclusions from the available evidence.

  • Problem-Solving: the ability to define a problem clearly, gather relevant information, brainstorm potential solutions, evaluate the pros and cons of each, and implement the best course of action.

  • Self-Regulation and Open-Mindedness: the ability to monitor and correct your own thinking. It means being aware of your own biases, being willing to challenge your beliefs, and being receptive to alternative viewpoints. 


Of all these elements, it is the inability to consistently and transparently assess the credibility and biases of sources that is the most obvious limitation. Least affected is the ability to retain an open mind.


And critical thinking is not a universally necessary skill for all activities. Critical thinking is essential for high-stakes, complex, and novel tasks. These are situations where a person cannot rely on habit, routine, or a simple set of instructions.


Complex problems without a clear solution also are important instances where critical thinking skills like analysis and evaluation are crucial. 


Critical thinking likewise is vital for making significant decisions where the outcome has serious consequences. 


Critical thinking also often drives innovation and creativity.


Still, the biggest danger likely remains the inability to evaluate the sources used by an AI engine.


On the other hand, there are many instances where critical thinking really is not important. 


For many routine, low-stakes, or repetitive tasks, a loss of critical thinking ability may have minimal or no negative impact. 


Also, structured learning, especially at a basic level, might hinge more on memorization and recall than on analysis.


In these low-stakes or routine scenarios, offloading cognitive tasks to AI or relying on established patterns does not pose a significant risk, we might argue.


Thursday, September 18, 2025

AI: Correlation is Not Causation

Is productivity higher for people and firms that use artificial intelligence software? And, if so, did the AI "cause" the changes?


Anthropic's Economic Index takes a look at where Claude is being used, and for what purposes, by consumers and businesses across the world. The implication is that AI use has some positive impact. But we might not be able to make that claim yet.


Nor will we conclusively be able to claim that the AI produced the observed outcomes.


For the moment, we might only be able to observe increased usage, and be watching for outcomes to change.


Education and science usage shares are on the rise, while the use of Claude for coding continues to dominate the sample at 36 percent of total instances. Claude use for educational tasks increased from 9.3 percent to 12.4 percent, while use for scientific tasks rose from 6.3 percent to 7.2 percent.


Anthropic also notes a shift towards autonomy. “Directive” conversations, where users delegate complete tasks to Claude, grew from 27 percent to 39 percent. The study also notes increased use in coding (+4.5 percentage points) and a reduction in debugging (-2.9 percentage points). 


But we might also note the difference between correlation and causation, as there will be a tendency for value chain suppliers to argue that AI usage “produces” or “causes” observed performance gains (revenue, income, profit margin, productivity). 


In fact, quite the opposite could be happening. High AI usage occurs among industries, countries, and individuals that are already wealthy, well educated, and working in settings where cognitive or intangible products are an important part of the output.


In other words, high AI adoption follows firm and industry success, rather than “causing” it. It’s similar to the “correlation versus causation” argument we might have about home broadband “causing” economic development. 


Some might note that high-quality home broadband tends to be deployed in areas of higher density, higher wealth, higher income and higher education. Quality home broadband (“fastest speeds”) does not cause the wealth, income or educational attainment.


Rather, such characteristics create the demand for such services. 


source: Anthropic 
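A toy simulation makes the selection-effect point concrete: if pre-existing firm productivity drives both AI adoption and revenue, adopters will outperform non-adopters even when, by construction, AI contributes nothing (illustrative numbers only):

    import random

    random.seed(0)

    # Each firm has an underlying productivity level that drives both its
    # revenue and its likelihood of adopting AI. AI itself adds nothing here.
    firms = []
    for _ in range(10_000):
        productivity = random.gauss(100, 15)
        adopts_ai = productivity > 110               # better-positioned firms adopt first
        revenue = productivity + random.gauss(0, 5)  # note: no causal AI term at all
        firms.append((adopts_ai, revenue))

    adopters = [r for a, r in firms if a]
    others = [r for a, r in firms if not a]
    print("mean revenue, AI adopters:  ", round(sum(adopters) / len(adopters), 1))
    print("mean revenue, non-adopters: ", round(sum(others) / len(others), 1))
    # Adopters look far more productive, yet AI caused none of the difference.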


Many studies have noted the tension between correlation and causation when evaluating the impact of new technologies. 


  • Acemoglu et al. (2023), “Advanced Technology Adoption: Selection or Causal Effects?” Firms adopting advanced technologies had higher productivity before adoption, suggesting selection effects rather than pure technological causation (a longitudinal firm-level analysis using Census data; pre-existing firm characteristics drive technology adoption).

  • Autor, Levy & Murnane (2003), “The Skill Content of Recent Technological Change.” Computer adoption correlated with pre-existing skill demands rather than creating new skill requirements.

  • Caselli & Coleman (2001), “Cross-Country Technology Diffusion: The Case of Computers.” Countries with more skilled labor adopted computers faster; computer adoption did not independently increase skill premiums.

  • Krueger (1993), “How Computers Have Changed the Wage Structure.” Workers using computers earn higher wages, but much of the premium reflects selection of skilled workers into computer-using jobs.

  • DiNardo & Pischke (1997), “The Returns to Computer Use Revisited: Have Pencils Changed the Wage Structure Too?” The computer wage premium largely reflects unobserved worker heterogeneity, as a similar premium exists for pencil use.

  • Beaudry, Doms & Lewis (2010), “Should the Personal Computer Be Considered a Technological Revolution?” Computer adoption followed rather than preceded productivity gains in most industries.

  • Forman, Goldfarb & Greenstein (2012), “The Internet and Local Wages.” Internet adoption increased wages more in cities with complementary skilled workforces and business services.

  • Akerman, Gaarder & Mogstad (2015), “The Skill Complementarity of Broadband Internet.” Broadband access increased demand for skilled workers, but only in firms and regions with existing high skill levels.

  • Bloom, Sadun & Van Reenen (2012), “Americans Do IT Better: US Multinationals and the Productivity Miracle.” Management practices explain technology adoption and productivity gains; technology alone is insufficient.

  • Cariolle (2021), “International Connectivity and the Digital Divide.” Submarine cable connections improve economic outcomes primarily in countries with existing institutional capacity.

  • Hjort & Poulsen (2019), “The Arrival of Fast Internet and Employment in Africa.” Fast internet increased employment in skilled jobs but decreased it in unskilled jobs.

  • Jensen (2007), “The Digital Provide: Information Technology, Market Performance, and Welfare.” Mobile phone adoption by fishermen improved market efficiency, but required existing market infrastructure.

  • Aker (2010), “Information from Markets Near and Far.” Mobile phone coverage reduced price dispersion only in markets with existing trading relationships.

  • Duflo & Saez (2003), “The Role of Information and Social Interactions in Retirement Plan Decisions.” Retirement plan participation increased after information sessions, but mainly among already financially sophisticated employees.

  • Kling & Liebman (2004), “Experimental Analysis of Neighborhood Effects on Youth.” Moving to better neighborhoods improved outcomes, but families that moved had different characteristics than non-movers.

  • Malamud & Pop-Eleches (2011), “Home Computer Use and the Development of Human Capital.” Home computers had mixed effects on student achievement; benefits concentrated among students with higher initial ability.

  • Vigdor, Ladd & Martinez (2014), “Scaling the Digital Divide: Home Computer Technology and Student Achievement.” Computer and internet access at home had negative effects on student achievement for disadvantaged students.


A few other studies speak directly to the direction of causality:

  • Bils and Klenow (2000), on the causal impact of education on economic growth: the correlation between education and growth may be due to reverse causality, since richer, faster-growing states find it easier to increase education spending. Direction of causality: primarily from economic growth to education, with a feedback loop.

  • Comin et al. (2012), on how technology adoption affects global economies: the rate at which nations adopted new technologies centuries ago strongly affects whether they are rich or poor today, and technology adoption lags account for a significant portion of income differences. Direction of causality: technology adoption has a long-term causal effect on economic prosperity.

  • Nazarov (2019), on the causal relationship between internet use and economic development in Central Asia: a unidirectional causality exists from GDP per capita to Internet use, suggesting that economic growth stimulates technology adoption. Direction of causality: from GDP per capita to technology use.


"All of the Above" Helps AI Data Centers Reduce Energy Demand

Just as electrical utilities use rate differentials to shift consumer workloads to off-peak hours, so data centers supporting artificial int...