Many discussions of large artificial intelligence data center investments focus on the expected benefits of job creation. But there are also costs, which might include lost tax revenue for states that offer tax incentives to data center operators; higher consumer power bills; higher water consumption; stranded utility assets; and even far fewer new jobs than anticipated.
In that regard, large AI data centers remind me of large sports stadiums: in both cases, the argument is that government subsidies will promote economic growth. In practice, such investments mostly shift consumer spending from one category to another, for a possible net gain of zero.
That is not to deny the need for large AI data centers, simply to point out that the local economic benefits might not be as large as often touted.
Then there are other issues, such as higher consumer electricity bills and the impact on water usage.
McKinsey estimates suggest that by 2030, data centers globally will require $6.7 trillion in investment for compute operations, of which $5.2 trillion in capital expenditures will support artificial intelligence operations.
If correct, that represents nearly $7 trillion in capital outlays by 2030, funding a roughly 3.5-fold increase in data center capacity between 2025 and 2030 alone.
Data center power needs in the United States alone are expected to add about 460 terawatt-hours of demand from 2023 to 2030, three times the current level of consumption, McKinsey estimates. At the same time, data center water demand could rise about 170 percent by 2030, according to analysts at WestWater Research.
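Taken at face value, those figures imply a rough sense of scale. A back-of-the-envelope sketch, assuming (as the McKinsey estimate states) that the roughly 460 terawatt-hours of added U.S. demand is about three times today's consumption:

```python
# Back-of-the-envelope check of the U.S. data center power figures cited above.
# Assumption: the ~460 TWh of demand added from 2023 to 2030 is about three
# times current consumption, per the McKinsey estimate quoted in the text.

added_demand_twh = 460                                 # new demand, 2023-2030
implied_current_twh = added_demand_twh / 3             # ~153 TWh today
implied_2030_twh = implied_current_twh + added_demand_twh  # ~613 TWh by 2030

print(f"Implied current consumption: ~{implied_current_twh:.0f} TWh")
print(f"Implied 2030 consumption:    ~{implied_2030_twh:.0f} TWh")
print(f"Growth multiple:             ~{implied_2030_twh / implied_current_twh:.1f}x")
```

In other words, if the addition alone is triple today's load, total U.S. data center power consumption would roughly quadruple by 2030, broadly consistent with the 3.5-fold global capacity expansion cited above.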
Those forecasts might err on the high side, but even so, much or all of that capacity will have to be built, somewhere.
The point is that the benefits and costs will accrue to different participants in the information technology value chain, and in different proportions. There will probably be less benefit for local economies, taxpayers, and electricity and water ratepayers than is often assumed.