An argument can be made that artificial intelligence operations will consume vast quantities of electricity and water, as well as create lots of new e-waste. It's hard to argue with that premise. After all, any increase in human activity--including computing intensity--will have that impact.
Some purists might insist we must be carbon neutral or not do AI. Others of us might say we need to make the same sorts of trade-offs we make every day for all our activities that have some impact on water, energy consumption or the production of e-waste.
We have to balance outcomes and impacts, benefits and costs, while working over time to minimize those impacts. Compromise, in other words.
Some of us would be unwilling to accept "net zero" outcomes if they require poor people to remain poor, or hungry people to remain hungry.
And not all of the increase in e-waste, energy or water consumption is attributable to AI operations. Some portion of the AI-specific investment would have been made in any case to support the growth of demand for cloud computing.
So there is a “gross” versus “net” assessment to be made of the data center power, water and e-waste impacts resulting from AI operations.
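As a rough illustration of that gross-versus-net distinction, here is a minimal back-of-envelope sketch. Every number in it is a hypothetical placeholder, not a measurement from any source; the point is only the arithmetic of subtracting a counterfactual "cloud growth anyway" baseline from total observed growth.

```python
# Hypothetical attribution of data center electricity growth to AI.
# All figures are illustrative placeholders, not real data.

baseline_power_twh = 300.0   # data center electricity use before the AI buildout (assumed)
observed_power_twh = 420.0   # total after the AI buildout (assumed)
cloud_growth_rate = 0.10     # growth cloud demand would have driven anyway (assumed)

# Counterfactual: what the power draw would have been without any AI buildout.
counterfactual_twh = baseline_power_twh * (1 + cloud_growth_rate)

gross_ai_impact = observed_power_twh - baseline_power_twh   # all new load
net_ai_impact = observed_power_twh - counterfactual_twh     # load beyond ordinary cloud growth

print(f"Gross increase: {gross_ai_impact:.0f} TWh")
print(f"Net increase attributable to AI: {net_ai_impact:.0f} TWh")
```

The same subtraction applies to water use and e-waste; only the units change.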
By definition, all computing hardware will eventually become “e-waste.” So use of more computing hardware implies more e-waste, no matter whether the use case is “AI” or just “cloud computing.” And we will certainly see more of both.
Also, “circular economy” measures will certainly be employed to reduce the gross amount of e-waste from all servers. So we face a dynamic problem: more servers, perhaps faster server replacement cycles, and more data centers and capacity, partly offset by circular economy efficiencies and hardware and software improvements.
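To make that dynamic concrete, a toy projection might look something like the following. The fleet size, growth rate, replacement cycle and recovery rate are all assumptions chosen only to show how faster refresh cycles and circular-economy recovery pull the net e-waste figure in opposite directions.

```python
# Toy projection of annual server e-waste. All parameters are hypothetical.

def annual_ewaste_tonnes(servers: float,
                         replacement_years: float,
                         kg_per_server: float,
                         recovery_rate: float) -> float:
    """Servers retired per year, times mass, minus the share recovered or reused."""
    retired = servers / replacement_years
    gross_kg = retired * kg_per_server
    net_kg = gross_kg * (1 - recovery_rate)
    return net_kg / 1000.0

servers = 10_000_000  # installed base (assumed)
for year in range(1, 6):
    servers *= 1.15                          # assumed fleet growth from AI plus cloud demand
    cycle = max(3.0, 5.0 - 0.25 * year)      # assumed shortening replacement cycle
    recovery = min(0.6, 0.3 + 0.05 * year)   # assumed improving circular-economy recovery
    tonnes = annual_ewaste_tonnes(servers, cycle, 25.0, recovery)
    print(f"Year {year}: {tonnes:,.0f} t net e-waste")
```

Whether the growth terms or the offset terms dominate is exactly the empirical question the "gross versus net" framing asks us to keep separate.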