Artificial intelligence systems are thirsty, consuming as much as 500 milliliters of water – a single-serving water bottle – for each short conversation a user has with the GPT-3 version of OpenAI’s ChatGPT system. The chatbot uses roughly the same amount of water to draft a 100-word email message.
That figure includes the water used to cool the data center’s servers and the water consumed at the power plants generating the electricity to run them.
But the study that calculated those estimates also pointed out that AI systems’ water usage can vary widely, depending on where and when the data center handling the query is running.
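To see why location and timing matter so much, consider a rough back-of-envelope calculation. The sketch below is illustrative only: the per-query energy figure and the water-intensity values are hypothetical assumptions chosen to show the spread, not numbers from the study.

```python
# A rough illustration of why per-query water estimates vary: the same
# query energy multiplied by different water intensities (liters of
# water consumed per kilowatt-hour of electricity) yields very
# different totals. Every number below is a hypothetical assumption.

QUERY_ENERGY_KWH = 0.3  # assumed electricity for one short conversation

# Hypothetical combined water intensities for two sites and times of day
scenarios = {
    "water-efficient site, water-light grid": 0.4,
    "hot day, water-intensive site and grid": 2.0,
}

for label, liters_per_kwh in scenarios.items():
    ml = QUERY_ENERGY_KWH * liters_per_kwh * 1000
    print(f"{label}: ~{ml:.0f} ml per conversation")
```

Under these made-up inputs, the same conversation consumes anywhere from about 120 to 600 milliliters, which is why a single headline number can be misleading.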
To me, as an academic librarian and professor of education, understanding AI is not just about knowing how to write prompts. It also involves understanding the infrastructure, the trade-offs, and the civic choices that surround AI.
Many people assume AI is inherently harmful, especially given headlines calling out its vast energy and water footprint. Those effects are real, but they’re only part of the story.
When people move from seeing AI as simply a resource drain to understanding its actual footprint – where the effects come from, how they vary and what can be done to reduce them – they are far better equipped to make choices that balance innovation with sustainability.
2 hidden streams
Behind every AI query are two streams of water use.
The first is on-site cooling of servers that generate enormous amounts of heat. This often uses evaporative cooling towers – giant misters that spray water over hot pipes or open basins. The evaporation carries away heat, but that water is removed from the local water supply, such as a river, a reservoir or an aquifer. Other cooling systems may use less water but more electricity.
The second stream is the water consumed by the power plants generating the electricity that runs the data center. Coal, gas and nuclear plants use large volumes of water for steam cycles and cooling.
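Putting the two streams together gives a simple accounting: on-site water scales with the servers’ own electricity use, while off-site water scales with the total electricity drawn from the grid. The sketch below is a minimal illustration of that accounting, using water usage effectiveness (WUE, liters of cooling water per kilowatt-hour of server energy) and power usage effectiveness (PUE, total facility energy per unit of server energy); the function name and every input value are hypothetical assumptions, not measurements.

```python
# A minimal sketch of the two-stream water accounting described above.
# All names and values are illustrative assumptions, not measured data.

def query_water_ml(it_energy_kwh: float, wue: float, pue: float,
                   grid_water_l_per_kwh: float) -> float:
    """Estimate total water (milliliters) consumed by one AI query."""
    # Stream 1: evaporative cooling at the data center, scaled by the
    # servers' own electricity use (WUE is liters per IT kilowatt-hour).
    onsite_l = it_energy_kwh * wue
    # Stream 2: water consumed at the power plants supplying the grid.
    # PUE scales server energy up to total facility energy drawn from the grid.
    offsite_l = it_energy_kwh * pue * grid_water_l_per_kwh
    return (onsite_l + offsite_l) * 1000

# Hypothetical inputs; real values differ by facility, grid mix and season.
print(query_water_ml(it_energy_kwh=0.3, wue=0.5, pue=1.2,
                     grid_water_l_per_kwh=1.0))  # roughly 510 ml
```

Separating the two terms makes the trade-off visible: a data center can cut its on-site cooling water yet still carry a large off-site footprint if its electricity comes from water-hungry power plants.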