My personal use of large language models (LLMs) like ChatGPT and Microsoft Copilot has declined since a friend highlighted the extensive use of water to cool data centers. One estimate equates the water consumed by 154 average ChatGPT queries to a single flush of a toilet.
As new data centers pop up like mushrooms in Northern Virginia, Dallas, Atlanta and other U.S. locales, it’s nearly impossible to ignore the expanding physical footprint of big data and AI. But how much water and electricity use can be attributed to AI?
It turns out this question is hard to answer. Per a June 2025 article in Wired, most LLMs – including those accounting for 84% of user queries in May of this year – don’t disclose emissions data. It’s intuitive that the complexity of the model, along with the complexity of the query, increases the energy required for both development and use. Some companies actively direct queries to more efficient models, though there may be a trade-off between the speed and quality of the response. Another consideration is that some data centers are powered by renewable energy, while many others run on fossil fuels. And the types of inputs – physical, like semiconductors, and digital, like datasets – also dictate the resource intensity of AI.
A series of articles in the MIT Technology Review earlier this year attempted to discern the carbon intensity of AI – now and into the future. A May article in the series reports that, after a period of relatively flat demand, data center energy consumption doubled from 2017 to 2023, coinciding with the broader adoption of AI. Data centers now consume roughly 4% of U.S. energy.
Data centers are cooled with fans and millions of gallons of water daily. Add the environmental toll of rare-earth mining to fabricate the semiconductors that power the models, and the impact is staggering. And that’s today.
I’m going to keep working on this question (without ChatGPT) and encourage you to pause before your next query too.