The number of AI tools in use worldwide is growing rapidly, but the companies that make these tools often don't detail their environmental impact. Google has just released a technical paper with measurements for the energy, emissions and water use of its Gemini AI prompts. The impact of a single prompt is, it says, minuscule. According to its methodology for measuring AI's impact, a single prompt's energy consumption is roughly equivalent to watching TV for less than 9 seconds.

That's a small amount for a single serving, until you consider the variety of chatbots in use, with billions of prompts easily sent every day. On the more positive side of progress, the technology behind these prompts has become more efficient. Over the past 12 months, the energy of a single Gemini text prompt has been reduced by 33x, and its total carbon footprint has been reduced by 44x, Google says. According to the tech giant, that's not insubstantial, and it's momentum that will need to be maintained going forward.

Google did not immediately respond to CNET's request for further comment.

Google's calculation method considers much more

The typical calculation for the energy cost of an AI prompt ends at the active machine it's run on, which shows a much smaller per-prompt footprint. But Google's method for measuring the impact of a prompt purportedly spans a much wider range of factors that paint a clearer picture, including full-system dynamic power, idle machines, data center overhead, water consumption and more.

For comparison, counting only active TPU and GPU consumption, a single Gemini prompt is estimated to use 0.10 watt-hours of energy and 0.12 milliliters of water, and to emit 0.02 grams of carbon dioxide equivalent. This is a promising number, but Google's wider methodology tells a different story. With more considerations in place, a Gemini text prompt uses 0.24 Wh of energy, 0.26 mL of water and emits 0.03 gCO2e -- around double across the board.
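The figures above can be sanity-checked with a little arithmetic. The sketch below compares the two methodologies and works out the TV equivalence; the 100 W TV power draw and the 1-billion-prompts-per-day volume are illustrative assumptions, not figures from Google's paper.

```python
# Per-prompt estimates reported for Gemini, per the article.
# Narrow accounting: active TPU/GPU consumption only.
narrow = {"energy_wh": 0.10, "water_ml": 0.12, "co2_g": 0.02}
# Google's fuller methodology: adds idle machines, data center overhead, etc.
full = {"energy_wh": 0.24, "water_ml": 0.26, "co2_g": 0.03}

# Ratio between the two methodologies for each metric.
for key in narrow:
    print(f"{key}: {full[key] / narrow[key]:.1f}x")

# TV-watching equivalence: at an assumed 100 W draw, 0.24 Wh corresponds
# to this many seconds of viewing (under 9, as the article says).
tv_watts = 100
seconds = full["energy_wh"] / tv_watts * 3600
print(f"TV seconds: {seconds:.2f}")  # ~8.64 seconds

# Scale to a hypothetical 1 billion prompts per day.
daily_wh = 1e9 * full["energy_wh"]
print(f"At 1B prompts/day: {daily_wh / 1e6:.0f} MWh per day")
```

Note that the water and carbon ratios come out somewhat below 2x, so "around double across the board" is a rough characterization rather than an exact one.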
Will new efficiencies keep up with AI use?

Through a multilayered series of efficiencies, Google is continually working on ways to make AI's impact less burdensome, from more efficient model architectures and data centers to custom hardware. With smarter models, use cases and tools emerging daily, those efficiencies will be critical as we immerse ourselves deeper in this AI reality.