“We’re not comfortable revealing that for various reasons,” Dean told me on our call. The total number is an abstract measure that changes over time, he says, adding that the company wants users to think about the energy use per prompt.
But there are people out there all over the world interacting with this technology, not just me—and what we all add up to seems quite relevant.
OpenAI does publicly share its total, recently reporting that ChatGPT receives 2.5 billion queries every day. So for the curious, we can use this as an example and take the company’s self-reported average energy use per query (0.34 watt-hours) to get a rough idea of the total for all people prompting ChatGPT.
According to my math, over the course of a year, that would add up to over 300 gigawatt-hours—the same as powering nearly 30,000 US homes annually. When you put it that way, it starts to sound like a lot of seconds in microwaves.
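For anyone who wants to check that arithmetic, here’s a minimal sketch in Python. The query count and per-query energy figure are the ones cited above; the per-home figure of roughly 10,500 kilowatt-hours a year is my own assumption, in line with published US residential averages.

```python
# Back-of-envelope check of the ChatGPT energy math above.
QUERIES_PER_DAY = 2.5e9              # OpenAI's reported daily queries
WH_PER_QUERY = 0.34                  # OpenAI's self-reported average, in watt-hours
KWH_PER_US_HOME_PER_YEAR = 10_500    # assumed US average annual household use

daily_wh = QUERIES_PER_DAY * WH_PER_QUERY        # watt-hours per day
annual_gwh = daily_wh * 365 / 1e9                # gigawatt-hours per year
homes = (annual_gwh * 1e6) / KWH_PER_US_HOME_PER_YEAR  # 1 GWh = 1e6 kWh

print(f"{annual_gwh:.0f} GWh per year")  # ~310 GWh
print(f"~{homes:,.0f} US homes")         # ~30,000 homes
```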
3. AI is everywhere, not just in chatbots, and we’re often not even conscious of it.
AI is touching our lives even when we’re not looking for it. AI summaries appear in web searches, whether you ask for them or not. Built-in features in email and texting applications can draft or summarize messages for you.
Google’s estimate is strictly for Gemini apps and wouldn’t include many of the other ways that even this one company is using AI. So even if you’re trying to account for your own personal energy demand, it’s increasingly difficult to tally it all up.
To be clear, I don’t think people should feel guilty for using tools that they find genuinely helpful. And ultimately, I don’t think the most important conversation is about personal responsibility.