What happens when you say “Hello” to ChatGPT?
Such a simple query might seem trivial, but making it possible across billions of sessions requires immense scale. While OpenAI reveals little information about its operations, we’ve used the scraps we do have to estimate the impact of ChatGPT—and of the generative AI industry in general.
This article is part of The Scale Issue.
OpenAI’s actions also provide hints. As part of the United States’ Stargate Project, OpenAI will collaborate with other AI titans to build the largest data centers yet. And AI companies expect to need dozens of “Stargate-class” data centers to meet user demand.
Estimates of ChatGPT’s per-query energy consumption vary wildly. We used the figure of 0.34 watt-hours that OpenAI’s Sam Altman stated in a blog post without supporting evidence, though it’s worth noting that some researchers say the smartest models can consume more than 20 Wh for a complex query. We derived the number of queries per day from OpenAI’s usage statistics below.
OpenAI says ChatGPT has 700 million weekly users and serves more than 2.5 billion queries per day. If an average query uses 0.34 Wh, that’s 850 megawatt-hours, enough to charge thousands of electric vehicles every day.
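The daily figure is simple arithmetic. A minimal sketch, assuming Altman’s unverified 0.34 Wh/query figure and a typical EV battery capacity of roughly 70 kWh (our assumption, not from the source):

```python
# Back-of-envelope check of ChatGPT's daily energy use.
queries_per_day = 2.5e9      # OpenAI's stated query volume
wh_per_query = 0.34          # Altman's blog-post figure, unverified
daily_wh = queries_per_day * wh_per_query
daily_mwh = daily_wh / 1e6   # 1 MWh = 1,000,000 Wh
print(f"{daily_mwh:.0f} MWh per day")  # 850 MWh

# A typical EV battery holds ~70 kWh (assumed), so:
ev_charges = daily_wh / 70e3
print(f"~{ev_charges:,.0f} full EV charges per day")
```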
2.5 billion queries per day adds up to nearly 1 trillion queries each year, and ChatGPT could easily exceed that in 2025 if its user base continues to grow. One year’s energy consumption is roughly equivalent to powering 29,000 U.S. homes for a year, nearly as many as in Jonesboro, Ark.
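The annualized numbers follow directly. A sketch, assuming roughly 10,700 kWh of annual electricity use for an average U.S. home (an assumption based on U.S. EIA figures; the article doesn’t state which baseline it used):

```python
# Scale the daily query volume and energy use to a full year.
queries_per_year = 2.5e9 * 365            # ~0.91 trillion queries
annual_kwh = queries_per_year * 0.34 / 1e3  # Wh -> kWh

# Assumed average U.S. household consumption: ~10,700 kWh/yr.
homes = annual_kwh / 10_700
print(f"{queries_per_year/1e12:.2f} trillion queries/yr")
print(f"~{homes:,.0f} U.S. homes' worth of electricity")  # ~29,000
```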
Though massive, ChatGPT is just a slice of generative AI. Many companies use OpenAI models through the API, and competitors like Google’s Gemini and Anthropic’s Claude are growing. A report from the Schneider Electric Sustainability Research Institute puts the industry’s overall annual energy consumption at 15 terawatt-hours. Using the report’s per-query energy consumption figure of 2.9 Wh, we arrive at 5.1 trillion queries per year.
AI optimists expect the average queries per day to jump dramatically in the next five years. Based on a Schneider Electric estimate of overall energy use in 2030, the world could then see as many as 329 billion prompts per day—that’s about 38 queries per day per person alive on planet Earth. (That's assuming a global population of 8.6 billion in 2030, which is the latest estimate from the United Nations.) As unrealistic as that may sound, it’s made plausible by plans to build AI agents that work independently and interact with other AI agents.
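The 2030 projection follows from the same per-query figure. A sketch, assuming the report’s 2030 total of 347 TWh per year (stated in the next paragraph), its 2.9 Wh/query average, and the UN’s 8.6 billion population estimate:

```python
# Project 2030 query volume from Schneider Electric's energy estimate.
annual_wh_2030 = 347e12                 # 347 TWh in Wh
daily_queries = annual_wh_2030 / 2.9 / 365
per_person = daily_queries / 8.6e9      # UN 2030 population estimate
print(f"~{daily_queries/1e9:.0f} billion prompts per day")
print(f"~{per_person:.0f} queries per person per day")
```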
The Schneider Electric report estimates that all generative AI queries consume 15 TWh in 2025 and will use 347 TWh by 2030; that leaves 332 TWh of energy, and the compute power behind it, that will need to come online to support AI growth. That implies the construction of dozens of data centers along the lines of the Stargate Project, which plans to build the first-ever 1-gigawatt facilities. Running continuously, each of these facilities would consume 8.76 TWh per year, so 38 of these new campuses would account for the 332 TWh of new energy required.
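The campus count can be checked in a few lines, assuming each 1-GW facility runs at full load around the clock (the idealized case the article describes):

```python
# How many 1-GW "Stargate-class" campuses would cover the 2030 gap?
gap_twh = 347 - 15                 # new annual demand, TWh
hours_per_year = 24 * 365          # 8,760 hours
campus_twh = 1 * hours_per_year / 1000  # 1 GW * 8,760 h = 8.76 TWh/yr
campuses = gap_twh / campus_twh
print(f"~{campuses:.0f} campuses")  # ~38
```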