OpenAI's newest flagship model, GPT-5, arrives with improved reasoning capabilities, but its energy footprint is also far larger. Researchers estimate that GPT-5 consumes 8.6 times more power per query than GPT-4, underscoring the rising energy cost of AI.
GPT-5 Power Consumption vs. GPT-4
According to the University of Rhode Island's AI Lab, GPT-5 consumes an average of 18.35 watt-hours per query, compared with 2.12 watt-hours for GPT-4. That places GPT-5 among the most power-hungry AI systems, just behind OpenAI's o3 reasoning model and China's DeepSeek R1.
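The headline multiple is simply the ratio of those two per-query figures. A quick check, in Python purely for illustration:

```python
# Ratio of the two per-query estimates reported by the URI AI Lab.
gpt5_wh = 18.35   # GPT-5, watt-hours per query
gpt4_wh = 2.12    # GPT-4, watt-hours per query

print(f"{gpt5_wh / gpt4_wh:.2f}x")   # prints 8.66x, i.e. the roughly 8.6x multiple cited above
```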
According to Shaolei Ren, a professor at UC Riverside, GPT-5's "thinking mode" could consume five to ten times more power than a regular response. Its multimodal functionality adds further draw, since the model processes not just text but also images and video.
How Researchers Estimated GPT-5's Energy Use
OpenAI has not yet shared deployment details. Instead, the researchers estimated GPT-5's power consumption from benchmarks of Nvidia DGX H100 and H200 systems, the efficiency metrics of Microsoft Azure data centers, and factors such as power usage effectiveness (PUE), water usage effectiveness (WUE), and carbon intensity (CIF).
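The researchers have not published their full formula, but the general shape of such an estimate is straightforward. The sketch below is purely illustrative and looks only at energy (it ignores the water and carbon factors): every number in it, from GPU draw and node size to response time, concurrency, and PUE, is an assumption chosen for the example, not a figure from the report.

```python
# Back-of-the-envelope per-query energy estimate, in the spirit of the researchers' method.
# All values below are illustrative assumptions, not numbers from the URI AI Lab report.

GPU_POWER_W = 700          # assumed average draw of one H100-class GPU under load, in watts
NUM_GPUS = 8               # assumed GPUs serving one model replica (one DGX H100 node)
SECONDS_PER_QUERY = 8      # assumed generation time for a long "thinking" response
QUERIES_PER_NODE = 4       # assumed concurrent queries sharing that node
PUE = 1.12                 # assumed data center power usage effectiveness (cooling, overhead)

# Energy the GPUs spend on one query, converted from watt-seconds to watt-hours
gpu_wh = GPU_POWER_W * NUM_GPUS * SECONDS_PER_QUERY / 3600 / QUERIES_PER_NODE

# Scale up by PUE to account for facility overhead beyond the servers themselves
facility_wh = gpu_wh * PUE

print(f"~{facility_wh:.2f} Wh per query")   # ~3.48 Wh with these toy numbers
```

Changing any of these assumptions (longer reasoning chains, larger serving clusters, lower concurrency) moves the result by multiples, which is why estimates for GPT-5 land so much higher than for earlier models.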
The researchers acknowledge that these are rough estimates that could be off, but the findings still raise serious concerns about GPT-5's power demand as AI is adopted more widely around the world.
Environmental Impact of GPT-5
OpenAI has previously stated that ChatGPT handles roughly 2.5 billion queries a day. If every one of those queries ran on GPT-5, the service would consume about 45 gigawatt-hours a day, equivalent to the output of two to three nuclear power plants and enough to supply 1.5 million US homes with power for a day.
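That figure follows directly from the per-query estimate. A minimal sketch of the arithmetic, assuming a typical US household uses roughly 30 kWh per day (an assumption for illustration, not a number from the report):

```python
# Scale the per-query estimate up to OpenAI's reported daily query volume.
QUERIES_PER_DAY = 2.5e9     # ChatGPT queries per day, as stated by OpenAI
WH_PER_QUERY = 18.35        # researchers' GPT-5 estimate, in watt-hours
HOME_KWH_PER_DAY = 30       # assumed daily consumption of a typical US household

daily_gwh = QUERIES_PER_DAY * WH_PER_QUERY / 1e9        # watt-hours -> gigawatt-hours
homes_powered = daily_gwh * 1e6 / HOME_KWH_PER_DAY      # GWh -> kWh, divided per home

print(f"{daily_gwh:.1f} GWh per day")                   # ~45.9 GWh
print(f"{homes_powered / 1e6:.1f} million homes")       # ~1.5 million
```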
Experts caution that as climate regulations tighten, GPT-5's power requirements could drive overall AI data center costs sharply higher. Some have even likened AI power consumption to Bitcoin mining: in both cases, demand keeps expanding while the carbon footprint remains poorly understood.
Industry Controversy Over GPT-5's Power Consumption
In June 2025, OpenAI CEO Sam Altman stated that ChatGPT's average energy consumption was only 0.34 watt-hours per query. Experts were skeptical of that number, arguing it probably excluded the energy needed to generate images, train the model, or cool the servers. According to Wired, numerous experts estimate that GPT-5's power consumption is far greater than OpenAI has ever implied.
FAQ
How much more energy does GPT-5 use than GPT-4?
Researchers estimate that GPT-5 uses 8.6 times as much energy per query as GPT-4.

Why does GPT-5 consume so much more power?
The increase is driven by its advanced reasoning, multimodal processing of images and video, and its prolonged "thinking mode" responses.

Could GPT-5 really require nuclear-plant levels of power?
If GPT-5 handled every ChatGPT query, its daily energy use would be roughly equivalent to the output of several nuclear power plants.

Has OpenAI confirmed these figures?
No. OpenAI has not made its GPT-5 deployment details public, and some experts have challenged the company's earlier claim that per-query energy use is far lower.