The Hidden Burden of AI: GPT-5’s Energy Demand Exposed

OpenAI’s GPT-5 has been celebrated for its enhanced capabilities, but it’s also shining a spotlight on a less glamorous side of AI: its massive energy consumption. With the company remaining silent on the issue, independent researchers are providing a crucial public service by benchmarking the model’s resource use. Their findings suggest that the new model’s impressive performance comes at a steep environmental cost, challenging the industry to be more transparent about its impact.
The numbers are a wake-up call for the AI community. A research group at the University of Rhode Island’s AI lab has found that a medium-length response from GPT-5 consumes an average of 18 watt-hours. This is a substantial increase over previous models and places GPT-5 among the most energy-intensive AIs ever benchmarked. To put this in perspective, 18 watt-hours is enough to run a 60-watt incandescent light bulb for 18 minutes. And given that ChatGPT handles billions of requests daily, the model’s total daily energy consumption could rival the daily electricity use of 1.5 million US homes, highlighting a sustainability issue of national scale.
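As a quick sanity check on that household comparison, here is a minimal back-of-the-envelope sketch. The per-response figure comes from the benchmark above, but the daily request volume (roughly 2.5 billion) and the average US household’s daily electricity use (about 30 kWh) are assumptions added for illustration.

```python
# Back-of-the-envelope estimate of aggregate energy use.
# Per-response figure: University of Rhode Island benchmark cited above.
# Request volume and household usage are assumptions for illustration.

WH_PER_RESPONSE = 18        # Wh per medium-length GPT-5 response (benchmark figure)
REQUESTS_PER_DAY = 2.5e9    # assumed: ~2.5 billion ChatGPT requests per day
HOME_KWH_PER_DAY = 30       # assumed: average US household, ~30 kWh per day

total_kwh_per_day = WH_PER_RESPONSE * REQUESTS_PER_DAY / 1_000
homes_equivalent = total_kwh_per_day / HOME_KWH_PER_DAY

print(f"Total: ~{total_kwh_per_day / 1e6:.0f} GWh per day")
print(f"Roughly {homes_equivalent / 1e6:.1f} million US homes' worth of daily electricity")
```

Under those assumptions the total works out to roughly 45 GWh per day, which is where the 1.5-million-home comparison comes from.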
This dramatic increase in power consumption is primarily due to the model’s size. While OpenAI has not disclosed the parameter count for GPT-5, experts believe it is “several times larger than GPT-4.” This aligns with a study by the French AI company Mistral, which found a “strong correlation” between a model’s size and its energy consumption. The study concluded that a model ten times bigger would have an impact one order of magnitude larger. This suggests that the trend of building ever-larger AI models, championed by many in the industry, will continue to drive up resource usage.
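Read literally, that finding implies per-query energy grows roughly in proportion to parameter count. A minimal sketch of that linear relationship, using placeholder numbers rather than measurements of any particular model:

```python
# Illustration of the roughly linear size-to-energy relationship reported in
# the Mistral study: per-query energy scales in proportion to model size.
# The baseline energy and size ratios below are placeholders, not measurements.

def scaled_energy_wh(baseline_wh: float, size_ratio: float) -> float:
    """Estimate per-query energy for a model size_ratio times the baseline's size."""
    return baseline_wh * size_ratio

baseline_wh = 2.0  # hypothetical per-query energy for a smaller reference model
for ratio in (1, 3, 10):
    print(f"{ratio:>2}x the parameters -> ~{scaled_energy_wh(baseline_wh, ratio):.0f} Wh per query")
# A model ten times larger lands about an order of magnitude higher, as the study found.
```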
The new features of GPT-5 also contribute to its high energy demands. Its “reasoning mode” and its ability to process video and images require far more intensive computation than simple text generation. A professor studying the resource footprint of AI models noted that using the reasoning mode alone could increase resource usage by a factor of “five to 10.” So even though a “mixture-of-experts” architecture, which activates only a subset of a model’s parameters for each query, offers some efficiency, these new, more complex workloads are pushing the overall energy footprint to new heights. The increasingly urgent calls for transparency from AI developers are a direct response to this growing environmental concern.
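To see how those two effects interact, here is a small illustrative sketch: a mixture-of-experts model activates only a fraction of its parameters per token, while reasoning mode multiplies the work per response by the quoted factor of five to ten. Every figure in it is an assumption for illustration, not a measured value.

```python
# Sketch of the two opposing effects described above. A mixture-of-experts
# (MoE) model activates only a fraction of its parameters per token, which
# reduces per-token compute, while reasoning mode multiplies the total work
# per response by the quoted "five to 10" factor. All figures are assumptions.

dense_wh = 20.0                 # assumed per-response energy for a dense model
active_fraction = 0.25          # assumed: MoE activates ~25% of parameters per token
reasoning_multiplier = (5, 10)  # range quoted for reasoning mode

moe_wh = dense_wh * active_fraction
low, high = (moe_wh * m for m in reasoning_multiplier)

print(f"MoE alone: ~{moe_wh:.0f} Wh per response")
print(f"MoE + reasoning mode: ~{low:.0f}-{high:.0f} Wh per response")
# The MoE discount is swamped by the reasoning-mode multiplier.
```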