A New Frontier, A New Problem: GPT-5’s Power Hungry Nature Exposed

The release of OpenAI’s GPT-5 has been a major event, but the excitement is being tempered by serious concerns about its energy consumption. While the company has been notably quiet on the matter, experts are sounding the alarm. They argue that the new model’s enhanced abilities, from creating websites to solving PhD-level questions, come with a steep and unprecedented environmental cost. This lack of transparency from a leading AI developer is raising serious questions about the industry’s commitment to sustainability.
A key piece of evidence for this concern comes from a study by the University of Rhode Island’s AI lab. Their research found that a single medium-length response from GPT-5—around 1,000 tokens—can consume an average of 18 watt-hours. This is a significant increase over previous models. To put this into a more relatable context, 18 watt-hours is enough energy to power a traditional incandescent light bulb for 18 minutes. Given that a service like ChatGPT fields billions of requests daily, the total energy consumption could be enormous, potentially rivaling the daily electricity needs of millions of homes.
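The arithmetic behind that comparison can be sketched in a few lines. The 18 Wh figure comes from the article; the daily request volume and per-home consumption are illustrative assumptions (the article says only "billions of requests daily"), not reported numbers.

```python
# Back-of-envelope estimate of GPT-5's daily energy draw, using the
# University of Rhode Island figure of 18 Wh per medium-length response.
# REQUESTS_PER_DAY and KWH_PER_HOME_PER_DAY are illustrative assumptions.

WH_PER_RESPONSE = 18            # reported average per ~1,000-token reply (Wh)
REQUESTS_PER_DAY = 2.5e9        # assumption: "billions of requests daily"
KWH_PER_HOME_PER_DAY = 30       # assumption: rough U.S. household average

total_kwh = WH_PER_RESPONSE * REQUESTS_PER_DAY / 1000
homes_equivalent = total_kwh / KWH_PER_HOME_PER_DAY

print(f"Total: {total_kwh / 1e6:.1f} GWh/day")                    # prints 45.0
print(f"Equivalent homes: {homes_equivalent / 1e6:.1f} million")  # prints 1.5
```

Under these assumed inputs, the total lands at 45 GWh per day, which is indeed in the range of the daily electricity use of over a million homes.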
The surge in energy use is directly tied to the model’s increased size and complexity. Experts believe GPT-5 is substantially larger than its predecessors, with a greater number of parameters. This aligns with research from the French AI company Mistral, which demonstrated a strong link between a model’s scale and its energy consumption. Mistral’s study concluded that a model ten times larger has roughly ten times the impact, meaning environmental cost scales roughly in proportion to model size. This principle appears to hold for GPT-5, with some specialists suggesting its resource usage could be “orders of magnitude higher” than even GPT-3’s.
Further complicating the issue is the new model’s sophisticated architecture. While it does incorporate a “mixture-of-experts” system to boost efficiency, its advanced reasoning capabilities and capacity to process video and images likely offset these gains. The “reasoning mode,” which requires the model to compute for a longer duration before generating a response, could make its energy footprint several times larger than basic text-only tasks. This convergence of size, complexity, and advanced features paints a clear picture of an AI system with an immense demand for power, leading to urgent calls for greater transparency from OpenAI and the broader AI community.
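The competing effects described above can be captured in a toy model: a mixture-of-experts design activates only a fraction of the parameters per token (saving energy), while reasoning mode spends a multiple of the base compute before answering (costing energy). The specific fractions and multipliers below are illustrative assumptions, not measured values.

```python
# Toy model of the offsetting effects on per-response energy:
# MoE savings (only some experts active) vs. reasoning-mode overhead
# (longer computation before a reply). All numbers are assumptions.

BASE_WH = 18.0  # article's per-response figure for a plain text reply

def adjusted_energy_wh(base_wh: float,
                       active_fraction: float,
                       reasoning_multiplier: float) -> float:
    """Scale base energy by MoE savings and reasoning-mode overhead."""
    return base_wh * active_fraction * reasoning_multiplier

# e.g. if 25% of experts are active but reasoning mode takes 4x the compute,
# the efficiency gains are fully offset:
print(adjusted_energy_wh(BASE_WH, 0.25, 4.0))  # prints 18.0
```

The sketch makes the article’s point concrete: even a large architectural efficiency gain can be cancelled out once reasoning mode and multimodal workloads multiply the compute per request.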
