“GPT-5 Consumes 20 Times More Power” — AI Boom Driving Energy Demand
By Aoife Brennan

Aoife Brennan is a contributing writer for The Economy, with a focus on education, youth, and societal change. Based in Limerick, she holds a degree in political communication from Queen’s University Belfast. Her work draws connections between cultural narratives and public discourse in Europe and Asia.

  • GPT-5’s advanced capabilities drive a dramatic surge in power consumption
  • Developer OpenAI acknowledges the growing challenge
  • Big Tech companies are rapidly building new data centers, heightening the risk of power shortages

Analysts warn that OpenAI’s new artificial intelligence model, GPT-5, consumes vastly more electricity than its predecessors. As the model incorporates advanced functions such as long-term reasoning and multimodal processing, its power requirements have surged. With AI-driven demand for electricity climbing rapidly, concerns are mounting that AI computing centers and related facilities could soon face significant energy shortages.

ChatGPT Has Become a “Power-Hungry Giant”

On August 15 (local time), Oilprice.com reported that the University of Rhode Island had found OpenAI’s GPT-5 model consumes up to 40 watt-hours (Wh) of electricity to generate a single medium-length response of about 1,000 tokens. By comparison, early versions of ChatGPT in 2023 required roughly 2 Wh per response—meaning GPT-5’s energy use is up to 20 times higher.
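
To make the comparison concrete, here is a quick back-of-the-envelope script using only the figures quoted above; the daily query volume at the end is a hypothetical round number chosen for scale, not a reported usage statistic.

```python
# Per-response energy comparison, using the figures quoted above.

GPT5_WH = 40.0          # up to 40 Wh per ~1,000-token GPT-5 response (URI estimate)
EARLY_CHATGPT_WH = 2.0  # roughly 2 Wh per response for early-2023 ChatGPT
TOKENS = 1_000          # tokens in a medium-length response

print(f"Energy ratio: {GPT5_WH / EARLY_CHATGPT_WH:.0f}x")           # 20x
print(f"Per-token upper bound: {GPT5_WH / TOKENS * 1000:.0f} mWh")  # 40 mWh/token

# Hypothetical scale check: 100 million such responses per day
# (an assumed round number, NOT a reported usage figure).
DAILY_RESPONSES = 100_000_000
daily_gwh = DAILY_RESPONSES * GPT5_WH / 1e9  # Wh -> GWh
print(f"Hypothetical daily total: {daily_gwh:.0f} GWh")
```

At that assumed volume, the upper-bound figure works out to about 4 GWh per day, equivalent to drawing roughly 170 MW around the clock.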

Experts point to GPT-5’s advanced reasoning capabilities as the main driver of this surge. Rakesh Kumar, professor of computer engineering at the University of Illinois, noted, “AI systems like GPT-5, designed for complex and long-term reasoning, consume far more power during both training and inference than earlier versions.” Shaolei Ren, professor at the University of California, Riverside, added that GPT-5’s reasoning mode and multimodal processing make it far more resource-intensive than a simple text model: “When running in reasoning mode, the resources required to produce the same answer can increase by five to ten times.”

Even a Simple “Thank You” Comes With a Power Bill

OpenAI is well aware of the heavy electricity burden created by generative AI. In April, CEO Sam Altman was asked how much it might cost every time people added phrases like “please” or “thank you” when using ChatGPT. He replied that the cumulative electricity bill had already reached “tens of millions of dollars.”

This enormous cost arises because every extra word from a user means more data for servers to process and another response to generate, each requiring additional energy. For example, when a user types a brief phrase like “thanks for your answer,” ChatGPT will typically reply with something like, “You’re welcome! Let me know anytime if I can help further.” Even such short exchanges, multiplied across millions of global interactions, translate into significant electricity use.
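
As a rough illustration of how these micro-costs compound, consider the sketch below; both constants are assumptions chosen for easy arithmetic, not figures reported by OpenAI.

```python
# Illustrative arithmetic: tiny per-message energy costs at global scale.
# Both constants are assumptions for illustration, not OpenAI figures.

WH_PER_SHORT_REPLY = 0.3               # assumed energy for a brief courtesy reply
DAILY_COURTESY_EXCHANGES = 10_000_000  # assumed worldwide "thanks" messages per day

daily_kwh = WH_PER_SHORT_REPLY * DAILY_COURTESY_EXCHANGES / 1_000
print(f"~{daily_kwh:,.0f} kWh per day")                # ~3,000 kWh/day
print(f"~{daily_kwh * 365 / 1000:,.0f} MWh per year")  # ~1,095 MWh/year
```

Even with deliberately modest assumptions, the annual total lands at utility scale once multiplied across the user base.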

With power consumption climbing steadily, OpenAI has started taking steps to secure a stable energy supply. One example is its partnership with Oklo, a California-based startup developing compact fast reactors designed to run on spent nuclear fuel. Altman, an early Oklo backer who chaired its board, stepped down as chairman in April ahead of contract negotiations between OpenAI and Oklo to avoid a conflict of interest.

AI Data Centers Rising at Breakneck Speed

Global demand for electricity from AI is expected to grow even further as the spread of AI and digital transformation pushes major tech firms to pour staggering sums into dedicated computing facilities. This year, OpenAI, in partnership with Japan’s SoftBank and Oracle, launched the “Stargate Project” to build a massive AI center in Texas. The project involves a $500 billion investment over four years, with capacity for up to 400,000 high-performance chips.

Microsoft has announced plans to invest $80 billion in 2025 alone to expand its global network of AI computing centers, underscoring its aggressive push to secure the infrastructure needed for AI training and services. Google is spending $7 billion over two years to expand AI facilities in Iowa, while Amazon is committing $100 billion and Meta as much as $65 billion for similar efforts.

The challenge is that more AI data centers inevitably mean soaring electricity use. Market research firm IDC projects that global data center power consumption will rise by 19.5% annually from 2023 to 2028, more than doubling over that period, with AI workloads driving much of the increase. Consulting firm Gartner forecasts that AI data centers’ electricity demand will jump from 261 terawatt-hours (TWh) in 2024 to 500 TWh in 2027. With usage climbing so sharply, analysts warn that annual consumption could surge by as much as 160% in just two years, leaving as many as 40% of AI computing centers struggling with power shortages by 2027.
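
The compound-growth claims above can be sanity-checked in a few lines, using only the figures quoted in this article:

```python
# Sanity-checking the quoted growth figures.

# IDC: 19.5% annual growth in data center power use, 2023-2028 (five years).
print(f"IDC five-year growth: {1.195 ** 5:.2f}x")  # ~2.44x, i.e. more than doubling

# Gartner: AI data center demand of 261 TWh (2024) rising to 500 TWh (2027).
print(f"Gartner 2024->2027 increase: {(500 - 261) / 261:.0%}")  # ~92%
```

Note that the separately cited 160% two-year surge is a distinct analyst projection measured against a different baseline; it does not follow directly from the two Gartner data points shown here.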
