GPT-5 power consumption could be as much as eight times higher than GPT-4's — research institute estimates a medium-sized GPT-5 response can consume up to 40 watt-hours of electricity
-
The University of Rhode Island's AI lab estimates that GPT-5 averages just over 18 Wh per query, so putting all of ChatGPT's reported 2.5 billion requests a day through the model could see energy usage as high as 45 GWh.
A daily energy use of 45 GWh is enormous. A typical modern nuclear power reactor produces between 1 and 1.6 GW of electricity, so data centers running OpenAI's GPT-5 at 18 Wh per query could require the continuous output of roughly two nuclear reactors, an amount that could be enough to power a small country.
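For what it's worth, the article's arithmetic does check out. A quick sketch (the per-query and per-day figures are the third-party estimates quoted above, not measurements):

```python
# Rough check of the article's arithmetic; both inputs are estimates
# (URI lab's 18 Wh/query, ChatGPT's reported 2.5 billion requests/day).
WH_PER_QUERY = 18
QUERIES_PER_DAY = 2.5e9

daily_gwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e9   # Wh -> GWh
avg_power_gw = daily_gwh / 24                      # average continuous draw

print(f"{daily_gwh:.0f} GWh/day")   # 45 GWh/day
print(f"{avg_power_gw:.2f} GW")     # 1.88 GW, i.e. roughly two 1-GW reactors
```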
-
A bit clickbaity. We can't really say that without more info.
But it's important to point out that the lab's test methodology is far from ideal.
The team measured GPT-5’s power consumption by combining two key factors: how long the model took to respond to a given request, and the estimated average power draw of the hardware running it.
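That methodology amounts to multiplying an assumed hardware power draw by response time. A minimal sketch of the approach with hypothetical numbers (the actual serving hardware, its draw, and how many queries it batches are all unknown, which is exactly the problem):

```python
def estimated_query_energy_wh(response_time_s: float,
                              node_power_w: float,
                              concurrent_queries: int = 1) -> float:
    """Estimate per-query energy as (assumed node power x response time),
    optionally shared across queries served concurrently on the same node.
    Every input here is a guess, which is why such estimates are shaky."""
    node_energy_wh = node_power_w * response_time_s / 3600
    return node_energy_wh / concurrent_queries

# Illustrative only: a ~10 kW eight-GPU node and a 45-second response.
print(estimated_query_energy_wh(45, 10_000))       # 125.0 Wh if serving one query
print(estimated_query_energy_wh(45, 10_000, 64))   # ~1.95 Wh at 64 concurrent queries
```

The batching term is the weak point: the same measured latency implies wildly different per-query energy depending on how many requests share the hardware.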
What we do know is that the price went down, which could be a strong indication that the model is, in fact, more energy efficient. At the very least, it's a stronger indicator than response time.
-
Of course there are comments doubting the accuracy, which by itself is valid, but they are merely doing it to defend AI. IMHO, even at a fifth of the estimates, we're talking humongous amounts of power, all for a so-so search engine, half-arsed chatbots and mostly dubious NSFW images. And let's not forget: the estimates may be inaccurate because they are TOO LOW. Now wouldn't that be fun?
-
40 Wh or 18 Wh, which is it?
That's my old gaming PC running a game for somewhere between 2 min 42 s and 6 minutes... roughly.
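That comparison roughly checks out, assuming an older gaming PC drawing about 400 W under load (the wattage is my assumption, not the commenter's stated figure):

```python
PC_POWER_W = 400  # assumed draw of an older gaming PC under load

def gaming_minutes(energy_wh: float, power_w: float = PC_POWER_W) -> float:
    """Minutes a machine drawing power_w can run on energy_wh of electricity."""
    return energy_wh / power_w * 60

print(f"{gaming_minutes(18):.1f} min")  # 2.7 min, i.e. about 2 min 42 s (18 Wh average)
print(f"{gaming_minutes(40):.1f} min")  # 6.0 min (40 Wh upper estimate)
```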
-
I don't buy the research paper at all. Of course we have no idea what OpenAI does, because they aren't open at all, but DeepSeek's published papers suggest it's much more complex than one model per node... I think they recommended something like a 576-GPU cluster, with a scheme to split experts across nodes.
That, and going by the really small active parameter count of gpt-oss, I bet the model is sparse as heck.
There's no way the effective batch size is 8; it has to be way higher than that.
-
40Wh or 18Wh which is it?
That's my old gaming PC running a game for 2min42sec-6minutes ... Roughly.
They vibe-calculated it.
-
I don't care how rough the estimate is, LLMs are using insane amounts of power, and the message I'm getting here is that the newest incarnation uses even more.
BTW a lot of it seems to be just inefficient coding, as DeepSeek has shown.
-
But we get a huge increase in accuracy, from 30% to 30.5%! And it only took 5x the energy consumption!
-
I don't buy the research paper at all. Of course we have no idea what OpenAI does, because they aren't open at all, but DeepSeek's published papers suggest it's much more complex than one model per node... I think they recommended something like a 576-GPU cluster, with a scheme to split experts across nodes.
That, and going by the really small active parameter count of gpt-oss, I bet the model is sparse as heck.
There's no way the effective batch size is 8; it has to be way higher than that.
And perhaps even more importantly, the per-token cost of GPT-5's API is less than GPT-4's. That's why OpenAI was so eager to move everyone onto it: it means more profit for them.
-
And perhaps even more importantly, the per-token cost of GPT-5's API is less than GPT-4's. That's why OpenAI was so eager to move everyone onto it: it means more profit for them.
I don't believe API prices are tied all that closely to OpenAI's actual costs. They seem to be selling at a loss, and they may be selling at an even greater loss to make it look like they are progressing. The second OpenAI seems to have plateaued, their valuation will crash and it will be game over for them.
-
I don't believe API prices are tied all that closely to OpenAI's actual costs. They seem to be selling at a loss, and they may be selling at an even greater loss to make it look like they are progressing. The second OpenAI seems to have plateaued, their valuation will crash and it will be game over for them.
I based my argument on actual numbers that can be looked up and verified. You "believe" that they "seem" to be doing something else. Based on what?
-
I based my argument on actual numbers that can be looked up and verified. You "believe" that they "seem" to be doing something else. Based on what?
To be fair, OpenAI's negative profitability has been extensively reported on.
Your point stands though; there's no evidence they're trying to decrease revenue. On the contrary, that would be a huge red flag to any vested interests.
-
How does OpenAI getting less money (with a cheaper model) mean more profit? Am I missing something?
-
Their point is that those API prices might not match reality, and the prices may be artificially low to build hype and undercut competitors. We don't know how much it costs OpenAI; however, we do know that they're not making a profit.
-
Their point is that those API prices might not match reality, and the prices may be artificially low to build hype and undercut competitors. We don't know how much it costs OpenAI; however, we do know that they're not making a profit.
Sure, they might not. But he gives no basis for saying that other than what he "believes."
People in this community, and on the Fediverse in general, seem to be strongly anti-AI and would like to believe things that make it sound bad and unprofitable. So when an article like this comes along and says exactly what you want to believe, it's easy to just nod and go "knew it!" rather than investigating the reasons for those beliefs and risking finding out something you didn't want to know.
-
How does OpenAI getting less money (with a cheaper model) mean more profit? Am I missing something?
Usually, companies will make their product, say, 25% cheaper to produce, then sell it to the public at a 20% discount (while loudly proclaiming that 20% price drop to the world) and pocket the difference as extra profit. So if OpenAI is dropping the price by x, it's safe to assume that the efficiency gains are even larger than x.
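That pricing logic can be made concrete with invented numbers (nothing below reflects OpenAI's actual costs, which are not public):

```python
# Hypothetical unit economics: production cost falls 25%, sticker price falls 20%.
price, cost = 100.0, 90.0            # invented baseline: $10 profit per unit
new_cost = cost * (1 - 0.25)         # 67.5 after the efficiency gain
new_price = price * (1 - 0.20)       # 80.0 after the advertised discount

print(f"{price - cost:.1f}")         # 10.0 profit per unit before
print(f"{new_price - new_cost:.1f}") # 12.5 profit per unit after: margin grew
```

Whether a given price cut actually leaves margin behind depends entirely on the baseline cost structure, which is unknown here.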
-
Their point is that those API prices might not match reality, and the prices may be artificially low to build hype and undercut competitors. We don't know how much it costs OpenAI, however we do know that they're not making a profit.
Or they might not. It would be a huge short-term risk to do so.
As FaceDeer said, we truly don't know.
-
BTW a lot of it seems to be just inefficient coding, as DeepSeek has shown.
Kind of? Inefficient coding is definitely a part of it, but a large part is also just the iterative nature of how these algorithms operate. We might be able to improve that a little via code optimization, but without radically changing how these engines operate, it won't make a big difference.
The scope of the data being used and trained on is probably a bigger issue, which is why there's been a push by some to move from LLMs to SLMs. We don't need the model to be cluttered with information on geology, ancient history, cooking, software development, sports trivia, etc. if it's only going to be used for looking up stuff on music and musicians.
But either way, there's a big 'diminishing returns' factor to this right now that isn't being appreciated. Typical human nature: give me that tiny boost in performance regardless of the cost, because I don't have to deal with the consequences. It's the same short-sighted shit that got us into this looming environmental crisis.