ChatGPT 5 power consumption could be as much as eight times higher than GPT-4 — research institute estimates medium-sized GPT-5 response can consume up to 40 watt-hours of electricity
-
OK I guess I didn't read far enough, but your quote says that DeepSeek uses less than OpenAI?
Less than OpenAI's o3, but that's because o3 was estimated to use even more power than GPT-5's 18 Wh per query.
-
The University of Rhode Island's AI lab estimates that GPT-5 averages just over 18 Wh per query, so putting all of ChatGPT's reported 2.5 billion requests a day through the model could see energy usage as high as 45 GWh.
A daily energy use of 45 GWh is enormous. A typical modern nuclear reactor produces between 1 and 1.6 GW of electrical power (power is already a rate, so no "per hour"), and 45 GWh spread over a day averages out to about 1.9 GW. Data centers running OpenAI's GPT-5 at 18 Wh per query could therefore require the output of roughly two such reactors, an amount that could be enough to power a small country.
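The arithmetic above can be sanity-checked in a few lines (this assumes the article's 18 Wh/query and 2.5 billion queries/day estimates, which are themselves contested in this thread):

```python
# Back-of-envelope check of the 45 GWh/day claim.

WH_PER_QUERY = 18          # Wh, University of Rhode Island AI lab estimate
QUERIES_PER_DAY = 2.5e9    # ChatGPT's reported daily requests

daily_energy_gwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e9  # Wh -> GWh
average_power_gw = daily_energy_gwh / 24                 # GWh per day -> GW

print(f"{daily_energy_gwh:.0f} GWh/day")     # 45 GWh/day
print(f"{average_power_gw:.2f} GW average")  # ~1.88 GW, about two 1 GW reactors
```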
And how many milliwatts does an actual brain use?
-
And an LLM that you could run local on a flash drive will do most of what it can do.
Can you give an example?
-
How slow?
Loading up a website with Flash and GIFs on '90s dial-up slow... or worse?
It's horrendously slow, unusable imo. With the larger DeepSeek distilled models I tried that didn't fit into VRAM, you could easily wait 5 minutes until it was done writing its essay, compared to just a few seconds when it does fit. But that's with an RTX 3070 Ti, not something the average ChatGPT user has lying around.
-
And how many milliwatts does an actual brain use?
It's shockingly tricky to answer precisely, but the commonly held value is that a human brain runs on about 20 W, regardless of the computational load placed on it.
-
It's shockingly tricky to answer precisely, but the commonly held value is that a human brain runs on about 20 W, regardless of the computational load placed on it.
So maybe 0.5 kWh/day. Still vastly more efficient than a code-based one.
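Spelling that out (assuming the ~20 W brain figure above and the article's 18 Wh/query estimate for GPT-5):

```python
# A 20 W brain over a full day, expressed in the thread's units.

BRAIN_POWER_W = 20
HOURS_PER_DAY = 24

brain_kwh_per_day = BRAIN_POWER_W * HOURS_PER_DAY / 1000  # Wh -> kWh
queries_equivalent = BRAIN_POWER_W * HOURS_PER_DAY / 18   # vs 18 Wh/query

print(brain_kwh_per_day)          # 0.48 kWh/day
print(round(queries_equivalent))  # ~27 GPT-5 queries' worth of energy
```

So a whole day of brain activity costs about as much energy as a couple dozen GPT-5 queries, on these estimates.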
-
I have an extreme dislike for OpenAI, Altman, and people like him, but the reasoning behind this article is just stuff some guy has pulled from his backside. There are no facts here, it's just "I believe XYZ" with nothing to back it up.
We don't need to make up nonsense about the LLM bubble. There's plenty of valid enough criticisms as is.
By circulating a dumb figure like this, all you're doing is granting OpenAI the power to come out and say "actually, it only uses X amount of power. We're so great!", where X is a figure that on its own would seem bad, but compared to this inflated figure sounds great. Don't hand these shitty companies a marketing win.
Yeah, I saw some people speculating that GPT-5 performs worse on some tasks than older models but is still being forced on everyone because it's a consolidation and cost-cutting release. But how could it be cutting costs if it consumes 8 times more power?
-
There's such a huge gap between what I read about GPT-5 online, versus the overwhelmingly disappointing results I get from it for both coding and general questions.
I'm beginning to think we're in the end stages of Dead Internet, where basically nothing you see online has any connection to reality.
The stock market is barely connected to reality and that is required to be updated every 3 months by every single company. Just imagine what the internet's going to be like.
-
There's such a huge gap between what I read about GPT-5 online, versus the overwhelmingly disappointing results I get from it for both coding and general questions.
I'm beginning to think we're in the end stages of Dead Internet, where basically nothing you see online has any connection to reality.
Well yeah, it's a for-profit company. They exist solely to make money, that's their entire goal.
It's almost all marketing and has been for a while. ChatGPT peaked with 4o (and 4.5 if you used their API), 4.1 was a step backwards despite them calling it an improvement, and 5 was another step backwards.
They are not any closer to AGI, and we're not going to see AGI from LLMs no matter how much they claim we're on the verge of it.
-
I was just thinking: in the more affordable electric regions of the US, that's about $5 worth of electricity per thousand requests. You'd tip a concierge $5 for most answers you get from ChatGPT (if they could provide them...), and the concierge is likely going to spend that $5 on a gallon and a half of gasoline, which generates a whole lot more CO2 than the nuclear / hydro / solar generation mix in those reasonably priced electric regions of the US.
That doesn't seem right. By my calculations it should be like 5¢. Can you show your work?
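For what it's worth, here's the arithmetic using the article's own figures (assuming a ~$0.12/kWh rate, roughly what cheap US regions pay; the article gives 18 Wh/query on average and 40 Wh for a "medium" response):

```python
# Cost of electricity per thousand queries at a given Wh/query figure.

PRICE_PER_KWH = 0.12  # USD, assumed cheap-ish US residential rate

def cost_per_thousand(wh_per_query):
    kwh = wh_per_query * 1000 / 1000  # Wh/query * 1000 queries -> kWh
    return kwh * PRICE_PER_KWH

print(cost_per_thousand(18))  # ~2.16 -> a couple of dollars, not 5 cents
print(cost_per_thousand(40))  # ~4.8  -> close to the $5 figure above
```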
-
And how many milliwatts does an actual brain use?
1.21 jigawatts
-
1.21 jigawatts
Ahh but you need 8 for a round trip
-
They vibe-calculated it.
They asked ChatGPT 4 about ChatGPT 5's power consumption.
-