OpenAI will not disclose GPT-5’s energy use. It could be higher than past models
Experts working to benchmark resource use of AI models say new version’s enhanced capabilities come at a steep cost
the Guardian (www.theguardian.com)
-
Bingo. If you routinely use LLMs/AI you've recently seen it first hand. ALL of them have become noticeably worse over the past few months. Even if you're simply using it as a basic tool, it's worse. Claude, for all the praise it receives, has also gotten worse. I've noticed it starting to forget context or constantly contradicting itself. Even Claude Code.
The release of GPT-5 is proof that a wall has been hit and the bubble is bursting. There's nothing left to train on and all the LLMs have been consuming each other's waste as a result. I've talked about it on here several times already because of my work, but companies are also seeing this. They're scrambling to undo the fuck-up of using AI to build their stuff. None of what they used it to build scales. None of it. And you go on LinkedIn and see all the techbros desperately trying to hype the mounds of shit that remain.
I don't know what's next for AI but this current generation of it is dying. It didn't work.
Any studies about this "getting worse", or just anecdotes? I do routinely use them and I feel they are getting better (my workplace uses the Google suite so I have access to Gemini). Just last week it helped me debug an IPv6 RA problem that I couldn't crack, and I learned a few useful commands on the way.
-
How can anyone look at that face and trust anything that madman could have to say?
-
Wait, so the biggest improvement came when there was a massive decline in demand?
The demand didn't decline. The state imposed a strict, high barrier to trade that prevented it from being fulfilled.
-
If anyone has ever wondered what it would look like if tech giants went all in on "brute force" programming, this is it. This is what it looks like.
-
What happens to that heat in summer?
There are experimental storage setups where heat is pumped into underground pools or sand, but as far as I know it's mostly heat exchangers and radiators to the outside, so the majority of the excess heat is just dumped outdoors. The vast majority of them are closed-loop systems, though, since you need something other than plain water anyway to prevent freezing in the winter.
-
I disagree.
There's so much proud ignorance in this world, it's great to always remind the average person how they're contributing to problems they don't even realize they're causing.
We should bring their hypocrisy to light, or else they will legitimately go through life thinking that there isn't a problem.
So when people tell you about the positive changes they've made, you in return like to point out that they still aren't living up to your standards? I'm not sure that's the way to motivate and inspire people to do better. But you do you, I guess.
-
One beef burger is equivalent in water consumption to 7.5 million ChatGPT queries.
I did a quick calculation and got around 500 queries per quarter pounder. A lot of guesstimation and rounding, but I'm pretty sure I got close enough to know that you're off by quite a lot.
Edit to add:
I used 21.9 kg of CO2 per 1 kg of beef and 4.32 grams per ChatGPT query for my rough estimate. However, that 4.32 number is already over a year old. Chances are it's way outdated, but everyone still keeps quoting it. It definitely does not take into account that ChatGPT often "thinks" now, because chain of thought is likely as expensive as multiple queries by itself. Additionally, the models are more advanced than a year ago, but also more costly, and the CO2 figure everyone keeps quoting doesn't even mention which model was used. If anyone can find the original source of this number I'd be very curious.
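For what it's worth, here's roughly how that estimate works out with those figures (the ~113 g of beef per quarter pounder is my own assumption; the other two numbers are the ones quoted above):

```python
# Back-of-envelope check using the figures quoted above.
# Assumption: a quarter pounder is ~113 g (4 oz) of beef.
beef_kg = 0.113                  # kg of beef per burger (assumed)
co2_per_kg_beef_g = 21.9 * 1000  # grams of CO2 per kg of beef (figure above)
co2_per_query_g = 4.32           # grams of CO2 per ChatGPT query (figure above)

queries_per_burger = beef_kg * co2_per_kg_beef_g / co2_per_query_g
print(round(queries_per_burger))  # ~573, i.e. "around 500" after all the rounding
```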
-
I can't imagine any sane person who lives their life guided by marketing hype instead of direct knowledge and experience.
I mean fair enough but also... That makes the vast majority of managers, MBAs, salespeople and "normies" like your grandma and Uncle Bob insane.
Actually questioning stuff that salespeople tell you and using critical thinking is a pretty rare skill in this day and age.
That makes the vast majority of managers, MBAs, salespeople and "normies" like your grandma and Uncle Bob insane.
Correct, most of these people are insane. The average person is so fucking dumb and insane today it's mind-numbing.
-
They literally don't know. "GPT-5" is several models, with a gating model in front that chooses which one to use depending on how "hard" it thinks the question is. They've already been tweaking the front-end to change where it cuts over, and they're definitely going to keep changing it.
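For anyone unfamiliar with the pattern, the gating idea is roughly like this toy sketch (purely illustrative; the heuristic, thresholds, and model names are all made up, since OpenAI hasn't published how its router actually decides):

```python
# Toy illustration of a router sitting in front of several models.
# Nothing here reflects OpenAI's actual implementation.

def looks_hard(prompt: str) -> bool:
    """Crude difficulty heuristic: long prompts or 'reasoning-ish' keywords."""
    keywords = ("prove", "step by step", "debug", "optimize", "why")
    return len(prompt) > 500 or any(k in prompt.lower() for k in keywords)

def route(prompt: str) -> str:
    """Pick which backend model handles the request."""
    return "big-reasoning-model" if looks_hard(prompt) else "small-cheap-model"

print(route("What's the capital of France?"))            # small-cheap-model
print(route("Debug this race condition step by step."))  # big-reasoning-model
```

Which is also why a single per-query energy number is so hard to pin down: the cost of a request depends on which side of that cutover it lands on, and the cutover itself keeps moving.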
-
brings in another problem, so they have to use generators, or undersea cables.
Fine: make it a data center powered by water-wheel generators. Water-powered AND cooled!
-
Is there any picture of the guy without his hand up like that?
He looks like an old Chuck E Cheese animatronic. Like someone powered him down and he returned to default rest/storage mode.
-
I believe in verifiable statements and so far, with few exceptions, I've seen nothing. We are now speculating on magical numbers that we can't see, but we know that AI is demanding and we know that even small models are not free. The only accessible data comes from Mistral; most other AI devs are not exactly happy to share the inner workings of their tools.
Even then, Mistral didn't release all their data, and even if they did, it would only apply to Mistral 7B and above, not to ChatGPT.
The only accessible data comes from Mistral; most other AI devs are not exactly happy to share the inner workings of their tools.
Important to point out that this is really only true of Western AI companies. Chinese AI models have mostly been open source, with open papers.
-
Well, I mean, also that they kinda suck. I feel like I spend more time debugging AI code than I get working code.
Do you use Claude Code? It's the only time I've had 90%+ success rate.
-
It's not, you're just personally insulted. The livestock industry is responsible for about 15% of human-caused greenhouse gas emissions. That's not negligible.
you're just personally insulted.
I swear to God this attitude is why people don't like what you're saying. I am all for weighing the two against each other, but the "I am more moral than thou" is why I left the church.
-
Do you use Claude Code? It's the only time I've had 90%+ success rate.
I have, and it doesn't, at least not on the DevOps stuff I work on.
-
Pump it Sammy, pump it harder!!
-
If he did, would it change your views on eating meat at all?
Probably not, since I'm a vegan.
-
It's good to bring out the hypocrisy among you people.
I like watching you squirm whenever a logical argument causes your cognitive dissonance to flare up.
The average person is an insecure moron, and it's funny.
Le reddit moment
-
Death Industry sounds like it would be an awesome band name.
true lol