OpenAI will not disclose GPT-5’s energy use. It could be higher than past models
-
It's the same tech. It would have to be bigger or chew through "reasoning" tokens to beat benchmarks. So yeah, of course it is.
-
I have to test it with Copilot for work. So far, in my experience its "enhanced capabilities" mostly involve doing things I didn't ask it to do extremely quickly. For example, it massively fucked up the CSS in an experimental project when I instructed it to extract a React element into its own file.
That's literally all I wanted it to do, yet it took it upon itself to make all sorts of changes to styling for the entire application. I ended up reverting all of its changes and extracting the element myself.
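For scale, the entire job looks something like this (a minimal sketch with a hypothetical Header component; no stylesheet needs to change at all):

```typescript
// Header.tsx (new file): the extracted element, moved over verbatim.
// "Header" and its props are hypothetical stand-ins for the real component.
export function Header({ title }: { title: string }) {
  // The existing CSS class stays untouched; only the component's location changes.
  return <h1 className="app-header">{title}</h1>;
}

// App.tsx: the only other edit is importing the component instead of defining it inline.
import { Header } from "./Header";
```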
Suffice to say, I will not be recommending GPT-5 going forward.
-
Obviously it's higher. If it was any lower, they would've made a huge announcement out of it to prove they're better than the competition.
-
> I have to test it with Copilot for work. So far, in my experience its "enhanced capabilities" mostly involve doing things I didn't ask it to do extremely quickly. For example, it massively fucked up the CSS in an experimental project when I instructed it to extract a React element into its own file.
> That's literally all I wanted it to do, yet it took it upon itself to make all sorts of changes to styling for the entire application. I ended up reverting all of its changes and extracting the element myself.
> Suffice to say, I will not be recommending GPT-5 going forward.
That's my problem with "AI" in general. It's seemingly impossible to "engineer" a complete piece of software when using LLMs in any capacity that isn't editing a line or two inside singular functions. Too many times I've asked GPT/Gemini to make a small change to a file and had to revert the request because it'd take it upon itself to re-engineer the architecture of my entire application.
-
Duh. Every company like this "suddenly" starts withholding public progress reports once their progress fucking goes downhill. Stop giving these parasites handouts.
-
> I have to test it with Copilot for work. So far, in my experience its "enhanced capabilities" mostly involve doing things I didn't ask it to do extremely quickly. For example, it massively fucked up the CSS in an experimental project when I instructed it to extract a React element into its own file.
> That's literally all I wanted it to do, yet it took it upon itself to make all sorts of changes to styling for the entire application. I ended up reverting all of its changes and extracting the element myself.
> Suffice to say, I will not be recommending GPT-5 going forward.
We moved to M365 and were encouraged to try the new elements. I gave Copilot an Excel sheet and told it to add 5% to each percent in column B and not to go over 100%. It spat out jumbled-up data, all reading 6000%.
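For reference, the deterministic version of that request is only a few lines. A sketch in TypeScript using the SheetJS xlsx package, assuming column B holds fractions (0.42 = 42%), "add 5%" means five percentage points, and there's no header row; the file name is a placeholder:

```typescript
import * as XLSX from "xlsx"; // npm install xlsx (SheetJS)

const workbook = XLSX.readFile("sheet.xlsx"); // hypothetical file name
const sheet = workbook.Sheets[workbook.SheetNames[0]];

// Key each row by column letter ("A", "B", ...) rather than by header names.
const rows = XLSX.utils.sheet_to_json<Record<string, number>>(sheet, { header: "A" });

for (const row of rows) {
  // Add 5 percentage points to column B, capped at 100%.
  row.B = Math.min(row.B + 0.05, 1.0);
}

workbook.Sheets[workbook.SheetNames[0]] = XLSX.utils.json_to_sheet(rows, { skipHeader: true });
XLSX.writeFile(workbook, "sheet.xlsx");
```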
-
So, like, is this whole AI bubble being funded directly by the fossil fuel industry or something? Because AI training and the instantaneous global adoption of these models are using energy like it's going out of style. Which fossil fuels actually are (going out of style, and being used to power these data centers). Could there be a link? Gotta find a way to burn all the rest of the oil and gas we can get out of the ground before laws make it illegal. Makes sense, in their traditional who-gives-a-fuck-about-the-climate-and-environment sort of way, doesn't it?
-
> I have to test it with Copilot for work. So far, in my experience its "enhanced capabilities" mostly involve doing things I didn't ask it to do extremely quickly. For example, it massively fucked up the CSS in an experimental project when I instructed it to extract a React element into its own file.
> That's literally all I wanted it to do, yet it took it upon itself to make all sorts of changes to styling for the entire application. I ended up reverting all of its changes and extracting the element myself.
> Suffice to say, I will not be recommending GPT-5 going forward.
Sounds like you forgot to instruct it to do a good job.
-
Sam Altman looks like an SNL actor impersonating Sam Altman.
-
So more energy use for what the people who are into AI are calling a worse model. Is someone going to get fired for this?
-
> That's my problem with "AI" in general. It's seemingly impossible to "engineer" a complete piece of software when using LLMs in any capacity that isn't editing a line or two inside singular functions. Too many times I've asked GPT/Gemini to make a small change to a file and had to revert the request because it'd take it upon itself to re-engineer the architecture of my entire application.
I make it write entire functions for me: one prompt = one small feature, or sometimes one or two functions that are part of a feature, or one refactoring. I make manual edits fast and prompt the next step. It easily does things for me like parsing obscure binary formats or threading a new piece of state through the whole application to the levels where it's needed, or doing massive refactorings. Idk why it works so well for me and so badly for other people; maybe it loves me. I've only ever used 4.1 and possibly 4o in free mode in Copilot.
-
> So, like, is this whole AI bubble being funded directly by the fossil fuel industry or something? Because AI training and the instantaneous global adoption of these models are using energy like it's going out of style. Which fossil fuels actually are (going out of style, and being used to power these data centers). Could there be a link? Gotta find a way to burn all the rest of the oil and gas we can get out of the ground before laws make it illegal. Makes sense, in their traditional who-gives-a-fuck-about-the-climate-and-environment sort of way, doesn't it?
I mean, AI is using something like 1-2% of the world's electricity, and that's fucking wild.
My takeaway is that we need more clean energy generation. Good thing we've got countries like China leading the way in nuclear and renewables!!
-
> I make it write entire functions for me: one prompt = one small feature, or sometimes one or two functions that are part of a feature, or one refactoring. I make manual edits fast and prompt the next step. It easily does things for me like parsing obscure binary formats or threading a new piece of state through the whole application to the levels where it's needed, or doing massive refactorings. Idk why it works so well for me and so badly for other people; maybe it loves me. I've only ever used 4.1 and possibly 4o in free mode in Copilot.
It's a lot of people not understanding the kinds of things it can do vs the things it can't do.
It was like when people tried to search early Google by typing plain-language queries ("What is the best restaurant in town?") and got bad results. The search engine had limited capabilities, and understanding language wasn't one of them.
If you ask an LLM to write a function to print the sum of two numbers, it can do that with a high success rate. If you ask it to create a new operating system, it will produce hilariously bad results.
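The easy case really is that constrained; the whole spec fits in the function name, which is roughly why models almost never miss it:

```typescript
// The "high success rate" end of the spectrum: print the sum of two numbers.
function printSum(a: number, b: number): void {
  console.log(a + b);
}

printSum(2, 3); // prints 5
```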
-
> I mean, AI is using something like 1-2% of the world's electricity, and that's fucking wild.
> My takeaway is that we need more clean energy generation. Good thing we've got countries like China leading the way in nuclear and renewables!!
All I know is that I'm getting real tired of this Matrix / Idiocracy Mash-up Movie we're living in.
-
> Obviously it's higher. If it was any lower, they would've made a huge announcement out of it to prove they're better than the competition.
I'm thinking otherwise. I think GPT-5 is a much smaller model, with some fallback to previous models if required.
Since it's running on the exact same hardware with a mostly similar algorithm, using less energy would directly mean it's a "less intense" model, which translates into inferior quality in American Investor Language (AIL).
And 2025's investors don't give a flying fuck about energy efficiency.
-
> I have to test it with Copilot for work. So far, in my experience its "enhanced capabilities" mostly involve doing things I didn't ask it to do extremely quickly. For example, it massively fucked up the CSS in an experimental project when I instructed it to extract a React element into its own file.
> That's literally all I wanted it to do, yet it took it upon itself to make all sorts of changes to styling for the entire application. I ended up reverting all of its changes and extracting the element myself.
> Suffice to say, I will not be recommending GPT-5 going forward.
AI assumes too fucking much. I used it to set up a new 3D printer with Klipper to save some searching.
Half the shit it pulled down was Marlin-oriented, and then it had the gall to blame the config it gave me for it, like I wrote it.
"Motherfucker, listen here..."
-
> Sounds like you forgot to instruct it to do a good job.
"If you do anything else then what i asked your mother dies"
-
> I'm thinking otherwise. I think GPT-5 is a much smaller model, with some fallback to previous models if required.
> Since it's running on the exact same hardware with a mostly similar algorithm, using less energy would directly mean it's a "less intense" model, which translates into inferior quality in American Investor Language (AIL).
> And 2025's investors don't give a flying fuck about energy efficiency.
They probably wouldn't really care how efficient it is, but they certainly would care that the costs are lower.
-
Photographer1: Sam, could you give us a goofier face?
*click* *click*
Photographer2: Goofier!!
*click* *click* *click* *click*