In a first, Google has released data on how much energy an AI prompt uses
-
Yet they've never needed to commission power plants to dedicate power to these facilities. It's almost like some new, uniquely power-hungry demand that's driving this has sprung up in the last 5 or so years...
Yet they’ve never needed to commission power plants to dedicate power to these facilities.
Never?
-
There were people estimating 40 W in earlier threads on Lemmy, which was ridiculous.
This seems more realistic.
I think that figure came from the article, and was based on some very flawed methodology.
-
I did some quick math with Meta's Llama model, and the training cost was about a flight to Europe's worth of energy. Not a lot when you consider the number of people who use it compared to the flight.
Whatever you're imagining as the impact, it's probably a lot less. AI is much closer to video games than to things that are actually a problem for the environment, like cars, planes, deep-sea fishing, mining, etc. The impact would be virtually zero if we had a proper grid based on renewables.
I'd like to understand what this math was before accepting this as fact.
-
If their energy consumption actually was so small, why are they seeking to use nuclear reactors to power data centres now?
That's not small....
100's of Gigawatts is how much energy that is. Fuel is pretty damn energy dense.
A Boeing 777 might burn 45,000 kg of fuel, at a density of 47 MJ/kg. Which comes out to... 600 Megawatts
Or about 60 houses' energy usage for a year in the U.S.
It's an asinine way to measure it, to be fair: not only is it incredibly ambiguous, but almost no one has any reference for how much energy that actually is.
-
I did some quick math with Meta's Llama model, and the training cost was about a flight to Europe's worth of energy. Not a lot when you consider the number of people who use it compared to the flight.
Whatever you're imagining as the impact, it's probably a lot less. AI is much closer to video games than to things that are actually a problem for the environment, like cars, planes, deep-sea fishing, mining, etc. The impact would be virtually zero if we had a proper grid based on renewables.
A flight to Europe's worth of energy is a pretty asinine way to measure this. Is it not?
It's also not that small a number, being ~600 Megawatts of energy.
However, training cost is considerably less than prompting cost, making your argument incredibly biased.
Similarly, the numbers released by Google seem artificially low; perhaps their TPUs are massively more efficient, given that they are ASICs. But they did not seem to disclose which model they used for this measurement. It could be their smallest, least capable, most energy-efficient model, which would be disingenuous.
-
Seriously. I'd be somewhat less concerned about the impact if it were only voluntarily used. Instead, AI is compulsively shoved into every nook and cranny of every digital product simply to justify its own existence.
The power requirement for training is ongoing: mere days after Sam Altman released a very underwhelming GPT-5, he began hyping up the next one.
I also never saw a calculation that took into account my VPS costs. The fckers scrape half the internet, warming up every server connected to it. How much energy is that?
-
Because demand for data centers is rising, with AI as just one of many reasons.
But that's not as flashy as telling people it takes the energy of a small country to make a picture of a cat.
Also interesting that we're ignoring something here -- big tech is chasing cheap sources of clean energy. Don't we want cheap, clean energy?
Didn’t xitter just install a gas-powered data center that’s breaking EPA rules for emissions?
-
A flight to Europe's worth of energy is a pretty asinine way to measure this. Is it not?
It's also not that small a number, being ~600 Megawatts of energy.
However, training cost is considerably less than prompting cost, making your argument incredibly biased.
Similarly, the numbers released by Google seem artificially low; perhaps their TPUs are massively more efficient, given that they are ASICs. But they did not seem to disclose which model they used for this measurement. It could be their smallest, least capable, most energy-efficient model, which would be disingenuous.
A Megawatt is a unit of power, not energy. It means nothing without including a duration, like Megawatt-hours.
-
That's not small....
100's of Gigawatts is how much energy that is. Fuel is pretty damn energy dense.
A Boeing 777 might burn 45,000 kg of fuel, at a density of 47 MJ/kg. Which comes out to... 600 Megawatts
Or about 60 houses' energy usage for a year in the U.S.
It's an asinine way to measure it, to be fair: not only is it incredibly ambiguous, but almost no one has any reference for how much energy that actually is.
That's not ~600 Megawatts, it's 587 Megawatt-hours.
Or in other terms that are maybe easier to understand: 5875 fully charged 100kWh Tesla batteries.
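The correction above can be double-checked with a few lines of arithmetic, using only the figures quoted in the thread (45,000 kg of fuel at 47 MJ/kg, and the 100 kWh battery comparison):

```python
# Energy in the quoted Boeing 777 fuel burn, converted to MWh.
fuel_kg = 45_000          # fuel burn figure from the comment above
density_mj_per_kg = 47    # jet fuel energy density from the comment above

total_mj = fuel_kg * density_mj_per_kg        # 2,115,000 MJ
total_kwh = total_mj * 1_000_000 / 3_600_000  # 1 kWh = 3.6 MJ
total_mwh = total_kwh / 1_000

print(round(total_mwh, 1))    # 587.5 -- megawatt-hours (energy), not megawatts (power)
print(int(total_kwh // 100))  # 5875 -- fully charged 100 kWh Tesla batteries
```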
-
Yet they've never needed to commission power plants to dedicate power to these facilities. It's almost like some new, uniquely power-hungry demand that's driving this has sprung up in the last 5 or so years...
Huh? Every data center I've been to has hundreds of megawatts of stand-by power generation, which is regularly used to take up slack, because data center loads are peaky as hell. When Facebook was building their first European datacenter in 2012, they were heavily criticised for their planned local power infrastructure, which was and still is diesel generators. The one being built in my area, which is a colo, has planned for over 700 MW of local generation despite being built next to multiple hydro dams.
-
Because demand for data centers is rising, with AI as just one of many reasons.
But that's not as flashy as telling people it takes the energy of a small country to make a picture of a cat.
Also interesting that we're ignoring something here -- big tech is chasing cheap sources of clean energy. Don't we want cheap, clean energy?
Sure we do. Do we want the big tech corporations to hold the reins of that though?
-
Didn’t xitter just install a gas-powered data center that’s breaking EPA rules for emissions?
Yes, yes it did. And as far as I can tell, it's still belching it out, just so magats can keep getting owned by it. What a world.
-
In total, the median prompt—one that falls in the middle of the range of energy demand—consumes 0.24 watt-hours of electricity, the equivalent of running a standard microwave for about one second. The company also provided average estimates for the water consumption and carbon emissions associated with a text prompt to Gemini.
Thank you! I skimmed for that and gave up.
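The article's microwave comparison is easy to sanity-check. The ~900 W figure below is my assumption for a "standard" microwave, since the article doesn't give a wattage:

```python
# How long would a ~900 W microwave take to use one median prompt's energy?
prompt_wh = 0.24     # median Gemini text prompt, per the article
microwave_w = 900    # assumed power draw of a "standard" microwave

prompt_joules = prompt_wh * 3600       # 1 Wh = 3600 J
seconds = prompt_joules / microwave_w  # roughly one second, as the article says
print(round(seconds, 2))
```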