In a first, Google has released data on how much energy an AI prompt uses
-
Seriously. I'd be somewhat less concerned about the impact if it were only used voluntarily. Instead, AI is compulsively shoved into every nook and cranny of digital products simply to justify its own existence.
The power requirement for training is ongoing, too: mere days after Sam Altman released a very underwhelming GPT-5, he began hyping up the next one.
I also never saw a calculation that took into account my VPS costs. The fckers scrape half the internet, warming up every server in the world connected to it. How much energy is that?
-
Because demand for data centers is rising, with AI as just one of many reasons.
But that's not as flashy as telling people it takes the energy of a small country to make a picture of a cat.
Also interesting that we're ignoring something here -- big tech is chasing cheap sources of clean energy. Don't we want cheap, clean energy?
Didn’t xitter just install a gas powered data center that’s breaking EPA rules for emissions?
-
A flight to Europe's worth of energy is a pretty asinine way to measure this. Is it not?
It's also not that small a number, being ~600 Megawatts of energy.
However, training cost is considerably less than prompting cost, making your argument incredibly biased.
Similarly, the numbers released by Google seem artificially low. Perhaps their TPUs are massively more efficient, given that they are ASICs. But they didn't seem to disclose which model they used for this measurement; it could be their smallest, least capable, most energy-efficient model, which would be disingenuous.
A Megawatt is a unit of power, not energy. It means nothing without a duration, like Megawatt-hours.
-
That's not small....
Hundreds of Gigawatts is how much energy that is. Fuel is pretty damn energy dense.
A Boeing 777 might burn 45,000 kg of fuel at a density of 47 MJ/kg, which comes out to... 600 Megawatts.
Or about 60 houses' worth of energy usage for a year in the U.S.
It's an asinine way to measure it, to be fair: not only is it incredibly ambiguous, but almost no one has any reference for how much energy that actually is.
That's not ~600 Megawatts, it's 587 Megawatt-hours.
Or in other terms that are maybe easier to understand: 5,875 fully charged 100 kWh Tesla batteries.
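For what it's worth, a quick sketch of the arithmetic behind that correction, using the fuel figures assumed above (the ~10,000 kWh/year per US household is an assumed rule of thumb, not from the thread):

```python
# Back-of-envelope check of the flight numbers above. Fuel mass (45,000 kg)
# and energy density (47 MJ/kg) are the commenter's figures.
fuel_kg = 45_000
energy_density_mj_per_kg = 47

energy_mj = fuel_kg * energy_density_mj_per_kg   # 2,115,000 MJ
energy_kwh = energy_mj * 1_000 / 3_600           # 1 kWh = 3.6 MJ
energy_mwh = energy_kwh / 1_000

tesla_batteries = energy_kwh / 100               # 100 kWh packs
houses_per_year = energy_kwh / 10_000            # assumed ~10,000 kWh/yr per US home

print(f"{energy_mwh:.1f} MWh")         # 587.5 MWh
print(f"{tesla_batteries:.0f} packs")  # 5875 packs
print(f"{houses_per_year:.0f} homes")  # 59 homes
```

So the 587 MWh and 5,875-battery figures check out; the "600 Megawatts" above was a power/energy mix-up.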
-
Yet they've never needed to commission power plants to dedicate power to these facilities. It's almost like there's some new, uniquely power-hungry demand that's driving this that's sprung up in the last 5 or so years...
huh? every data center i've been to has hundreds of megawatts of stand-by power generation which is regularly used to take up slack, because data center loads are peaky as hell. when facebook was building their first european datacenter in 2012 they were heavily criticised for their planned local power infrastructure, which was and still is diesel generators. the one being built in my area, which is a colo, has planned for over 700MW of local generation despite being built next to multiple hydro dams.
-
Because demand for data centers is rising, with AI as just one of many reasons.
But that's not as flashy as telling people it takes the energy of a small country to make a picture of a cat.
Also interesting that we're ignoring something here -- big tech is chasing cheap sources of clean energy. Don't we want cheap, clean energy?
Sure we do. Do we want the big tech corporations to hold the reins of that though?
-
Didn’t xitter just install a gas powered data center that’s breaking EPA rules for emissions?
Yes, yes it did. And as far as I can tell, it's still belching it out, just so magats can keep getting owned by it. What a world
-
In total, the median prompt—one that falls in the middle of the range of energy demand—consumes 0.24 watt-hours of electricity, the equivalent of running a standard microwave for about one second. The company also provided average estimates for the water consumption and carbon emissions associated with a text prompt to Gemini.
Thank you! I skimmed for that and gave up.
-
The company has signed agreements to buy over 22 gigawatts of power from sources including solar, wind, geothermal, and advanced nuclear projects since 2010.
None of those advanced nuclear projects are yet actually delivering power, AFAIK. They're mostly in planning stages.
The above isn't all to run AI, of course. Nobody was thinking about data centers just for AI training in 2010. But to be clear, there are 94 nuclear power reactors in the US, and a rule of thumb is that they produce about 1 GW each. So Google is buying the equivalent of roughly one quarter of the entire US nuclear fleet's output, but doing it with solar/wind/geothermal that could otherwise be used to cut our fossil fuel dependence elsewhere.
How much of that is used to run AI isn't clear here, but we know it has to be a lot.
None of those advanced nuclear projects are yet actually delivering power, AFAIK.
...and they won't be for at least 5-10 years. In the meantime they'll just use public infrastructure and then when their generation plans fall through they'll just keep doing that.
-
The big thing to me is that I want them to compare the same thing with web searches. If they want to use the median, fine, but compare the median AI query to the median Google search.
Tbh, that won't be useful, like the guy above stated.
Google searches are very similar in terms of work that needs to be done. You could expect the average and the median to be very close. For example, take these numbers: 1,1,2,2,3. The median is 2, the average is 1.8.
AI requests vary wildly. GPT-5, for example, uses multiple different internal models, ranging from very small text-only models to huge reasoning models and image-generation models. While there's no way to know how much energy they use without OpenAI publishing data, you can compare how long computation takes.
For a fast, simple text-only answer, ChatGPT using GPT-5 takes a second or so to start writing and maybe 5 seconds to finish. To generate an image, it might take a minute or two. And if you dump some code in there and tell it to make large adaptations, it can take 10+ minutes. That's a factor of more than 100x. If most answers are done by the small text-only models, then the median will be in the 5-second range while the average might be closer to 100 seconds or so, so median and average diverge a lot.
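A toy illustration of how that skew pulls mean and median apart; the request mix and durations here are entirely made up, just shaped like the examples above:

```python
# Hypothetical mix: 90 quick text answers (~5 s), 7 image generations (~90 s),
# 3 long coding tasks (~600 s). A few expensive requests drag the mean up
# while the median stays at the cheap case.
from statistics import mean, median

durations = [5] * 90 + [90] * 7 + [600] * 3  # seconds, made-up numbers

print(median(durations))  # 5.0
print(mean(durations))    # 28.8
```

Quoting only the median, as Google did, tells you almost nothing about total energy use when the distribution looks like this.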
-
The real question is why anyone would want to use more power than a regular search engine to get answers that might confidently lie to you.
-
The real question is why anyone would want to use more power than a regular search engine to get answers that might confidently lie to you.
If it's Google they would otherwise use as the search engine, search results are turning to shit. It often just doesn't show you the relevant stuff, the AI overview is wrong, and ads sometimes take up the entire first page of results. So I see why someone would rather just shout a question into the void and get a quick response instead of having to sort through five crappy results, after filtering those down from 15 possibly relevant ones.
-
Sure we do. Do we want the big tech corporations to hold the reins of that though?
If cheap(er/better) energy is invented then that's good, why would tech corpos be able to "hold the reins" of it exclusively?
-
Because training has diminishing returns: the small improvement between (for example purposes) GPT-3 and GPT-4 will need exponentially more power to have the same effect on GPT-5. In 2022 and 2023, OpenAI and DeepMind both predicted that reaching human accuracy could never be done, the latter concluding it couldn't happen even with infinite power.
So to get as close as possible, in the future they will need to get as much power as possible. Academic papers outline it as the one true bottleneck.
And academia will work on that problem. It reminds me of Intel processors that were "projected" to use kilowatts of energy; then smart people made other types of chips, and they don't need 2,000 watts.
-
There were people estimating 40 watt-hours in earlier threads on Lemmy, which was ridiculous.
This seems more realistic.
-
The real question is why anyone would want to use more power than a regular search engine to get answers that might confidently lie to you.
I use DuckDuckGo. I use its AI features mainly for stock projections and to search for information on company earnings releases, because when I try to search for an earnings schedule myself, I get conflicting information. DDG AI is actually pretty useful for reading troves of webpages and finding the relevant information for me in that regard.
-
In total, the median prompt—one that falls in the middle of the range of energy demand—consumes 0.24 watt-hours of electricity, the equivalent of running a standard microwave for about one second. The company also provided average estimates for the water consumption and carbon emissions associated with a text prompt to Gemini.
The human mind uses about 100 watts.
The equivalent would be about 400 questions per hour: roughly 7 per minute, or one every 9 seconds.
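Taking the 100 W figure and the article's 0.24 Wh per median prompt at face value, the equivalence works out like this:

```python
# Assumes 100 W of continuous "thinking" power (the figure above) and
# 0.24 Wh per median Gemini prompt (from the article).
brain_w = 100
prompt_wh = 0.24

prompts_per_hour = brain_w / prompt_wh        # Wh available per hour / Wh per prompt
seconds_per_prompt = 3600 / prompts_per_hour

print(round(prompts_per_hour))        # 417 prompts per hour
print(round(seconds_per_prompt, 2))  # 8.64 seconds between prompts
```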
That's close to human capacity.
-
Tbh, that won't be useful, like the guy above stated.
Google searches are very similar in terms of work that needs to be done. You could expect the average and the median to be very close. For example, take these numbers: 1,1,2,2,3. The median is 2, the average is 1.8.
AI requests vary wildly. GPT-5, for example, uses multiple different internal models, ranging from very small text-only models to huge reasoning models and image-generation models. While there's no way to know how much energy they use without OpenAI publishing data, you can compare how long computation takes.
For a fast, simple text-only answer, ChatGPT using GPT-5 takes a second or so to start writing and maybe 5 seconds to finish. To generate an image, it might take a minute or two. And if you dump some code in there and tell it to make large adaptations, it can take 10+ minutes. That's a factor of more than 100x. If most answers are done by the small text-only models, then the median will be in the 5-second range while the average might be closer to 100 seconds or so, so median and average diverge a lot.
good point.
-
If cheap(er/better) energy is invented then that's good, why would tech corpos be able to "hold the reins" of it exclusively?
Well, patents and what have you are a thing. I’m mostly thinking that I wouldn’t want e.g. Facebook to run any nuclear reactors or energy grids. That’s something I prefer the government does.
-
And academia will work on that problem. It reminds me of intel processors "projected" to use kilowatts of energy, then smart people made other types of chips and they don't need 2000 watts.
Academia literally got cut by more than a third and Microsoft is planning to revive breeder reactors.
You might think academia will work on the problem but the people running these things absolutely do not.