In a first, Google has released data on how much energy an AI prompt uses
-
There were people estimating 40 watt-hours per prompt in earlier threads on Lemmy, which was ridiculous.
This seems more realistic.
-
The real question is why anyone would want to use more power than a regular search engine to get answers that might confidently lie to you.
I use DuckDuckGo. I use its AI features mainly for stock projections and for searching for information on company earnings releases, because when I try to look up earnings schedules myself I get conflicting information. DDG AI is actually pretty useful for reading through troves of webpages and finding the relevant information for me in that regard.
-
In total, the median prompt—one that falls in the middle of the range of energy demand—consumes 0.24 watt-hours of electricity, the equivalent of running a standard microwave for about one second. The company also provided average estimates for the water consumption and carbon emissions associated with a text prompt to Gemini.
The human mind uses about 100 watts.
At 0.24 Wh per prompt, that would be the equivalent of about 400 questions per hour, 6 per minute, or one every 10 seconds.
That's close to human capacity.
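A quick sanity check on that arithmetic, taking Google's published 0.24 Wh median and the 100 W figure from the comment above as given (a minimal sketch, not a measurement):

```python
# Back-of-the-envelope: how many median Gemini prompts fit in a 100 W budget?
power_budget_w = 100.0        # "human mind" figure from the comment above
energy_per_prompt_wh = 0.24   # Google's published median per text prompt

prompts_per_hour = power_budget_w / energy_per_prompt_wh
print(f"~{prompts_per_hour:.0f} prompts per hour")            # ~417
print(f"~{prompts_per_hour / 60:.1f} prompts per minute")     # ~6.9
print(f"~{3600 / prompts_per_hour:.1f} seconds per prompt")   # ~8.6
```
-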
Tbh, that won't be useful, like the guy above stated.
Google searches are very similar in terms of work that needs to be done. You could expect the average and the median to be very close. For example, take these numbers: 1,1,2,2,3. The median is 2, the average is 1.8.
AI requests vary wildly. GPT-5, for example, uses multiple different internal models, ranging from very small text-only models to huge reasoning models and image generation models. While there's no way to know how much energy they use without OpenAI publishing data, you can compare how long computation takes.
For a fast, simple text-only answer, ChatGPT using GPT-5 takes a second or so to start writing and maybe 5 seconds to finish. To generate an image it might take a minute or two. And if you dump some code in there and tell it to make large adaptations, it can take 10+ minutes. That's a factor of more than 100x. If most answers are handled by the small text-only models, the median will be in the 5-second range while the average might be closer to 100 seconds, so median and average diverge a lot.
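To make that concrete, here's a toy comparison; the response-time numbers are invented for illustration, not measurements:

```python
import statistics

# Hypothetical request durations in seconds. "search_like" mirrors the
# 1,1,2,2,3 example above; "ai_like" is mostly quick text answers plus a
# few slow image generations and one long coding task. Illustrative only.
search_like = [1, 1, 2, 2, 3]
ai_like = [5] * 90 + [90] * 9 + [600]

for name, times in (("search-like", search_like), ("AI-like", ai_like)):
    print(f"{name}: median={statistics.median(times)}, "
          f"mean={statistics.mean(times):.1f}")
# search-like: median=2,   mean=1.8   -> nearly identical
# AI-like:     median=5.0, mean=18.6  -> mean is almost 4x the median
```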
good point.
-
If cheap(er/better) energy is invented, then that's good. Why would tech corpos be able to "hold the reins" of it exclusively?
Well, patents and what have you are a thing. I’m mostly thinking that I wouldn’t want e.g. Facebook to run any nuclear reactors or energy grids. That’s something I prefer the government does.
-
And academia will work on that problem. It reminds me of Intel processors that were "projected" to use kilowatts of power, until smart people made other types of chips that don't need 2,000 watts.
Academia literally got its funding cut by more than a third and Microsoft is planning to revive shut-down nuclear reactors.
You might think academia will work on the problem but the people running these things absolutely do not.
-
Academia literally got its funding cut by more than a third and Microsoft is planning to revive shut-down nuclear reactors.
You might think academia will work on the problem but the people running these things absolutely do not.
Found the American.
-
Well, patents and what have you are a thing. I’m mostly thinking that I wouldn’t want e.g. Facebook to run any nuclear reactors or energy grids. That’s something I prefer the government does.
Nuclear reactors already exist, that's not new tech.
-
Found the American.
Did the EU suddenly develop a tech industry overnight, or are you unaware of where all the major AI companies are located?
-
Because demand for data centers is rising, with AI as just one of many reasons.
But that's not as flashy as telling people it takes the energy of a small country to make a picture of a cat.
Also interesting that we're ignoring something here -- big tech is chasing cheap sources of clean energy. Don't we want cheap, clean energy?
Sir we do not make reasonable points in here, you’re supposed to hate AI irrationally and shut up.
-
I did some quick math with Meta's Llama model, and the training cost was about a flight to Europe's worth of energy. That's not a lot when you consider how many people use the model compared to how many people a flight carries.
Whatever you're imagining as the impact, it's probably a lot less. AI is much closer to video games than to things that are actually a problem for the environment, like cars, planes, deep-sea fishing, mining, etc. The impact would be virtually zero if we had a proper grid based on renewables.
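For reference, here is one way that kind of back-of-the-envelope comparison can go. Every figure below is an assumption on my part (Meta's reported ~1.7M A100 GPU-hours for Llama 2 70B, ~400 W per A100, ~60 t of jet fuel for a long-haul flight at ~43 MJ/kg), not a number from the comment above:

```python
# Rough sketch: Llama training energy vs. one long-haul flight.
# Every input below is an assumption; adjust to taste.
gpu_hours = 1_720_320         # reported GPU-hours for Llama 2 70B pretraining
gpu_power_kw = 0.4            # ~400 W per A100-80GB at full load
training_mwh = gpu_hours * gpu_power_kw / 1000    # ~688 MWh, ignores cooling/PUE

fuel_kg = 60_000              # rough fuel burn for one transatlantic flight
jet_fuel_mj_per_kg = 43       # energy density of jet fuel
flight_mwh = fuel_kg * jet_fuel_mj_per_kg / 3600  # ~717 MWh of chemical energy

print(f"training ~{training_mwh:.0f} MWh vs flight ~{flight_mwh:.0f} MWh")
```

On those particular assumptions the two land in the same ballpark, which is the comparison being made; different model generations or overhead assumptions would shift the ratio.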
Please show your math.
One Nvidia H100 DGX AI server consumes 10.2 kW at 100% utilization, meaning that running a single server for a year draws roughly as much electricity as eight typical US homes consume in that year. And this is just one 8-GPU server; it excludes the electricity required by the networking and storage hardware, let alone the electricity required to run the facility's climate control.
xAI alone has deployed hundreds of thousands of H100 or newer GPUs. Let’s SWAG 160K GPUs = ~20K DGX servers = >200MW for compute alone.
The H100 is old; the state-of-the-art GB200 NVL72 draws 120 kW per rack.
Musk is targeting not 160K but literally one million GPUs deployed by the end of this year. He has brought in multiple natural gas turbines, which he is now operating without environmental permits or controls, to the detriment of locals in Memphis.
This is just one company training one typical frontier model. There are many competitors operating at similar scale and sadly the vast majority of their new capacity is running on hydrocarbons because that’s what they can deploy at the scale they need today.
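For the fleet-level math above, a rough sketch using the same figures (10.2 kW per 8-GPU DGX, the 160K-GPU SWAG) plus an assumed ~10,700 kWh/year for a typical US household; everything here is an estimate and excludes networking, storage, and cooling:

```python
# Fleet-scale sketch of the numbers above. All inputs are rough estimates.
gpus = 160_000                  # SWAG for an H100-class training fleet
gpus_per_server = 8             # one DGX H100 node
server_power_kw = 10.2          # DGX H100 at full utilization
home_kwh_per_year = 10_700      # rough US-average household consumption

servers = gpus // gpus_per_server
compute_mw = servers * server_power_kw / 1_000
annual_gwh = compute_mw * 8_760 / 1_000              # assumes 24/7 full load
homes = annual_gwh * 1_000_000 / home_kwh_per_year   # GWh -> kWh, then homes

print(f"{servers:,} servers, ~{compute_mw:.0f} MW for compute alone")
print(f"~{annual_gwh:,.0f} GWh/year, i.e. roughly {homes:,.0f} US homes")
```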
-
Because demand for data centers is rising, with AI as just one of many reasons.
But that's not as flashy as telling people it takes the energy of a small country to make a picture of a cat.
Also interesting that we're ignoring something here -- big tech is chasing cheap sources of clean energy. Don't we want cheap, clean energy?
AI is the driver of the parabolic spike in global data center buildouts. No other use case comes close in terms of driving new YoY growth in tech infra capex spend.