steve_rambo · a year ago
> ChatGPT consumes 25 times more energy than Google

> ChatGPT consumes a lot of energy in the process, up to 25 times more than a Google search

Dead Comment

123yawaworht456 · a year ago
as much as I hate to defend openai of all the fucking things, this article is generic ragebait for bobos

>However, ChatGPT consumes a lot of energy in the process, up to 25 times more than a Google search.

per what metric?

>Per conversation of about 20 to 50 queries, half a litre of water evaporates – a small bottle, in other words.

closed circuit cooling systems don't evaporate any fucking water

>Consulting firm Gartner calculated that at this rate, AI will account for up to 3.5% of global electricity demand by 2030.

a number pulled out of someone's ass. trying to predict anything related to generative AI is pure hubris.

londons_explore · a year ago
> closed circuit cooling systems don't evaporate any fucking water.

Nobody uses closed circuit cooling systems anymore. Pretty much every datacenter is cooled by evaporative (open loop) cooling.

The enthalpy of vaporization of water is pretty high, so for a relatively large energy use, they only need to evaporate quite small amounts of water. A single household water supply (10GPM), on full blast, could cool 1.5 megawatts of computers - a small datacenter.
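The commenter's figure checks out against the latent heat of vaporization of water (~2.26 MJ/kg); the 10 GPM flow rate is the commenter's number, the rest is standard physics:

```python
# Back-of-envelope check: cooling capacity of fully evaporating a 10 GPM supply.
LATENT_HEAT_J_PER_KG = 2.26e6     # enthalpy of vaporization of water
GPM_TO_KG_PER_S = 3.785 / 60      # 1 US gallon of water ~ 3.785 kg, per minute

flow_kg_per_s = 10 * GPM_TO_KG_PER_S                # household supply on full blast
cooling_w = flow_kg_per_s * LATENT_HEAT_J_PER_KG    # heat removed if all of it evaporates

print(f"{cooling_w / 1e6:.2f} MW")  # prints 1.43 MW, consistent with the ~1.5 MW claim
```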

Arnt · a year ago
I predict that VCs will not pay for 3.5% of global electricity use by unprofitable companies.
j45 · a year ago
Each model seems to be becoming more efficient and effective, so maybe it's a trend that doesn't continue on the average-use side.
Me1000 · a year ago
Thanks to most of the world being "GPU poor", there is a lot of research and engineering effort going into making models much more compute-efficient. Another way that OpenAI gets to benefit from the world of open-source/open-weight models.
fahrradflucht · a year ago
I think there are still _a lot_ of use cases that are currently prohibitively expensive for which increased efficiency will immediately induce matching demand.
j45 · a year ago
Fair point. I think the lagging indicator is how capable alternate models are becoming while using less horsepower.
anonzzzies · a year ago
We have an ambitious goal: mammal-brain power consumption.
j45 · a year ago
That's a very tall ask lol.

There was a post about a detailed imaging study of a tiny part of the brain, and it contained 1.7 petabytes of data or something.

CPUs and GPUs have a long way to go beyond nm production.

nordstreem · a year ago
Maybe. But there is no maybe about the mess being left for the future.

If future people are forfeit as we refuse to sacrifice today, why preserve people today? I say bring chaos now so the mess makers deal with their externalities for a change instead of waving them off to make “line go up”.

j45 · a year ago
What you are saying would mean people not learning anything new or different.

There is lots that only humans can do, including things that a sentence predictor presented as AI can't imagine. It's getting better at reasoning for sure, but net new, novel stuff, I think, is still the domain of humans.

It's true, though, that GPT may push a lot of people to have to change and grow, and it might be hardest for the people who got away with BSing. I've never had that luxury and have always had to deliver, so I guess it feels a little less frightening.

Either way, I figure it's easier to learn which way the currents in this realm are flowing, to recognize them better, than to sit on the side of the pool looking at it with disdain.

We stop growing not only when we stop learning, but when we stop creating.

lotsofpulp · a year ago
Maybe it will work out if ChatGPT leads to fewer people being needed, and it ends up using less water than those people would.
jeffbee · a year ago
The CO2 footprint of training GPT-3, 502 tons (est.), is approximately the carbon output of the air travel industry every 5 seconds. Anyone who writes about the carbon footprint of machine learning is being paid to mislead you.
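As a rough cross-check of this comparison (assuming ~1 Gt of CO2 per year for commercial aviation, a commonly cited pre-pandemic figure and an assumption here; the 502-ton training estimate is the one quoted in the comment):

```python
# How long does global aviation take to emit GPT-3's estimated training footprint?
AVIATION_T_PER_YEAR = 1.0e9            # assumed: ~1 Gt CO2/yr for commercial aviation
SECONDS_PER_YEAR = 365.25 * 24 * 3600
GPT3_TRAINING_T = 502                  # estimate quoted in the comment

aviation_t_per_s = AVIATION_T_PER_YEAR / SECONDS_PER_YEAR
seconds = GPT3_TRAINING_T / aviation_t_per_s
print(f"{seconds:.0f} s")  # prints 16 s
```

With that assumption the answer comes out closer to ~16 seconds than 5, but the order of magnitude, and the point, hold either way: a one-time training run is tiny next to aviation's ongoing emissions.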
LordDragonfang · a year ago
>However, ChatGPT consumes a lot of energy in the process, up to 25 times more than a Google search.

That headline is misleading (and should probably be updated): 25x more energy than a single search, not 25x more than the whole company. Which anyone who knows anything about the two could tell you - one is running inference, the other is a database lookup. In fact, I'd be surprised if it's only 25x in that case.

dnissley · a year ago
So the real question is whether ChatGPT is 25x more useful than a google search, which goes undiscussed in this article.
londons_explore · a year ago
[Citation Needed]

Both companies keep their energy use highly secret, so I doubt this article is based on anything more than a researcher's guess.
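For what it's worth, neither company publishes per-query numbers, so figures like "25x" come from back-of-envelope estimates like the one below. Both inputs are third-party estimates (assumptions, not disclosures), and swapping in different published estimates moves the result anywhere from roughly 10x to 25x:

```python
# Rough per-query energy comparison. Both figures are external estimates,
# not company disclosures; published estimates vary several-fold.
CHATGPT_WH_PER_QUERY = 2.9   # assumed: a widely cited third-party estimate
GOOGLE_WH_PER_SEARCH = 0.3   # assumed: the commonly quoted per-search figure

ratio = CHATGPT_WH_PER_QUERY / GOOGLE_WH_PER_SEARCH
print(f"{ratio:.0f}x")  # prints 10x with these particular inputs
```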

jeffbee · a year ago
Not really? Google's sustainability report gives datacenter energy usage and PUE, and you can approximately apportion energy usage to each site based on its size. David Patterson and his team have issued a number of papers and talks with precise figures for various ML tasks, such as "Carbon Emissions and Large Neural Network Training", "The carbon footprint of machine learning training will plateau, then shrink", and "GLaM: Efficient Scaling of Language Models with Mixture-of-Experts".
rmorey · a year ago
There is no source on this in the article? Anyone have an idea where they got this from?