Thanks to most of the world being "GPU poor", there is a lot of research and engineering effort going into making models much more compute-efficient. That's another way OpenAI gets to benefit from the world of open-source/open-weight models.
I think there are still _a lot_ of use cases that are currently prohibitively expensive, and for those, increased efficiency will immediately induce matching demand.
Maybe. But there is no maybe about the mess being left for the future.
If future people are forfeit because we refuse to sacrifice today, why preserve people today? I say bring the chaos now, so the mess-makers deal with their externalities for a change instead of waving them off to make “line go up”.
What you are saying amounts to people never learning anything new or different.
There is a lot that only humans can do, including things that a sentence predictor presented as AI can't imagine. It's getting better at reasoning, for sure, but net-new, novel work is, I think, still mostly a human domain.
It's true, though, that GPT may push a lot of people to have to change and grow, and it might be hardest for the people who got away with BSing. I've never had that luxury and have always had to deliver, so I guess it feels a little less frightening.
Either way, I figure it's easier to learn which way the currents in this realm are flowing, so as to recognize them better, than to sit at the side of the pool looking on with disdain.
We stop growing not only when we stop learning, but when we stop creating.
The CO2 footprint of training GPT-3, 502 tons (est.), is approximately the carbon output of the air travel industry every 5 seconds. Anyone who writes about the carbon footprint of machine learning is being paid to mislead you.
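A back-of-the-envelope version of that comparison, for anyone who wants to plug in their own numbers. The 502 t figure is the one quoted above; the aviation totals are rough published ballparks, not from the article, and should be treated as assumptions:

```python
# How long does it take global aviation to emit the ~502 t attributed to
# training GPT-3? The 502 t estimate is from the comment above; the aviation
# totals are rough, commonly cited ballparks and should be read as assumptions.

SECONDS_PER_YEAR = 365.25 * 24 * 3600

gpt3_training_tons = 502            # tCO2e, estimate quoted above
aviation_scenarios = {
    "CO2 only (~0.9 Gt/yr, 2019)": 0.9e9,
    "CO2e incl. non-CO2 effects (~2.8 Gt/yr, rough)": 2.8e9,
}

for label, annual_tons in aviation_scenarios.items():
    tons_per_second = annual_tons / SECONDS_PER_YEAR
    seconds = gpt3_training_tons / tons_per_second
    print(f"{label}: ~{seconds:.0f} seconds of aviation emissions")
# -> roughly 18 s on a CO2-only basis, ~6 s counting non-CO2 climate effects,
#    so "every few seconds" is the right order of magnitude either way.
```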
Not really? Google's sustainability report gives datacenter energy usage and PUE, and you can approximately apportion that energy to each site based on its size. David Patterson and his team have published a number of papers and talks with precise figures for various ML tasks, such as "Carbon Emissions and Large Neural Network Training", "The carbon footprint of machine learning training will plateau, then shrink", and "GLaM: Efficient Scaling of Language Models with Mixture-of-Experts".
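As a sketch of the kind of estimate that approach enables. Every number below is a placeholder for illustration, not a value from those reports or papers:

```python
# Rough per-task carbon estimate from public datacenter figures.
# All inputs are placeholder assumptions; real values would come from the
# sustainability report and the papers cited above.

task_it_energy_kwh = 1_000_000      # hypothetical: IT energy for one training run
site_pue = 1.10                     # power usage effectiveness of the hosting site
grid_kgco2_per_kwh = 0.30           # hypothetical carbon intensity of the local grid

total_energy_kwh = task_it_energy_kwh * site_pue          # add facility overhead
emissions_tco2 = total_energy_kwh * grid_kgco2_per_kwh / 1000

print(f"~{total_energy_kwh / 1e6:.2f} GWh total, ~{emissions_tco2:.0f} tCO2e")
```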
>However, ChatGPT consumes a lot of energy in the process, up to 25 times more than a Google search.
per what metric?
>Per conversation of about 20 to 50 queries, half a litre of water evaporates – a small bottle, in other words.
closed circuit cooling systems don't evaporate any fucking water
>Consulting firm Gartner calculated that at this rate, AI will account for up to 3.5% of global electricity demand by 2030.
a number pulled out of someone's ass. trying to predict anything related to generative AI is pure hubris.
Nobody uses closed circuit cooling systems anymore. Pretty much every datacenter is cooled by evaporative (open loop) cooling.
The enthalpy of vaporization of water is pretty high, so even a relatively large energy use only requires evaporating quite a small amount of water. A single household water supply (10 GPM), on full blast, could cool about 1.5 megawatts of computers - a small datacenter.
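A quick sanity check of both the 10 GPM figure above and the article's half-litre-per-conversation claim; the only constant assumed here is the latent heat of vaporization:

```python
# How much heat can evaporating water carry away?
# Assumption: latent heat of vaporization ~2.26 MJ/kg (water near boiling;
# closer to 2.45 MJ/kg at cooling-tower temperatures).

LATENT_HEAT_J_PER_KG = 2.26e6
LITRES_PER_GALLON = 3.785

# The "household supply" example from the comment above.
gpm = 10
kg_per_s = gpm * LITRES_PER_GALLON / 60          # ~0.63 kg of water per second
print(f"10 GPM fully evaporated removes ~{kg_per_s * LATENT_HEAT_J_PER_KG / 1e6:.1f} MW")
# -> ~1.4 MW, so the ~1.5 MW "small datacenter" figure checks out.

# The article's claim: ~0.5 L evaporated per 20-50 query conversation.
heat_joules = 0.5 * LATENT_HEAT_J_PER_KG          # 0.5 kg of water
queries = 35                                      # middle of the 20-50 range
print(f"0.5 L of evaporation ~ {heat_joules / 3.6e6:.2f} kWh of heat rejected,")
print(f"i.e. roughly {heat_joules / 3600 / queries:.0f} Wh of heat per query")
```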
There was a post about a detailed imaging of a tiny part of the brain, and it came to 1.7 petabytes of data or something.
CPUs and GPUs still have a long way to go, well beyond shrinking nm process nodes.
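Presumably that's the cubic-millimetre cortex imaging dataset. A crude extrapolation from it; the sample and brain volumes below are assumptions, only the 1.7 PB figure is from the comment above:

```python
# Crude extrapolation: if imaging ~1 mm^3 of cortex produces petabytes of data,
# what would a whole brain take at the same density? Sample size and brain
# volume are rough assumptions for illustration.

sample_volume_mm3 = 1.0          # assumed size of the imaged sample
sample_data_pb = 1.7             # petabytes, the figure from the comment above
brain_volume_mm3 = 1.2e6         # ~1.2 litres, a typical human brain volume

whole_brain_pb = sample_data_pb * brain_volume_mm3 / sample_volume_mm3
print(f"Whole brain at that density: ~{whole_brain_pb:,.0f} PB "
      f"(~{whole_brain_pb / 1e6:.1f} zettabytes)")
```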
That headline is misleading (and should probably be updated): 25x more energy than a search, not 25x more than the company. Anyone who knows anything about the two could tell you why - one is running inference, the other is a database lookup. In fact, I'd be surprised if it's only 25x in that case.
Both companies keep their energy use highly secret, so I doubt this article is based on anything more than a researcher's guess.
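For a feel for where a multiplier like that could come from, here is the ratio under commonly cited (and much-disputed) public per-query estimates. Neither figure comes from the article; both are assumptions:

```python
# Ratio of per-query energy: LLM inference vs. a web search.
# Both per-query figures are rough public estimates, treated here purely as
# assumptions - swap in your own and the multiplier moves a lot.

search_wh_per_query = 0.3    # Google's oft-quoted ~0.3 Wh per search (2009 figure)
chatbot_wh_per_query = 3.0   # a commonly cited ballpark for LLM inference

ratio = chatbot_wh_per_query / search_wh_per_query
print(f"~{ratio:.0f}x more energy per query under these assumptions")
# With different (equally defensible) estimates the ratio shifts by an order
# of magnitude, which is rather the point: only the operators have real numbers.
```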