Readit News
mitthrowaway2 · 23 days ago
Yes it does. It's kind of a fixed cost though, since we're going to feed and educate our youth anyway, unless Sam Altman would have those people starve to death.
asacrowflies · 22 days ago
I think you're almost on to something with how these people think...
lisp2240 · 22 days ago
Has there ever been a time when a wealthy person would run their mouth like this without any fear of an angry mob tearing their limbs off their body? Maybe right before the French Revolution?
Gibbon1 · 22 days ago
I don't think my parents and grandparents spent their lives working towards a future where grifters like Altman could take everything for themselves.
b3ing · 22 days ago
In the Epstein files they talk about how to rid the world of poor people
harddrivereque · 22 days ago
That's a lie; rich people know they are only rich because of poor people.

robbbed · 22 days ago
I get the feeling that this guy has never been punched in the mouth. Otherwise he might be more careful with what he says.
random_duck · 22 days ago
You can remove "in the mouth".
7777777phil · 23 days ago
This comparison only works if you assume scaling keeps paying off. Sara Hooker's research (https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5877662) shows that compact models now outperform massive predecessors, and that scaling laws only predict pre-training loss, not downstream performance. If marginal returns on compute are falling (https://philippdubach.com/posts/the-most-expensive-assumptio...), then "energy per query" hides the real problem: a trillion dollars of infrastructure built on the bet that they won't.
p0w3n3d · 23 days ago
You are the carbon they want to reduce
TrackerFF · 22 days ago
I asked ChatGPT to do some napkin math, and it seems like on average a human would consume 13.75 million kcal worth of food in those 20 years.
eulgro · 22 days ago
That's 58 GWh, but considering each food calorie actually requires 5-10 calories of input energy (mostly oil), let's say 290 GWh.

I couldn't find much on training AI models. Apparently GPT-3 used 1.3 GWh for training. So maybe ~10 GWh for newer models?

So... let's stop training humans I guess.

cinnamonteal · 22 days ago
13.75 million kcal is 0.01598 GWh, not 58 GWh. So with the 5x multiplier, that's 0.08 GWh for a human.
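The correction checks out. A quick sketch of the unit conversion, assuming 1 kcal = 4184 J, 1 GWh = 3.6e12 J, and the 5x food-production multiplier from the comment above:

```python
KCAL_TO_J = 4184       # thermochemical calorie: 1 kcal = 4184 J
GWH_TO_J = 3.6e12      # 1 GWh = 3.6e12 J

kcal_lifetime = 13.75e6  # ~20 years of food, per the napkin math above

# Direct food energy, converted kcal -> J -> GWh
gwh_direct = kcal_lifetime * KCAL_TO_J / GWH_TO_J

# Apply the 5x multiplier for input energy to food production
gwh_with_inputs = 5 * gwh_direct

print(f"{gwh_direct:.5f} GWh direct, {gwh_with_inputs:.2f} GWh with 5x inputs")
# → 0.01598 GWh direct, 0.08 GWh with 5x inputs
```

The original 58 GWh figure appears to come from treating the kcal total as kWh; done in joules, the lifetime figure is roughly three orders of magnitude smaller.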
zipping1549 · 22 days ago
If I were one of his family members, I'd advise him to hire someone to stop him from saying things like this.
brnt · 22 days ago
Training an LLM takes petabytes of theft.