Yes it does. It's kind of a fixed cost, though, since we're going to feed and educate our youth anyway, unless Sam Altman would have those people starve to death.
Has there ever been a time when a wealthy person could run their mouth like this without any fear of an angry mob tearing the limbs off their body? Maybe right before the French Revolution?
This comparison only works if you assume scaling keeps paying off. Sara Hooker's research (https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5877662) shows that compact models now outperform their massive predecessors, and that scaling laws only predict pre-training loss, not downstream performance. If marginal returns on compute are falling (https://philippdubach.com/posts/the-most-expensive-assumptio...), "energy per query" hides the real problem: a trillion dollars of infrastructure built on the bet that they won't.
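To make the "pre-training loss" point concrete, here's a minimal sketch of a Chinchilla-style scaling law. The constants are the published Hoffmann et al. (2022) fits, not anything from Hooker's paper; the output is predicted loss, and it says nothing about downstream task performance:

    # Chinchilla-style scaling law: predicted pre-training loss as a
    # function of parameter count N and training tokens D.
    def chinchilla_loss(n_params: float, n_tokens: float) -> float:
        E, A, B = 1.69, 406.4, 410.7   # fitted constants from the paper
        alpha, beta = 0.34, 0.28       # fitted exponents
        return E + A / n_params**alpha + B / n_tokens**beta

    # Each 10x in parameters (with ~20 tokens/param, the Chinchilla rule
    # of thumb) shaves off ever-smaller amounts of loss:
    for n in [1e9, 1e10, 1e11, 1e12]:
        print(f"{n:.0e} params -> predicted loss {chinchilla_loss(n, 20 * n):.3f}")

That diminishing curve is the whole argument in miniature: the law guarantees falling loss, not rising usefulness per dollar.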
I couldn't find much on the energy cost of training AI models. Apparently GPT-3 used about 1.3 GWh for training, so maybe ~10 GWh for newer models?
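For what it's worth, here's the back-of-envelope behind that guess. The compute multiplier and the efficiency factor are assumptions, not measured figures; real numbers depend on hardware, utilization, and model size:

    # Back-of-envelope: scale GPT-3's widely cited ~1.3 GWh training
    # energy by a compute multiplier, divided by any hardware
    # efficiency improvement since then.
    GPT3_TRAINING_GWH = 1.3

    def training_energy_estimate(compute_multiplier: float,
                                 efficiency_gain: float = 1.0) -> float:
        """Estimated training energy in GWh, assuming energy scales
        linearly with compute."""
        return GPT3_TRAINING_GWH * compute_multiplier / efficiency_gain

    print(training_energy_estimate(10))       # ~10x compute -> 13.0 GWh
    print(training_energy_estimate(10, 2.0))  # with 2x better chips -> 6.5 GWh

So "~10 GWh" comes out of assuming roughly an order of magnitude more compute on similar-era hardware.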
So... let's stop training humans, I guess.