Imagine being paid generational wealth, and then the house of cards comes crashing down a couple of months later.
I'm honestly surprised that they're so similar. I've thought of LLM queries as being far more energy-intense than "just" a Google search, but maybe the takeaway is that ordinary Google searching is also quite energy-intense.
If I as a user just wanted an answer to a dumb question like, say, the meaning of some Gen Z slang, it seems about an order of magnitude cheaper to ask a small LLM running on my phone than to make a Google search.
(Check my math: assuming the A16 CPU draws 5 watts peak for 20sec running Gemma or whatever on my iPhone, that’s 0.03Wh to answer a simple query, which is 10x cheaper)
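A quick sketch of that arithmetic (the 5W / 20sec figures are the commenter's assumptions, and ~0.3 Wh per Google search is the figure from the 2009 Google blog post linked below):

```python
# Back-of-envelope check, not a measurement.
peak_watts = 5      # assumed A16 CPU draw while running a small LLM
seconds = 20        # assumed time to answer a simple query

on_device_wh = peak_watts * seconds / 3600  # watt-seconds -> watt-hours
google_search_wh = 0.3                      # Google's 2009 per-search figure

print(f"on-device query: {on_device_wh:.3f} Wh")          # ~0.028 Wh
print(f"ratio: {google_search_wh / on_device_wh:.1f}x")   # ~10.8x cheaper
```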
Are training costs (esp. from failed runs) amortized in these estimates?
1: https://googleblog.blogspot.com/2009/01/powering-google-sear...
Finally, the c-suite is getting it.
"It just means that each of us has to get more in tune with what our customers need and what the actual end thing is that we're going to try to go build, because that's going to be more and more of what the work is as opposed to sitting down and actually writing code...."
https://www.businessinsider.com/aws-ceo-developers-stop-codi...
If you read the full remarks, they're consistent with what he says here. He says "writing code" may become a less useful skill, which is why it's important to hire junior devs and teach them how to learn, so they pick up the skills that will actually be useful.
Seems like eventually the law will get some poor girl killed: the authorities contact her parents about "CSAM," it turns out the girl herself took the picture and sent it to her boyfriend, her dad discovers she was having sex, and he commits an honor killing.
But we're just supposed to trust that these image hashes have a small false positive rate, when there's no way to have transparent review without making it easy for adversaries to avoid the scan.
If literacy were widespread, why did only colonial writers write about them?