I really struggled to effectively cut onions until this: https://www.youtube.com/watch?v=CwRttSfnfcc
Haven't looked back since.
Anthropic exposes its models' reasoning, which has become a big reason to prefer them for reasoning tasks over the other two despite their pricing. Rather ironic, given that the other two have been pushing reasoning much harder.
Well, partly because they (all but X, IIRC) have commitments to shift to carbon-neutral energy.
But also, from the article:
> ChatGPT is now estimated to be the fifth-most visited website in the world
That's ChatGPT today. They're looking ahead to 100x-ing (or 1,000,000x-ing) the usage as AI replaces more and more existing work.
I can run Llama 3 on my laptop, and we can measure the energy usage of my laptop--it maxes out at around 0.1 toasters. o3 is presumably a bit more energy intensive, but the reason it's using a lot of power is the >100MM daily users, not that a single user uses a lot of energy for a simple chat.
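The per-user vs. aggregate distinction can be made concrete with a quick back-of-envelope calculation. The wattage, reply time, and toaster figure below are illustrative assumptions (a typical toaster draws roughly 1 kW, so "0.1 toasters" is about 100 W), not measurements:

```python
# Back-of-envelope: energy for one local LLM reply vs. 100M daily users.
# Assumptions (illustrative, not measured): toaster ~1000 W, so
# "0.1 toasters" ~= 100 W laptop draw; ~30 s to generate a short reply.
LAPTOP_WATTS = 100        # ~0.1 toasters
SECONDS_PER_REPLY = 30    # rough generation time for a simple chat

joules_per_reply = LAPTOP_WATTS * SECONDS_PER_REPLY   # 3000 J
kwh_per_reply = joules_per_reply / 3.6e6              # ~0.00083 kWh

# Scale to 100 million daily users, one reply each:
daily_kwh = kwh_per_reply * 100_000_000               # ~83,000 kWh/day

print(f"{kwh_per_reply:.5f} kWh per reply")
print(f"~{daily_kwh:,.0f} kWh/day at 100M users")
```

A single reply is a rounding error on a household electricity bill; it's the hundred-million-user multiplier that turns it into a grid-scale number.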
This seems like a classic tragedy of the commons, no? An individual has a minor impact, but rational switching to LLM tools by the collective will likely have a massive impact.
I'm a decent engineer working as a DS in a consulting firm. In my last two projects, I checked in (or corrected) so much more code than the other two junior DSs on my team that by the end some 80%-90% of the ML-related work had been directly built, corrected, or optimized by me. Most of the rest that wasn't was boilerplate. LLMs were pivotal in this.
And I am only a moderately skilled engineer. I can easily see somebody with more experience and skills doing this to me, and making me nearly redundant.