Definitely an interesting way to encourage whales to spend a lot of money quickly.
The one bone I'll throw it is that I was asking it to edit its own MCP configs, so maybe it got thoroughly confused?
I dunno what's going on, I'm going to give it the night. It makes no sense whatsoever.
It felt like it was at least back to Opus 4.5 levels.
Whereas OpenAI seems to be huge in the consumer market (where downward pressure on pricing makes ads more likely). They're trying with Codex, but all the other stuff (the legal, financial, and Cowork tooling, etc.) seems to be a lot more fleshed out on the Claude side.
So they can probably get away with this for a while. That's my guess though.
Memory comparison of AI coding CLIs (single session, idle):
| Tool | Footprint | Peak | Language |
|-------------|-----------|--------|---------------|
| Codex | 15 MB | 15 MB | Rust |
| OpenCode | 130 MB | 130 MB | Go |
| Claude Code | 360 MB | 746 MB | Node.js/React |
That's a 24x to 50x difference for tools that do the same thing: send text to an API. vmmap shows Claude Code reserves 32.8 GB of virtual memory just for the V8 heap, has 45% malloc fragmentation, and a peak footprint of 746 MB that never gets released: a classic leak pattern.
On my 16 GB Mac, a "normal" workload (2 Claude sessions + browser + terminal) pushes me into 9.5 GB swap within hours. My laptop genuinely runs slower with Claude Code than when I'm running local LLMs.
I get that shipping fast matters, but building a CLI with React and a full Node.js runtime is an architectural choice with consequences. Codex proves this can be done in 15 MB. Every Claude Code session costs me 360+ MB, and with MCP servers spawning per session, it multiplies fast.
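For anyone who wants to reproduce these numbers on their own machine, here's a minimal sketch of how you might measure a CLI's resident footprint. It uses the current shell (`$$`) as a stand-in PID; substitute the PID of whichever tool you're inspecting. The `ps` line is portable; `vmmap` is macOS-only, so it's left as a comment.

```shell
# Measure a process's resident set size (RSS) in MB.
# $$ (the current shell) stands in for the CLI process you care about.
pid=$$
ps -o rss= -p "$pid" | awk '{printf "%.1f MB resident\n", $1/1024}'

# On macOS, vmmap breaks down virtual reservations (e.g. the V8 heap)
# and shows malloc zone fragmentation:
# vmmap --summary "$pid"
```

Note that RSS is what actually pressures swap; the 32.8 GB V8 reservation is virtual address space, which is mostly harmless on its own until the resident heap grows into it.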
This is just the regular tech debt that comes from building something to $1bn in revenue as fast as you possibly can and optimizing later.
They're optimizing now. I'm sure they'll have it under control in no time.
CC is an incredible product (so is Codex, but I use CC more). Yes, lately it's gotten bloated, but the value it provides makes the bloat bearable until they fix it, hopefully soon.
Nobody is claiming neutrality on specific issues like corn subsidies, which cross party lines.
Pulling the endorsement after it goes the wrong way isn’t neutral.
We have somehow normalized the idea that newspapers openly state their preference for a candidate. I expect that from Fox News or MSNBC. But not the Washington Post.
I’ve always found the idea of papers endorsing candidates so odd, Bezos or not.
> they refused to endorse a candidate.
> for them choosing not to endorse Harris
There was no "they" or "them" involved.
I don’t like the idea of a paper taking sides (even if, in this case, their endorsement aligned with my side).
It seems antithetical to the ideas of independent and non-partisan journalism.
For example, today I had Claude prune all the merged branches from a repo with 8 years of commits in it. It found and deleted 420 branches that were merged but never deleted.
Deleting 420 branches at once is probably the kind of long-tail housekeeping that was never worth optimizing for in the past, right? But I'm sure devs are doing this sort of cleanup often now, whereas before we just never would've made the time.
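For the curious, this kind of cleanup can be sketched in plain git. A minimal, hedged version, assuming `main` is the default branch; the `echo` keeps it a dry run so you can review the list before deleting anything:

```shell
# List local branches already merged into main, excluding main itself
# and the current branch, then print the delete commands (dry run).
git branch --merged main \
  | grep -vE '^\*|^[[:space:]]*main$' \
  | xargs -n 1 echo git branch -d   # drop "echo" to actually delete

# Remote branches (as in the 420-branch cleanup above) would need
# something like this instead, which is far riskier; review carefully:
# git branch -r --merged main | sed 's|^ *origin/||' \
#   | xargs -n 1 echo git push origin --delete
```

Using `-d` (not `-D`) is the safety net here: git refuses to delete a branch it doesn't consider fully merged.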