Clawd.bot really annoyed me at first. The setup is super tedious, broken, and not fun, though that's mostly because I'm too impatient to tinker like I used to.
However, once you do tinker, it's so-so. I don't think it's much better than Claude Code; it's really a focused wrapper around the same model, aimed at being your personal assistant. It's like Claude Code vs. Claude Cowork: the same thing underneath. But given the low cost of creating custom tools, why not give people something like Clawd.bot that gives them focused guardrails?
Anyway, I could end up abandoning all of this too, and it's all a kludge around things that should really be an API. But I do like that I can run it on my Mac Mini and have it control my desktop. It'll be a cold day in hell before I let it message people for me; I'd rather have it write deterministic code that does the messaging than have it send anything directly.
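To illustrate what I mean by "deterministic code": a minimal sketch, assuming macOS with the classic Messages.app AppleScript dictionary (which has changed across macOS versions, so treat it as an assumption). The script name, handle, and message text are all placeholders; the point is that the agent writes and you review a script whose only inputs are the arguments you pass it.

```python
#!/usr/bin/env python3
"""Send an iMessage via Messages.app on macOS, deterministically.

The assistant drafts/reviews this script; a human runs it. The recipient
and body are explicit command-line arguments, so nothing gets sent that
you didn't type yourself.
"""
import subprocess
import sys

# AppleScript with a run handler; osascript passes trailing CLI args to it.
APPLESCRIPT = '''
on run {targetHandle, messageText}
    tell application "Messages"
        set theService to 1st service whose service type = iMessage
        set theBuddy to buddy targetHandle of theService
        send messageText to theBuddy
    end tell
end run
'''


def send_imessage(handle: str, text: str) -> None:
    # Shell out to osascript; check=True raises if Messages refuses the send.
    subprocess.run(["osascript", "-e", APPLESCRIPT, handle, text], check=True)


if __name__ == "__main__":
    if len(sys.argv) != 3:
        sys.exit("usage: send_imessage.py <handle> <message>")
    send_imessage(sys.argv[1], sys.argv[2])
```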
Claude Code is closed source. That alone is a good enough reason to look at alternatives.
With 32 GB of RAM: qwen3-coder and glm 4.7 flash are both impressive ~30B-parameter models. Not on the level of gpt 5.2 codex, but small enough to run locally (4-bit quantized in 32 GB of RAM) and quite capable. And I think it's just a matter of time until similarly capable coding models can run with even less RAM.
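Rough back-of-the-envelope sketch of why a ~30B model at 4-bit fits in 32 GB (the bits-per-weight, overhead, and KV-cache numbers below are assumptions; real figures depend on the quantization format, context length, and architecture):

```python
# Approximate memory footprint of a 4-bit quantized ~30B-parameter model.

def model_memory_gb(params_b: float, bits_per_weight: float, overhead: float = 1.1) -> float:
    """Weight memory in GB: parameters * bits/8, plus ~10% for scales/metadata."""
    bytes_total = params_b * 1e9 * bits_per_weight / 8
    return bytes_total * overhead / 1e9

weights = model_memory_gb(30, 4.5)   # ~4.5 effective bits/weight is typical for Q4-style quants
kv_cache = 2.0                       # rough allowance for a few thousand tokens of context
print(f"weights ≈ {weights:.1f} GB, plus ~{kv_cache:.0f} GB KV cache")
# → roughly 20 GB total, which leaves headroom on a 32 GB machine
```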
Current test version runs in 8 GB at ~60 tokens/s. Let me know if you want to join our early tester group!