Sad to see GPT-4.5 gone. It knew things. More than any other model I'm aware of.
The issue isn’t cognitive overhead; it’s not having rituals to review and refine your thoughts. Everyone has to jot down ideas from time to time, but if you never take time to stop, review, and organize your thoughts, then sure, it’ll feel like a lot of cognitive overhead.
He also managed to do quite a lot of other things: https://en.wikipedia.org/wiki/Andrej_Karpathy
Would be a fun look, with the added bonus of some colleagues potentially being tricked into thinking they have an opportunity to mess with my machine. :D
So the complexity becomes something like 1 + 2 + 3 + ... + log n = O(log^2 n).
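For anyone wondering where O(log^2 n) comes from: that sum is an arithmetic series with log n terms, so the standard closed form applies:

```latex
\sum_{k=1}^{\log n} k \;=\; \frac{\log n \,(\log n + 1)}{2} \;=\; O(\log^2 n)
```

The leading term of the product is (log n)^2 / 2, and constant factors are dropped in big-O notation.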