As I understand it, what Karpathy was referring to as "vibe coding" was some sort of flow state "just talk to the AI, never look back" thing. Don't look at the generated code, just feel the AGI vibes ...
It sounds absolutely horrific if you care even the tiniest bit about the quality of what you are building! Good for laughs and "AGI is here!" Twitter posts, maybe for home projects and throwaway scripts, but as a way of developing serious software?!
I think part of the reason this has taken off (other than the cool sounding name) is because it came from Karpathy. The same idea from anyone less well known would have probably been shot down.
I've seen junior developers (and even not-so-junior ones), pre-AI, code in this kind of way: copy some code from someplace and just hack on it until it works. Got a nasty core dump bug? Just reorder your source code until it goes away. At minimum, in a corporate environment, this way of working would get you talked to, if not put on a performance plan or worse!
I would never have it put into production without any kind of review, though; it's more for "I vibe coded this cool app, take a look, maybe this can be something bigger..."
Servers that shouldn't be made public are made public, a cyber tale as old as time.
Maybe quantum computing would be a significant enough leap to meaningfully move the needle again.
But I think we're not even on the path to creating AGI. We're creating software that replicates and remixes human knowledge at a fixed point in time. That's a fixed target you can't really exceed, which already entails diminishing returns. Pair this with the fact that it's based on neural networks, which also invariably reach a point of sharply diminishing returns in essentially every field they're used in, and you have something that looks much closer to what we're doing right now: all competitors eventually converging on something largely indistinguishable from each other in terms of ability.
This doesn't really make sense outside computers. Since the AI would be training itself, it needs to have the right answers, but as of now it doesn't really interact with the physical world. The most it could do is write code and check things that have no room for interpretation: speed, latency, percentage of errors, exceptions, etc.
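To make that concrete, here's a minimal sketch of the kind of check that has no room for interpretation (the function name and latency threshold are invented for illustration): it passes or fails purely on measured numbers, with no human judgment involved.

    import time

    def objective_check(candidate_fn, inputs, max_latency_s=0.1):
        # Score a candidate function on machine-checkable metrics only:
        # worst-case latency and exception rate.
        errors = 0
        worst_latency = 0.0
        for x in inputs:
            start = time.perf_counter()
            try:
                candidate_fn(x)
            except Exception:
                errors += 1
            worst_latency = max(worst_latency, time.perf_counter() - start)
        error_pct = 100.0 * errors / len(inputs)
        return error_pct == 0.0 and worst_latency <= max_latency_s

Anything beyond signals like these (is the answer *correct*? is the design *good*?) requires ground truth the system can't generate for itself.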
But what other fields would it do this in? How can it make strides in biology? It can't dissect animals; it can't figure out more about plants than what humans feed into the training data. As for math, math is human-defined: humans said "addition does this," "this symbol means that," and so on.
I just don't understand how AI could ever surpass anything humans have known when it lives by the rules we defined.
Who would you think is weirder, the person still obsessed with horse & buggies, or the person obsessed with cars?
Software engineering pays because companies want people to develop software. It pays so well because it's hard, but the coding portion is becoming easier. Vibe coding and AI are here to stay; the author can choose to ignore them and keep preaching to a dying field (specifically, writing code, not CS), or embrace them. We should be happy we no longer need to type out if statements and for loops 20 times and can instead focus on high-level architecture.
Long before LLMs, I would talk about classes / functions / modules like "it then does this, decides the epsilon is too low, chops it up and adds it to the list".
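For instance, the "decision" in that description is nothing more than a branch. A toy sketch (the names and the epsilon threshold here are invented, not from any real codebase):

    EPSILON_MIN = 1e-6  # hypothetical threshold

    def handle(epsilon, items):
        # "decides the epsilon is too low" -- just a comparison
        if epsilon < EPSILON_MIN:
            # "chops it up and adds it to the list" -- just a split and an append
            pieces = [epsilon / 2, epsilon / 2]
            items.extend(pieces)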
The difference, I guess, is that it was only to a technical crowd, and nobody would mistake it for anything it wasn't. Everybody knew that "it" didn't "decide" anything.
With AI being so mainstream, and the math being much more elusive than a simple if..then, I guess it's just too easy to take this speaking convention at face value.
EDIT: some clarifications / wording
I don't think LLMs are sentient or any bullshit like that, but I do think people are too quick to write them off before really thinking about how a neural net "knows" things in a way similar to how a human "knows" things: it is trained and reacts to inputs and outputs. The body is just far more complex.
Think? What exactly did “it” think about?