I hate to break it to you, but the definition has already changed. It now means building software exclusively with an AI/LLM.
It's similar to the whole hacker/cracker debate. Words come to be defined by whoever has the most influence over the community, and sometimes they evolve on their own through places like social media.
At this point, those who use "algorithm" deserve the negative connotation, just like those who used "synergy" deserved to be mocked. When words are used just for the sake of using the word, in word-salad text, they quickly lose meaningful intent, and listeners hear them as shibboleths signalling that the speaker is talking out of their arse.
Words are defined by consensus of the community using them. That's the primary source of semantic meaning.
Next one down is dictionary definition (or claim to authority, for example a tweet where the term was first used). But community meaning takes precedence.
Authors are free to use a nonstandard meaning but should provide readers with their definition if they want to be understood.
This is the same as saying that words have no meaning! Under this mental framework, why would it be wrong to say that every living human can speak fluent French?
How would you even know which language anyone is speaking?
Counterproposal: words are a tool for communication, and meaning is something we gather from the communication. In this view, words are no different from hand gestures, facial expressions, and body language.
The parties to a communication can only communicate effectively if they agree enough on the meaning of words/gestures/expressions/actions (which is why we cannot speak a language we do not know).
Can anyone recommend a video that's a good representation of "vibe coding"? I'd like to get a better sense of what the actual moment-to-moment of it looks like.
LLMs have been so spectacularly useless the couple of times I've tried to use them for programming that I can't really wrap my head around what this must be.
I'm really struggling to understand it as well. I mean, sure, if what you're doing is a website, then maybe you can get something that functions out of an LLM. I don't really do web development, so maybe they're better for that specific niche.
However, for most cases I've tried, I get wildly incorrect and completely non-functional results. When they do "function", the code uses dangerously incorrect techniques and gives the wrong answer in ways you wouldn't notice unless you were familiar with the problem.
Maybe it's because I work in scientific computing, and there just aren't as many examples of our typical day-to-day problems out there, but I'm struggling to see how this is possible today...
This is absolutely FASCINATING to me. This man is learning so much about "coding" implicitly without learning any Python syntax. How to iterate in smaller steps when a big step fails. What's an API? How to massage data from one source into a format usable by the next stage in the pipeline. Adding things you forgot on the first iteration. How to use the command line (typing "python3" instead of "python", using the up arrow to rerun the same command).
My favorite comment so far (I haven't gotten to the end) paraphrased:
"I don't know what Swagger is, but let's just paste it in here."
Somehow he figured out that Swagger docs tell Cursor enough to figure out how to talk to this API. Which is exactly what Swagger is for!
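For anyone else wondering what he pasted in: a Swagger (OpenAPI) document is a machine-readable listing of an API's endpoints, methods, parameters, and schemas, which is exactly why dropping it into the context gives the tool enough to write client code. A minimal sketch of reading one yourself, assuming a hypothetical spec URL and the requests package:

    # Hypothetical sketch: download an OpenAPI/Swagger spec and list what the API offers.
    # The URL is made up for illustration; requires the `requests` package.
    import requests

    SPEC_URL = "https://api.example.com/openapi.json"  # hypothetical spec location

    spec = requests.get(SPEC_URL, timeout=10).json()

    # The "paths" section describes every endpoint and the operations it supports.
    for path, item in spec.get("paths", {}).items():
        for verb, op in item.items():
            if verb.lower() not in {"get", "post", "put", "patch", "delete"}:
                continue  # skip non-operation keys like "parameters"
            print(f"{verb.upper():6} {path}  {op.get('summary', '')}")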
It seems like the odd, formal syntax of programming languages is the major barrier keeping many people from doing software development, because he is doing every other step a professional developer does when building an application.
“I'm trying to free your mind, Neo. But I can only show you the door. You're the one that has to walk through it. You have to let it all go.”
Not a dev, but I’ve been “vibe coding” since ChatGPT came out. The LLMs can write a book… if you try to accomplish it with a single prompt, it’s trash. If you construct the book chapter by chapter, it’s a lot better and more cohesive.
You don’t build the app with a single prompt - you build one function or file at a time, in a modular, expandable format.
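To make "a function or file at a time" concrete, here is a rough sketch of what that workflow could look like when driven from a script instead of a chat window. It assumes the OpenAI Python SDK's chat-completions interface with OPENAI_API_KEY set in the environment; the module breakdown and prompts are invented for illustration:

    # Rough sketch: generate an app one module at a time, each with its own focused
    # prompt, instead of asking for the whole application in a single prompt.
    # Assumes the OpenAI Python SDK (pip install openai) and OPENAI_API_KEY set.
    from pathlib import Path
    from openai import OpenAI

    client = OpenAI()

    # Invented module breakdown for an imaginary to-do app.
    modules = {
        "models.py": "Write a Python dataclass Task with id, title, and done fields.",
        "storage.py": "Write functions to save and load a list of Task objects as JSON.",
        "cli.py": "Write a small argparse CLI that adds, lists, and completes tasks.",
    }

    for filename, prompt in modules.items():
        resp = client.chat.completions.create(
            model="gpt-4o",
            messages=[
                {"role": "system", "content": "Reply with Python code only, no prose."},
                {"role": "user", "content": prompt},
            ],
        )
        Path(filename).write_text(resp.choices[0].message.content or "")
        print("wrote", filename)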
Hackers are comfortable working in the dark. Navigate with a flashlight (some background knowledge, an understanding of syntax, data structures, secure coding practices, etc.) and you can get where you’re going a lot quicker, and you can try out a lot of different routes you may not have seen or had an opportunity to explore otherwise - maybe stumble upon an Easter egg along the way.
You don’t necessarily need to spend hours reading the documentation for an unfamiliar library if you know how to get the AI to understand it, reinforce it with some examples, and use it - maybe in that process it expands your perspective or gives you an idea to incorporate into your production-grade environment.
With how quickly things advance, it seems rapid prototyping would allow you to qualify what’s worth investing time in vs what’s not.
If you know about DAST, SAST, and containers, you can probably create a prototype-qualification workflow that isn’t total trash, and then pass it to a more technically savvy, specialized team member if warranted?
Exploratory data analysis doesn’t seem wholly dissimilar in value - you never know when you’ll stumble across a good nugget to feature-engineer if you aren’t actively mining and exploring.
“Vibe coding” == you’re getting the model to do what YOU want. Craft some nefarious things to understand how to hold the reins on the beast, and that’s a decent starting point.
If the LLM is useless, read up on NLP, word embeddings, and BERT, and fine-tune one for your specific use case. Don’t use the same chat session to make every file - manage the memory and tokens strategically, and use few-shot/multi-shot prompting to specialize each session’s knowledge.
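As a rough illustration of the "fresh session per file, seeded with examples" idea (not this commenter's exact workflow), the sketch below only builds the message lists; the example pairs and tasks are placeholders, and each list would be handed to whatever LLM API you use:

    # Sketch: instead of one ever-growing chat, start each file's conversation fresh,
    # seeded with the same few-shot examples so the model stays specialized without
    # dragging the whole history (and token cost) along. Placeholders throughout.

    FEW_SHOT = [
        {"role": "user", "content": "Write a function that parses an ISO date string."},
        {"role": "assistant",
         "content": "from datetime import datetime\n\ndef parse_iso(s):\n    return datetime.fromisoformat(s)"},
    ]

    def new_session(task):
        """Build a fresh message list: system rules + shared examples + one task."""
        return (
            [{"role": "system", "content": "You write small, well-tested Python modules."}]
            + FEW_SHOT
            + [{"role": "user", "content": task}]
        )

    # One independent session per file keeps each context small and focused.
    for task in ["Write storage.py for the to-do app.", "Write cli.py for the to-do app."]:
        messages = new_session(task)
        print(len(messages), "messages in this session")  # send `messages` to your LLM API here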
Maybe things become a lot more bespoke and require fewer dependencies - less susceptible to supply-chain attacks. More variety could make your system less susceptible to automated attacks and make the pyramid of pain stronger.
If everyone reverse engineers the dependencies and builds most things in house with their own twist, maybe that enables more flexibility with custom encoding and makes it less intuitive for an attacker to analyze your tech stack and infer how it operates.
Surely I’m oversimplifying a few things and missing out on some production-grade concepts, but the point is that the same thing that’s viewed as creating security gaps could also be used as a mechanism to close some, if used efficiently and strategically.
It’s not competition for a dev - use it so you can learn more and do better.
I'd be concerned about purchasing a book from a "programmer" who claims to teach people how to code without code. Kinda sounds like an "author" who publishes books without writing books.
> I fear it may be too late for these authors and publishers to fix their embarrassing mistakes: they’ve already designed the cover art!
To the publishers it's not a mistake, it's just clever marketing. Consider which of these two jumps off that glossy cover and into the distracted eye of a Technical Program Manager most readily: "AI-Assisted Programming" or "Vibe Coding".
Now consider whether either of those parties feels an obligation to help maintain coherence of the software community's technical discourse.
Lol, sigh. The author says the term was coined 84 days ago, on February 6th, 2025. Literally go to Google and search ‘"vibe coding" before:2025-02-01’; I see posts from more than a year ago.
https://news.ycombinator.com/item?id=1865063
The article acknowledges this.
Another unfortunate example is the increasingly negative connotation assigned to the word "algorithm".
bet
Ideas, ideas are much sturdier.
If you change the meaning of something too radically, it has a tendency to snap back.
Below worked for me
intext:"vibe coding" before:2025/02/01
It's funny, I tried searching those exact keywords and this exact comment is on the top page.