On Hacker News and Twitter, the consensus seems to be that no one is afraid. People concede that junior engineers and grad students might be the most affected, but they still treat their own situations as sustainable. My question is: is this just wishful thinking and human nature, trying to fight off the inevitable? I ask because I seriously don't see a future with large numbers of programmers anymore. I see mass unemployment for programmers. People are in denial, and all of these claims that AI can't write code without making mistakes stop being valid the moment a model is released, potentially overnight, that writes flawless code. Claude 4.5 is a good example of the trajectory. I just don't see any valid argument that the technology won't reach the point where it doesn't make the job irrelevant, exactly, but completely changes its economics.
I think the flaws are going to be solved, and if that happens, what then? I do believe there needs to be a human in the loop, but eventually I don't think there need to be humans, plural.
I believe this is denial. The claim that the best AI can't reliably do a modest refactoring is simply not correct. Yes, it can. What it currently cannot do is write a full app from start to finish, but they're working on longer task execution. And this is before any of the big data centers have even been built. What happens then? The naysayers respond, "Well, the scaling laws don't apply," but plenty of people think they do.
That said, in the meantime, I'm not confident I'd be able to find another job if I lost my current one, because I would have to compete not only against every other candidate but also against the ethereal promise of what AI might bring in the near future.
I just don’t see OpenAI being viable long term.
- talking to people to understand how to leverage their platforms and to get them to build what I need
- working in closed-source codebases. I know where the traps and the footguns are; Claude doesn’t
- telling people no, that’s a bad idea, don’t do that. This is often more useful than a “you’re absolutely right” followed by the perfect solution to the wrong problem
In short, I can think and I can learn. LLMs can’t.
This one is huge. I’ve personally witnessed many situations where a multi-million-dollar mistake was avoided by a domain expert shutting down a bad idea. Good leadership recognizes this value; bad leadership just looks at how much code you ship.
You’re right that it wouldn’t replace everyone, but businesses will need fewer people for maintenance.
I predicted commoditization back in 2016, when I saw that no matter what I learned, it was going to be impossible to stand out from the crowd on the enterprise dev side of the market or to command decent top-of-market raises.[1]
I knew back then that the answer was going to be filling in the gaps with soft skills, managing larger, more complex problems, getting closer to determining business outcomes, etc.
I pivoted into customer-facing cloud consulting specializing in application development (“application modernization”). No, I am not saying “learn cloud”.
But to focus on the commoditization angle: when I was looking for a job in late 2023, after being Amazoned, I submitted literally hundreds of applications as a Plan B. Each open req had hundreds of applicants, and my application, let alone my resume, was viewed maybe 5 times (LinkedIn shows you this).
My Plan A of using my network and targeted outreach did result in three offers within three weeks.
The same pattern emerged in 2024 when I was out looking again.
I’m in the interviewer pool at my current company; our application-to-offer rate is 0.4%.
[1] I am referring to the enterprise dev market, where most developers in the US work.
However, that worry is replaced by the fear that so many people could lose their jobs that a consequence could be a complete collapse of the social safety net that is my only income source.
Before this I was a JavaScript developer. I can absolutely see AI replacing most JavaScript developers. The culture felt strangely rote, with most people completely terrified to write original code. Everything had to be a React template with a ton of copy/paste. Watch the emotional apocalypse when you take React away.
And I think you're right: cross-functionality is super important. That's why I think the next consolidation is going to roll up into product development. The product developers who can use AI and manage the full stack are going to be successful, but I don't know how long that will last.
What's even more unsettling to me is that it's probably going to end up completely different in a way nobody can predict. Your predictions and mine might both be completely wrong, which is par for the course.
1) Nearly all the job losses I've dealt with came when a company ran low on money, because it cost too much or took too long to build a product or get it to market.
2) LLMs are in the sweet spot of doing the things I don't want to do (writing correct algorithms from known patterns, sifting through 2,000-line logs) while staying out of what I'm good at (business cases, feature prioritization, juice). Engineering work now involves more fact-checking and "data sheet reading" than it used to, which I'm happy to do.
3) Should programming jobs be killed, there will be more things to sell, and more roles for business/product owners. I'm not at all opposed to selling the things the AI is making.
4) Also, Gustafson's Law. All the cloud stuff led to things like Facebook and Twitch, which created a ton more jobs. I don't believe we'll see roles like "vibe code fixer", but we'll probably see things like robotics running on a low-latency LLM brain, which unlocks a whole host of new engineering challenges. In 10 years it might be the norm to build household bots, with people coding apps for how they vacuum the house and wipe the windows.
5) I don't take a high salary. The buffer between company profit and my cost is big enough that they don't feel the need to squeeze every drop out of me. They make more profit paying me, the AI, and my colleagues than they would paying the AI alone.
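For anyone unfamiliar with the law invoked in point 4, here is the standard textbook statement (the symbols are the conventional ones, not anything from this thread):

```latex
% Gustafson's scaled speedup on N processors, where s is the
% serial fraction of the workload (0 <= s <= 1):
S(N) = s + (1 - s)\,N = N + (1 - N)\,s
```

The law's point is that added capacity tends to get absorbed by bigger problems rather than by finishing the same problem sooner, which is the analogy being drawn here: cheaper code tends to mean more software gets built, not necessarily fewer people building it.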