Readit News
Posted by u/johnwheeler 2 months ago
Ask HN: Are you afraid of AI making you unemployable within the next few years?
On Hacker News and Twitter, the consensus view is that no one is afraid. People concede that junior engineers and grad students might be the most affected, but they still seem to regard their own situations as sustainable. My question is: is this just wishful thinking and human nature trying to combat the inevitable? The reason I ask is that I seriously don't see a future where there are lots of programmers anymore. I see mass unemployment for programmers. People are in denial, and all of these claims that AI can't write code without making mistakes stop being valid the moment a model is released, potentially overnight, that writes flawless code. Claude 4.5 is a good example. I just don't see any valid argument that the technology won't get to a point where it makes the job irrelevant, or if not irrelevant, then completely changes the economics.
chunkmonke99 · a month ago
I don't disagree; writing code will probably no longer be a thing in the near future. This is probably also true for all knowledge work (math, design, etc.), which is to say anything that can be "reduced" to mechanical transformations on symbols, including music gear design, design of plumbing fixtures, tooling jigs (CAD work), and so on. It is all basically transforming one set of discrete symbols into another, stringing them together, recombining them, and so forth. I wouldn't call that a doomer take either. But yes, the "Claude 4.5 still makes mistakes" argument is played-out "remainder humanism", John Henry vs. the machine. I fully expect the value of software products to go to zero, with a whole bunch of money being funneled into one of the AI companies. It is a scary time to be around. I would stop learning coding and/or any specific framework or technology.
uberman · 2 months ago
I use Claude 4.5 almost every day. It makes mistakes every day. The worst mistakes are the ones that are not obvious; only by careful review do you see the flaws. At the moment, even the best AI can't be relied on even for a modest refactoring. What AI does at the moment is make senior developers worth more and junior developers worth less. I am not at all worried about my own job.
johnwheeler · 2 months ago
Thank you for your response. This is exactly the type of commentary I'm talking about. The key phrase is "at the moment." It's not that developers will be replaced outright; what I think is that there will be far less need for them.

I think the flaws are going to be solved, and if that happens, what do you think? I do believe there needs to be a human in the loop, but I don't think there need to be humans, plural. Eventually.

I believe this is denial. The statement that the best AI can't be relied on for even a modest refactoring is not correct. Yes, it can. What it currently cannot do is write a full app from start to finish, but they're working on longer task execution. And this is before any of the big data centers have even been built. What happens then? You get the naysayers who say, "Well, the scaling laws don't apply," but there are a lot of people who think they do.

ThrowawayR2 · 2 months ago
If anybody who disagrees with your assessment is "in denial (sic)", why should people bother responding to your question seriously?
uberman · 2 months ago
The best AI (and I do believe that Claude is one of the best) is able to hold a conversation, maintain context, and respond to simple requests. The key is understanding that not every dev knows what questions to ask or when the answers are bad. Call it delusional if you like, but I don't see that changing any time soon, if ever.
cjs_ac · 2 months ago
The AI providers' operations remain heavily subsidised by venture capital. Eventually those investors will turn around and demand a return on their investment. The big question is whether, when that happens, LLMs will be useful enough to customers to justify paying the full cost of developing and operating them.

That said, in the meantime, I'm not confident that I'd be able to find another job if I lost my current one, because I have to compete not only against every other candidate but also against the ethereal promise of what AI might bring in the near future.

raw_anon_1111 · 2 months ago
Google has one of the best models, its own hardware and doesn’t depend on venture capital. Between its own products and GCP, they will be fine. The same with Amazon and Microsoft.

I just don’t see OpenAI being long term viable

wrxd · 2 months ago
As much as I would like my job to be exclusively about writing code, the reality is that the majority of it is:

- talking to people to understand how to leverage their platform and to get them to build what I need

- working in closed-source codebases. I know where the traps and the footguns are; Claude doesn't

- telling people no, that's a bad idea. Don't do that. This is often more useful than a "you're absolutely right" followed by the perfect solution to the wrong problem

In short, I can think and I can learn. LLMs can’t.

SonOfKyuss · 2 months ago
> telling people no, that's a bad idea. Don't do that. This is often more useful than a "you're absolutely right" followed by the perfect solution to the wrong problem

This one is huge. I’ve personally witnessed many situations where a multi-million dollar mistake was avoided by a domain expert shutting down a bad idea. Good leadership recognizes this value. Bad leadership just looks at how much code you ship

Oras · 2 months ago
If some people are going to do whatever they want regardless, then it doesn't matter whether the advice is coming from a human expert or an AI.
Oras · 2 months ago
Well, with things like skills and proper memory, these models can become better. Remember two years ago, when AI coding wasn't even a thing?

You're right that it won't replace everyone, but businesses will need fewer people for maintenance.

johnwheeler · 2 months ago
Right. I think in the near term the worry isn't about replacing people wholesale, but about replacing many or most people and causing serious economic disruption. In the limit, you would have a CEO who commands the AI to do everything, but that seems less plausible.
raw_anon_1111 · 2 months ago
There are two issues at play: AI and commoditization. AI is just making the commoditization problem worse.

I predicted commoditization happening back in 2016, when I saw that no matter what I learned, it was going to be impossible to stand out from the crowd on the enterprise dev side of the market or to demand decent top-of-market raises.[1]

I knew back then that the answer was going to be filling in the gaps with soft skills, managing larger more complex problems, being closer to determining business outcomes, etc.

I pivoted into customer facing cloud consulting specializing in application development (“application modernization”). No I am not saying “learn cloud”.

But focusing on the commoditization angle: when I was looking for a job in late 2023, after being Amazoned, my Plan B was submitting literally hundreds of applications. Each open req had hundreds of applicants, and my application, let alone my resume, was viewed maybe 5 times (LinkedIn shows you).

My Plan A, using my network and targeted outreach, did result in three offers within three weeks.

The same pattern emerged in 2024 when I was out looking again.

I'm in the interviewer pool at my current company; our application-to-offer rate is 0.4%.

[1] I am referring to the enterprise dev market where most developers in the US work

mikewarot · 2 months ago
Having been yeeted out of the labor market by long covid, my worries about my own employment are settled.

However, that worry is replaced by the fear that so many people could lose their jobs that a consequence could be a complete collapse of the social safety net that is my only income source.

austin-cheney · 2 months ago
I am in management of enterprise API development. AI might replace coders, but it won't eliminate people who can work between teams and make firm decisions that drive complex projects forward. Many developers appear to struggle with this, and when completely lost they waste effort on building SOPs instead of just formulating an original product.

Before this I was a JavaScript developer. I can absolutely see AI replacing most JavaScript developers. It felt really autistic with most people completely terrified to write original code. Everything had to be a React template with a ton of copy/paste. Watch the emotional apocalypse when you take React away.

johnwheeler · 2 months ago
When I started, there used to be database analysts and server administrators. There still are, but there are far fewer of them because developers have mostly taken on those roles.

And I think you're right. Being cross-functional is super important. That's why I think the next consolidation is going to roll up into product development. Basically, the product developers who can use AI and manage the full stack are going to be successful, but I don't know how long that will last.

What's even more unsettling to me is it's probably going to end up being completely different in a way that nobody can predict. Your predictions, my predictions might be completely wrong, which is par for the course.

muzani · 2 months ago
If anything, I feel it makes my career more secure.

1) Nearly all the job losses I've dealt with happened when a company ran low on money, because it cost too much or took too long to build a product or get it to market.

2) LLMs are in the sweet spot of doing the things I don't want to do (writing flawless algorithms from known patterns, sifting through 2000-line logs) while staying out of the things I'm good at (business cases, feature prioritization, juice). Engineering work now involves more fact-checking and "data sheet reading" than it used to, which I'm happy to do.

3) Should programming jobs be killed, there will be more things to sell. And more roles for business/product owners. I'm not at all opposed to selling the things that the AI is making.

4) Also, Gustafson's Law (see the quick formula note at the end of this comment). All the cloud stuff led to things like Facebook and Twitch, which created a ton more jobs. I don't believe we'll see things like "vibe code fixer", but we'll probably see things like robotics running on a low-latency LLM brain, which unlocks a different host of engineering challenges. In 10 years it might be the norm to create household bots, and people might be coding apps based on how they vacuum the house and wipe the windows.

5) I don't take a high salary. The buffer between company profit and my costs is big enough that they don't feel the need to squeeze every drop out of me. They make more profit paying me, the AI, and my colleagues than they would paying just the AI.
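
For anyone who hasn't run into it, Gustafson's Law is the scaled-speedup counterpart to Amdahl's Law: it assumes the workload grows to fill the available capacity instead of staying fixed, so speedup stays close to linear in the number of processors. Roughly, with N processors and a serial fraction s of the work, the scaled speedup is

S(N) = s + (1 - s) * N

I'm using it loosely as an analogy here, not as a literal formula for the job market: historically, cheaper compute has meant more and bigger software getting built, not the same software built by fewer people.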