Readit News
alexgotoi · 3 days ago
The thing people miss in these “replace juniors with AI” takes is that juniors were never mainly about cheap hands on keyboards. They’re the only people in the org who are still allowed to ask “dumb” questions without losing face, and those questions are often the only signal you get that your abstractions are nonsense.

What AI does is remove a bunch of the humiliating, boring parts of being junior: hunting for the right API by cargo-culting Stack Overflow, grinding through boilerplate, getting stuck for hours on a missing import. If a half-decent model can collapse that search space for them, you get to spend more of their ramp time on “here’s how our system actually fits together” instead of “here’s how for-loops work in our house style”.

If you take that setup and then decide “cool, now we don’t need juniors at all”, you’re basically saying you want a company with no memory and no farm system – just an ever-shrinking ring of seniors arguing about strategy while no one actually grows into them.

Always love to include a good AI x work thread in my https://hackernewsai.com/ newsletter.

rsanek · 3 days ago
"without losing face"? What culture are you referring to? The Western companies I have worked at do not discourage such questions -- in fact, it's often the sign of someone very senior when they ask a seemingly 'dumb' question that others have taken for granted.
socketcluster · 3 days ago
Yep, I fully agree with this view, and I find that it's seniors who ask the 'dumb' questions. Everyone is worried about losing face in this precarious economy... But seniors are able to ask really smart questions as well, so even their dumb questions sound smart... They can usually spin a dumb question into a smart one by going one level deeper and bringing nuance into the discussion. This may be difficult for a junior to do.

My experience as a team lead working with a lot of juniors is that they are terrified of losing face and tend to talk a big game. As a team lead, I try to use language which expresses any doubts or knowledge gaps I have so that others in my team feel comfortable doing it as well. But a key aspect is that you have to really know your stuff in certain areas because you need to inspire others to mirror you... They won't try to mirror you if they don't respect you, based on your technical ability.

You need to demonstrate deep knowledge in some areas and excellent reasoning abilities before you can safely ask dumb questions IMO. I try to find the specific strengths and weaknesses of my team members. I give constructive criticism for weaknesses but always try to identify and acknowledge each person's unique superpower; what makes them really stand out within the team. If people feel secure in their 'superpower', then they can be vulnerable in other areas and still feel confident. It's important to correctly identify the 'superpower' though, because you don't want a junior honing a skill they don't naturally possess, and you don't want them calling shots when they should be asking for help.

andrewmutz · 3 days ago
Completely agree with this. I got to work closely with an IBM fellow one summer and I was impressed by his willingness to ask "dumb questions" in meetings. Sometimes he was genuinely out of the loop, but more often he was questioning assumptions that others in the room had left unquestioned.
lvspiff · 3 days ago
100% agree, and I'd give more credit if I could!

Even as a lead, I ask the dumb question when no one else does. When I can see the look on people's faces, or realize no one is chiming in, the dumb question is needed to make sure the point actually lands for everyone. I've never been met with any sort of looking down upon, nor do I discourage my staff - quite the opposite - I champion them for being willing to speak up.

darth_avocado · 3 days ago
It depends on the company and the people around you. At one company, my quarterly feedback was that I didn't ask many questions in meetings, which was mostly because the project was pretty straightforward and the requirements were hammered out in a document. At another company, asking questions got me the feedback that I was maybe not experienced enough to manage the project by myself, which I was completely capable of doing. It's a double-edged sword.

But yes, on a personal level, being senior enough in my career, I'd rather be thought of as less skilled for asking questions before the s hits the fan than execute and mismanage a project I didn't ask enough questions about. The latter has more consequences tbh.

reactordev · 3 days ago
Company culture. Some companies I worked for would fire you for questioning decisions. Others welcomed criticism. You don't really know which environment you're in until someone says something. Are you going to take the risk and be the first?
rukuu001 · 3 days ago
Think of highly competitive environments where looking foolish can be weaponised against you. They definitely exist here (my experience in the UK and Australia).
baxtr · 3 days ago
I am a bit more senior nowadays.

Whenever I don’t understand something I say something like: "Uh, I’m probably the only one here, but I don’t get it…"

xeromal · 3 days ago
Yup, my Sr. Director boss often says "I'm an idiot - can you tell me what 'X' means?" when most of us probably wanted to know but were too afraid to ask.
Spooky23 · 3 days ago
There are a lot of bad places to work, and those are the types of places that do things like replace junior devs with AI.

The place I work at is in the middle of a new CEO’s process of breaking the company. The company can’t go out of business, but we’ll set stuff on fire for another 12-18 months.

tracker1 · 3 days ago
That's been my experience as someone who tends to ask them regularly. I don't have a lot of hubris when it comes down to it, so I'll almost always ask. Especially acronyms or industry/insider business terms... I'll usually do a quick search in a browser, but if the results don't make sense, I'll simply ask.

Asking stupid questions almost goes hand in glove with "it's easier to ask forgiveness than permission." A lot of times, you're better off just doing something. Asking a simple question or making something happen saves a lot of grief more often than not. Understanding is important.

simsla · 3 days ago
I don't think that's the same. I spitball crazy ideas, but my core knowledge/expertise is sound, and I try not to talk out of my ass. (Or I am upfront when I'm outside my area of expertise. I think it's important to call that out once your word starts carrying some weight.)

A product manager can definitely get away with saying things that would make me lose a bit of respect for a fellow senior engineer.

I can also see how juniors have more leeway to weigh in on things they absolutely don't understand. Crazy ideas and constructive criticism are welcome from all corners, but at some level I also start expecting some more basic competence.

TeMPOraL · 2 days ago
There are kinds of questions you can ask that signal your seniority and maturity. There are other kinds of questions that, should you ask them, will leave people wondering what the hell you've been doing for the past N years and why they're paying you a senior-level salary.

A lot of early signs of problems, such as critical information becoming tribal knowledge instead of being documented, are revealed by asking the second kind of question.

awesome_dude · 3 days ago
My entire career:

"Why the f*ck are you asking, you should know this"

or

"Why the f*ck can you not look that up"

edit: There's an entire corner of the internet devoted to "LMGTFY" responses to people who ask questions.

or

"Isn't it f*cking obvious"

or

"Do I have to f*cking spell it out for you"

There's a strong chance that I am autistic, which means, yes, I need people to be (more) explicit.

AI has done a hell of a good job making it easier for me to search for the subtext that I typically miss. And I receive less of the negative feedback now, because when I do ask something, it's something that actually helps.

ben_w · 3 days ago
Company culture != national culture != personal culture. Such things can be all over the place.

I've worked with people from Korea who took me 100% seriously when I said the architecture was too complex and hard to work with, slowing down velocity, and although they did end up keeping it, there was no outward indication of lost face and they did put in the effort to justify their thinking.

I've also worked with some British, Greek, and Russian people who were completely unwilling to listen to any feedback from coworkers, only order them around.

Even within a person: I know a self-identified communist, who is very open minded about anything except politics.

tpoacher · 3 days ago
My cousin, when he got his first job, managed to wipe the database clean on his first day at work. (Classic, I know.)

The seniors were very understanding, and more importantly it raised important questions about backups, dev vs prod pipelines, etc.

But you can bet my cousin was super embarrassed by it, and saving face was a big part of it.

mv4 · 3 days ago
I found this varies.

Meta? Ask questions anytime.

Amazon? Not so much.

Deleted Comment

doctaj · 3 days ago
There’s also the benefit of being naive - like, juniors can be seriously audacious when they haven’t been burned a million times. I miss having excitement and optimism.
Kiro · 2 days ago
It doesn't matter if the culture encourages it, you still don't want to ask dumb questions.
apercu · 3 days ago
Culture varies.

Deleted Comment

sharkweek · 2 days ago
Best VP I’ve ever had would stop meetings with regular frequency and say, “maybe I’m the dumbest person here, but I don’t understand [insert something being discussed], can you help me get a better understanding?”

It was anybody’s guess if they really didn’t understand the topic or if they were reading the room, but it was always appreciated.

kelipso · 3 days ago
I don't know... this seems like one of those things that is admired on HN but that I don't see in any of the multiple US companies I've worked in. People are definitely concerned with looking dumb. "Losing face" may be something people here attribute to East Asian cultures, but in practice it works similarly here too.
lanstin · 3 days ago
I have worked at a place where people were routinely criticized for asking basic questions on a big all-dev DL (which was archived and searchable, so the answers actually added to a growing record). The preferred solution was to ask a co-worker on the same team. People who answered a lot of questions were also criticized for being helpful. In neither case did the criticism come so much from devs as from managers, delivered directly to people in feedback from their boss. Also, the place had a problem spreading a good culture and a common technical vision to new people, for some reason (/s).
giancarlostoro · 3 days ago
> They’re the only people in the org who are still allowed to ask “dumb” questions without losing face

I strongly disagree; a senior who cannot ask a "dumb question" is a useless developer to me. Ask away, we want to figure things out, we're not mind readers. Every good senior developer I know asks questions and admits when they don't know things. This is humility, and in my eyes it is a strong marker of a good Senior Developer. The entire point of our job is to ask questions (most often to ourselves) and figure out the answers (the code).

Juniors could or should ask more questions, and by the time you're a Senior, you're asking key questions no matter how dumb they sound.

cloudfudge · 3 days ago
You're making the same point as the person you're responding to. They're saying seniors are allowed to ask dumb questions. It's juniors who are often afraid to do so.
johnfn · 3 days ago
> The thing people miss in these “replace juniors with AI” takes is that juniors were never mainly about cheap hands on keyboards. They’re the only people in the org who are still allowed to ask “dumb” questions without losing face, and those questions are often the only signal you get that your abstractions are nonsense.

This seems almost entirely wrong to me? That anyone, at any level of seniority, can ask "dumb questions" and give signal about "nonsense abstractions" seems a property of any healthy organization. That only juniors can do this doesn't just seem wrong, it seems backwards. I would expect seniors to have the clearest idea on whether abstractions make sense, not juniors.

t43562 · 2 days ago
People who are new to the business should be able to challenge the assumptions that the business has built up over time and ceased to question.

They are also the most insecure, however, not knowing who will be annoyed, shown up, or embarrassed by a question that suggests some past decisions were wrong.

tyre · 3 days ago
This is a really good and underappreciated point. My recommendation to mid-level, senior, and staff engineers is to keep questioning decisions and create a culture where that's encouraged.

Junior devs do that naturally (if you have the culture) because they don't already know anything. It's great.

47928485 · 3 days ago
> My recommendation to mid-level, senior, and staff engineers is to keep questioning decisions and create a culture where that’s encouraged.

Tell me you've never worked at FAANG without telling me you've never worked at FAANG...

ebiester · 3 days ago
So, I think there are two models.

One is a "one junior per team" model. I endorse this for exactly the reasons you describe.

Another, as I recently saw, was a 70/30 model of juniors to seniors. You turn your seniors into task delegators and put all implementation on the junior developers. This creates "up or out" pressure and leaves very few mentorship opportunities. If 70% of your engineers have under 4 years of experience, it can be a rough go.

jorvi · 3 days ago
That second model is basically the hospital model.

You have 1 veteran doctor overseeing 4 learning doctors. Operating rooms do this, for example: there will be 4 operating rooms with 4 less experienced anesthetists, plus 1 very experienced anesthetist who rotates between the 4 and is on call for when shit hits the fan.

Honestly I think everyone here is missing the forest for the trees. Juniors' main purpose isn't to "ask questions", it's to turn into capable seniors.

That's also why the whole "slash our junior headcount by 3/4ths" we are seeing across the industry is going to massively, massively backfire. AI / LLMs are going to hit a wall (well, they already hit it a while ago), suddenly everyone is scrambling for seniors, and there are none because no one wanted to bear the 'burden' of training juniors into seniors. You think dev salaries are insane now? Wait until 4-5 years from now.

tetha · 3 days ago
In my operational team, I'm following a third model, inspired by German trade workers. You have juniors, journeymen, and masters. Juniors are generally clueless and need to be told what to do, specifically. This is very much the level of "Here are 28 marks that need bolts placed in the concrete; make it so, and I can explain why." Journeymen should figure out a plan for how to solve a project and challenge it with the master to see if it fits the master's quality requirements.

And practically, you can have one or two journeymen per master. These 2-3 people can in turn support 3-4 more juniors in supplying useful work.

This also establishes a fairly natural growth of a person within the company. First you do standard things as told. Then you start doing projects that mostly follow a standard that has worked in the past. And then you start standardizing projects.

lanstin · 3 days ago
My first big job was the 1 junior per team; those years were extremely good for learning how to design and write high performance services. Since then, I've mostly been at the 70/30 places where I'm considered senior. Occasionally I just sit down and blast out a big software project, just to feel I am still able, but mostly I tend the garden hoping that a few of the fragile stems will survive and grow into mighty oaks.
AdrianB1 · 3 days ago
Given the subjective view of what a junior is, I think the 70/30 - or higher - model is used in every company I have ever interacted with. For this evaluation I consider junior = someone with fewer skills than needed to do the job autonomously, who requires direction and supervision most of the time, and senior = someone who can work autonomously.
citizenpaul · 3 days ago
The real thing people miss is not AI replacing juniors. It's that senior management soured on hiring juniors a few years before AI, across almost all industries.

AI is now just the scapegoat for an economy-wide problem. Execs found "one neat trick": piling junior work on seniors until they quit, while not hiring replacements, in order to goose short-term profits. Now every company is in the same position, where hiring a senior really means hiring 5 seniors to replace the one who had 5 jobs layered on over a few years. This is of course impossible for any mortal to jump into. Now they also don't even have juniors to train up to senior level.

flyinglizard · 3 days ago
Good juniors are also great at just working. Usually they have no family, so they are able to put a lot of attention into work, and they have that innocent curiosity and can-do attitude which brings a lot of positive energy.
never_inline · 2 days ago
> put in a lot of attention into work and they have that innocent curiosity

They're also good at putting company code into ChatGPT.

/snark

sailfast · 3 days ago
I think the biggest challenge now becomes how more seasoned engineers teach juniors. The AI makes the ramp a lot easier but you still do best when you understand the whole stack and make mistakes.

It's damned near impossible to figure out where to spend your time wisely correcting an assumption made by a human vs. by an AI on a blended pull request. All of the learning that happens during PR review is at risk in this way, and I'm not sure where we will get it back yet. (Outside of an AI telling you - which, to be fair, there are some good review bots out there.)

startupsfail · 3 days ago
Junior engineers now learn from AIs. And AIs now learn from RL cost functions. And RL cost functions are being set by PhDs, with little to no production grade engineering experience ;)

The result is interesting. First, juniors are miserable. What used to be the good experience of coding and debugging in a state of flow is now anxiously waiting to see whether an AI can do it or not.

And senior devs are also miserable; having apprentices used to be fun and profit, working with someone young is uplifting, and now that is gone.

The code quality is going down, Zen cycle interrupted, with the RL cost functions now at the top.

The only ones who are happy are hapless PhDs ;$

HeavyStorm · 3 days ago
I sense a lot of hate/baggage in the subtext of this post.

Really, juniors are only important because they ask "dumb" questions that can help remove useless abstractions? That's your take?

hiddencost · 3 days ago
Yikes. Sounds like you work for a toxic company. The mid and senior level engineers I know all go out of their way to ask the dumb questions. Every junior employee I've mentored, I've told them the main way they can fail is not asking questions early and often. Gotta build a culture that supports questions.
frtime2025 · 3 days ago
AI will replace jobs. People are putting their IT/dev budget into something, which means something else will be cut.

I also don't believe juniors, kids, seniors, staff, principals, or distinguished/fellow engineers should be replaced by AI. I think they WILL be, but they shouldn't be. AI at the Gemini 3 Flash / Claude Opus 4.5 level is capable, with help and review, of doing a lot of what a lot of devs do currently. It can't do everything and will fail, but if the business doesn't care, they'll cut jobs.

Don’t waste time trying to argue against AI to attempt to save your job. Just learn AI and do your job until you’re no longer needed to even manage it. Or, if you don’t want to, do something else.

marcosdumay · 3 days ago
> People are putting their IT/dev budget into something, which means something else will be cut.

That's not how things work in normal times.

But normal times require minimally capable managers, a somewhat competitive economy, and some meritocracy in hiring. I can believe that's how things will work this time, but it's still a stupid way to do it.

Deleted Comment

numpad0 · 2 days ago
> you’re basically saying you want a company with no memory and no farm system – just an ever-shrinking ring of seniors arguing about strategy while no one actually grows into them.

Isn't that also the explicit aim of AI replacement, as stupid as it sounds? To me, the idea appears to be the separation of money and work, so that the economy becomes strictly the concern of a metaphorical upper floor, floating in thin air.

marcosdumay · 3 days ago
Yes, the most helpful thing AIs do is guide people through popular environments they don't know very well.

Or in other words, the people who get the most value from AI are junior devs, since there are still plenty of popular environments they don't know very well. It's also useful for seniors who are starting something in a new environment, but those only get 1 or 2 novel contexts at a time, while everything is new for a junior.

Or, in yet another set of words, AI enables juniors to add more value more quickly. That makes them more valuable, not less.

TheGRS · 3 days ago
I don't agree that this is the central value juniors provide. It's a nice tertiary value, but not why one hires them. I think the value is the latter part: farming new talent and growing your team.

I still think the central issue is the economy. There are more seniors available to fill roles, so filling out junior roles is less desirable. And perhaps "replacing juniors with AI" is just the industry's way of clumsily saving face.

PunchyHamster · 3 days ago
I don't think many orgs learn all that much from coaching their juniors, at least after the first few.

Juniors are just... necessary in the balance. Have too few of them and mid and senior devs will get more and more expensive, so you hire a bunch of juniors, they get trained on the job, and it balances out.

Hell, if the company does it right, they might underpay a junior-turned-senior for a decade before they notice and look at what industry pay looks like!

veunes · 2 days ago
I agree, the routine is gone, but at what cost? Understanding "how our system fits together" means solving problems, reading code, and debugging. If those fundamental skills aren't built through "humiliating and boring" tasks, how will a junior understand how the system actually works, not just how it appears to work?
Waterluvian · 3 days ago
That's a perplexing take based on how I've experienced the past 15 years: the more senior someone is, the more questions they tend to ask.
throwaway894345 · 2 days ago
I think the main benefit of junior devs is that it’s the only pipeline for getting senior devs. The other benefit over AI is that most of software engineering is not writing the code, but doing all of the other stuff like working out what to build, flagging concerns, operating the software once it is running, etc.
SecretDreams · 2 days ago
I'll also add the obvious answer in that real companies constantly have seniors leaving/retiring. Juniors are meant to be trained up to be the future seniors of the company. You should consistently feed and grow this pipeline unless you think you won't exist in the future or AI will replace all jobs, period.
never_inline · 2 days ago
If that was the case I would be chilling during my junior years.

Juniors are usually given either grunt or low priority work while seniors get more "important" work.

OTOH, it takes a lot to get your questions in front of the RIGHT EARS when you're a junior, so I wouldn't agree with your characterization at all.

int_19h · 2 days ago
It really depends on the workplace. Some places will give juniors serious work items specifically to grow them.
jappgar · 2 days ago
I generally agree with you but AI confusion is also a good signal your abstractions are nonsense.

One problem there is that people would rather believe the AI is "dumb" than face the facts.

epgui · 3 days ago
I’m a senior engineer on a staff track, I am proud to ask “dumb” questions all the time, and I don’t want to work somewhere where I don’t feel safe pursuing knowledge openly and candidly.
dejj · 3 days ago
Agree.

> cargo-culting Stack Overflow

What do you mean by this? I understand “cargo-culting” as building false idols, e.g. wooden headphones and runways to attract airplanes that never come.

kjellsbells · 3 days ago
It means to copy code or instructions from a site into your own project without having any comprehension of how or why it works.

Example: you have a Windows problem. You search and read that "sfc /scannow" seems to be a popular answer to Windows problems. You run it without ever understanding what sfc does, whether the tool is relevant to your problem, etc. You are cargo-culting a solution.

PaulStatezny · 3 days ago
I think the idea is copy-pasting code snippets from StackOverflow without comprehension of whether (and how) the code fixes the problem.

Deleted Comment

rafterydj · 2 days ago
Plugging your AI newsletter at the end of your comment comes off as an indicator that you want to farm engagement, not genuinely stimulate conversation.
jppope · 3 days ago
This is actually a super power I have after spending my first part of my career in sales.

I was never formally trained, so I just keep asking "why" until someone proves it all the way. Sales itself is also a lot about asking questions that wouldn't otherwise come up, to find the heart of the thing people actually want... which is just another side of the same coin.

aerhardt · 3 days ago
I mean, that’s an interesting take, but “having people around to ask dumb questions” is not why most orgs hire juniors.
lingrush4 · 3 days ago
In my experience, juniors are absolutely terrified of asking any sort of question at all during a meeting. Senior engineers are far more likely to ask interesting, useful questions.

We hire juniors so that we can offload easy but time-consuming work on them while we focus on more important or more difficult problems. We also expect that juniors will eventually gain the skills to solve the more difficult problems as a result of the experience they gain performing the easy tasks.

If we stop hiring juniors now, then we won't have any good senior engineers in 5-10 years.

protocolture · 2 days ago
>They’re the only people in the org who are still allowed to ask “dumb” questions without losing face

This is the only role of executives, sales people, account managers. They usually do it with complete and utter confidence too. Vibe-questioning and vibe-instructing other people without a care in the world.

Dead Comment

deepGem · 3 days ago
What will eventually pan out is that senior devs will be replaced with junior devs powered by AI assistants, simply because of the reasons you stated. They will ask the dumb but important questions and then, after a while, will even solve them.

Now that their minds are free from routine and boilerplate work, they will start asking more 'whys' which will be very good for the organization overall.

Take any product - nearly 50% of the features are unused, and it's a genuine engineering waste to maintain those features. A junior dev spending 3 months on the code base with Claude Code will figure out these hidden unwanted features and cull them, or ask for them to be culled.

It'll take a while to navigate the hierarchy but they'll figure it out. The old guard will have no option but to move up or move out.

throwway120385 · 3 days ago
Why would Claude Code help you find unused features? The end customer uses features, not the AI. I would want to know from the end customer whether a feature is unused, and a junior with an LLM assistant is not going to be able to tell me that without adding new features to the code base.
alwa · 3 days ago
How do you suppose the old guard are filling their days now?

At some level, aren’t you describing the age-old process of maturing from junior to mid level to senior in most lines of work, and in most organizations? Isn’t that what advancing in responsibility boils down to: developing subtlety and wisdom and political credibility and organizational context? Learning where the rakes are?

I wish 3 months, or even 3 years, were long enough to fully understand the whys and wherefores and politics of the organizations I cross paths with, and the jungle of systems and code supporting all the kinds of work that happen inside…

simonw · 3 days ago
Relevant post by Kent Beck from 12th Dec 2025: The Bet On Juniors Just Got Better https://tidyfirst.substack.com/p/the-bet-on-juniors-just-got...

> The juniors working this way compress their ramp dramatically. Tasks that used to take days take hours. Not because the AI does the work, but because the AI collapses the search space. Instead of spending three hours figuring out which API to use, they spend twenty minutes evaluating options the AI surfaced. The time freed this way isn’t invested in another unprofitable feature, though, it’s invested in learning. [...]

> If you’re an engineering manager thinking about hiring: The junior bet has gotten better. Not because juniors have changed, but because the genie, used well, accelerates learning.

beAbU · 3 days ago
Isn't the struggling with docs and learning how and where to find the answers part of the learning process?

I would argue a machine that short circuits the process of getting stuck in obtuse documentation is actually harmful long term...

chaos_emergent · 3 days ago
Isn't the struggle of sifting through a labyrinth of physical books and learning how and where to find the right answers part of the learning process?

I would argue a machine that short-circuits the process of getting stuck in obtuse books is actually harmful long term...

Aurornis · 3 days ago
I recall similar arguments being made against search engines: People who had built up a library of internal knowledge about where and how to find things didn't like that it had become so easy to search for resources.

The arguments were similar, too: What will you do if Google goes down? What if Google gives the wrong answer? What if you become dependent on Google? Yet I'm willing to bet that everyone reading this uses search engines as a tool to find what they need quickly on a daily basis.

rafaelmn · 3 days ago
No, trying stuff out is the valuable process. How I search for information has changed (dramatically) in the 20 years I've been programming. My intuition about how programs work is still relevant - you'll still see graybeards saying "there's a paper from the 70s talking about that" for every "new" fad in programming, and they are usually right.

So if AI gets you iterating faster and testing your assumptions/hypotheses, I would say that's a net win. If you're just begging it to solve the problem for you with different wording - then yeah, you are reducing yourself to a shitty LLM proxy.

tencentshill · 3 days ago
The naturally curious will remain naturally curious and be rewarded for it, everyone else will always take the shortest path offered to complete the task.
marcosdumay · 3 days ago
> learning how and where to find the answers part of the learning process?

Yes. And now you can ask the AI where the docs are.

The struggling is not the goal. And rest assured there are plenty of other things to struggle with.

PaulKeeble · 3 days ago
The thing is, you need both. You need periods where you are reading through the docs, learning random things, and just expanding your knowledge, but the time to do that is not when you are trying to work out how to get a string into the right byte format and saved in the database as a blob (or whatever it is). Documentation has always had lots of different uses; the one that gets you answers to direct questions has improved a bit, but it's not really reliable yet, so you are still going to have to check it.
fireflash38 · 3 days ago
The problem isn't that AI makes obtuse documentation usable. It's that it makes good documentation unread.

There's a lot of good documentation where you learn more about the context of how or why something is done a certain way.

supersour · 3 days ago
I think if this were true, then individualized mastery learning wouldn't prove to be so effective

https://en.wikipedia.org/wiki/Mastery_learning

throwaway613745 · 3 days ago
The best part is when the AI just makes up the docs
pizza234 · 3 days ago
It really depends on what's being learned. For example, take writing scripts based on the AWS SDK. The API documentation is gigantic (and poorly designed, as it takes ages to load the documentation for each entry), and one uses only a tiny fraction of the APIs. I don't find "learning to find the right APIs" to be valuable knowledge; rather, I find "learning to design a (small) program/script starting from a basic example" valuable, since I waste less time on menial tasks (i.e. textual search).
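For what it's worth, the kind of "basic example" script being described could be as small as the sketch below - a minimal boto3 example where the function name and the bucket-listing task are purely illustrative, not something from the comment:

    import boto3  # AWS SDK for Python

    def list_bucket_names():
        """Return the names of all S3 buckets visible to the configured credentials."""
        s3 = boto3.client("s3")       # credentials come from the environment/config
        response = s3.list_buckets()  # one tiny corner of a gigantic API surface
        return [bucket["Name"] for bucket in response["Buckets"]]

    if __name__ == "__main__":
        for name in list_bucket_names():
            print(name)

The design lesson is in the shape of the script (client, call, transform the response), which transfers to the rest of the SDK far better than memorizing where each API lives in the docs.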
Ifkaluva · 3 days ago
No :)

Any task has “core difficulty” and “incidental difficulty”. Struggling with docs is incidental difficulty, it’s a tax on energy and focus.

Your argument is an argument against the use of Google or StackOverflow.

tayo42 · 3 days ago
If the docs are poorly written, then you're not learning anything except how to control frustration.
tikhonj · 3 days ago
Struggling with poorly organized docs seems entirely like incidental complexity to me. Good learning resources can be both faster and better pedagogically. (How good today's LLM-based chat tools are is a totally separate question.)
jimbokun · 3 days ago
Why?

If you can just get to the answer immediately, what’s the value of the struggle?

Research isn’t time coding. So it’s not making the developer less familiar with the code base she’s responsible for. Which is the usual worry with AI.

schainks · 3 days ago
Disagree. While documentation is often out of date, the threshold for maintaining it properly has been lowered, so your team should be doing everything it can to surface effective docs to devs and AIs looking for them. This, in turn, also lowers the barrier to writing good docs since your team's exposure to good docs increases.

If you read great books all the time, you will find yourself more skilled at identifying good versus bad writing.

jaapbadlands · 3 days ago
Feel free to waste your time sifting through a dozen wrong answers. Meanwhile the rest of us can get the answers, absorb the right information quickly then move on to solving more problems.
bigstrat2003 · 3 days ago
Yes, it is. And yes, it absolutely is harmful.
seanmcdirmid · 3 days ago
1965: learning how to punch your own punch cards is part of the learning process

1995: struggling with docs and learning how and where to find the answers is part of the learning process

2005: struggling with stackoverflow and learning how to find answers to questions that others have asked before quickly is part of the learning process

2015: using search to find answers is part of the learning process

2025: using AI to get answers is part of the learning process

...

lokar · 3 days ago
For an experienced engineer, working out the syntax, APIs, type issues, understanding errors, etc is the easy part of the job. Larger picture issues are the real task.

But for many Jr engineers it’s the hard part. They are not (yet) expected to be responsible for the larger issues.

bdangubic · 3 days ago
what is a larger issue? lacking domain knowledge? or lacking a deeper understanding of the years of shit in the codebase that seniors may understand better? where I work, there is no issue that is "too large" for a junior to take on; it is the only way that a "junior" becomes a "non-junior" - by doing, not by delegating to so-called seniors (I am one of them)
atomicnumber3 · 3 days ago
This is honestly what I (staff engineer) find AI the most useful for. I've been around the block enough that I typically know in general what I want, but I often find myself wanting it in a new framework or paradigm or similar, and if I could just ASK a person a question, they'd understand it. But not knowing the exact right keywords, especially in frameworks with lots of jargon, can still make it annoying. I can often get what I want by just sitting down and reading approximately 6 screen-heights of text out of the official docs on the general topic in question to find the random sentence 70% of the way down that answered my question.

But do you know what's really great at taking a bunch of tokens and then giving me a bunch of probabilistically adjacent tokens? Yeah, exactly! So often, even if the AI is giving me something totally bonkers semantically, just knowing all those tokens are adjacent enough gives me a big leg up in knowing how to phrase my next question, and of course sometimes the AI is also accidentally semantically correct too.

never_inline · 2 days ago
When I joined I could do all this.
ChuckMcM · 3 days ago
And this is always my question: "... because the genie, used well, accelerates learning." Does it though?

How are we defining "learning" here? The example I like to use is that a student who "learns" what a square root is can calculate the square root of a number on a simple 4-function calculator (×, ÷, +, −), if iteratively. Whereas the student who "learns" that the √ key gives them the square root is "stuck" when presented with a 4-function calculator. So did they 'learn' faster when the "genie" surfaced a key that gave them the answer? Or did they just become more dependent on the "genie" to do the work required of them?
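For concreteness, a minimal sketch of that "four-function" exercise, assuming Heron's (Newton's) method as the iteration; the function name and tolerance are illustrative, not from the comment:

    def sqrt_four_function(a, tolerance=1e-12):
        """Approximate the square root of a using only +, -, * and / (Heron's method)."""
        if a < 0:
            raise ValueError("square root of a negative number is not real")
        if a == 0:
            return 0.0
        guess = a if a >= 1 else 1.0               # any positive starting point converges
        while True:
            next_guess = (guess + a / guess) / 2   # average the guess with a/guess
            if abs(next_guess - guess) < tolerance:
                return next_guess
            guess = next_guess

    print(sqrt_four_function(2))  # 1.414213562373095...

Hand-simulating that loop on paper is roughly the exercise being described; the √ key skips exactly this loop.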

pests · 3 days ago
Some random musings this reminded me of.

I graduated HS in the mid-2000s and didn't start using a calculator for math classes until I was basically a junior in college. I would do every calculation by hand, on paper. I benefited from a great math teacher early on who taught me how to properly lay out my calculations and solutions on paper. I've had tests I've turned in where I spent more paper on a single question than others did on the entire test.

It really helped my understanding of numbers and how they interacted, and helped teachers/professors narrow down on my misunderstandings.

sailfast · 3 days ago
You still need to be curious. I learn a ton by asking questions of the LLMs when I see new things. “Explain this to me - I get X but why did you do Y?”

It's Diamond Age and a half - you just need to continue to be curious, and perhaps slow your shipping speed sometimes, to make sure you budget time for learning as well.

almosthere · 3 days ago
We had 3 interns this past summer - with AI I would say they were VERY capable of generating results quickly. Some of the code and assumptions were not great, but it did help us push out some releases quickly to alleviate customer issues. So there is a tradeoff with juniors. May help quickly get features out, may also need some refactoring later.
turnsout · 3 days ago
Interesting how similar this is to the tradeoff of using AI coding agents
fastball · 3 days ago
What makes them more capable than a senior engineer with three LLM agents?
gnerd00 · 3 days ago
first response from me "let me mention how the real business world actually works" .. let's add a more nuanced slice to that however

Since desktop computers became popular, there have been thousands of small to mid-size companies that could benefit from software systems.. A thousand thousand "consultants" marched off to their nearest accountant, retailer, small manufacturer or attorney office, to show off the new desktop software and claim ability to make new, custom solutions.

We know now that this did not work out for a lot of small to mid-size businesses and/or consultants. Few could build a custom database application that was "good enough" .. not for lack of trying.. but the pace of platforms, competitive features, stupid attention-getting features.. all of that outpaced small consultants .. the result is giant consolidation into basic Office software, not thousands of small systems custom built for small companies.

What now, in 2025? "Junior" devs do what? Design and build? No. Cookie-cutter procedures at AWS lock-in services far, far outpace small and interesting software designs.. Automation of AWS actions is going to be very much in demand.. is that a "junior dev"? or what?

This is a niche insight and not claiming to be the whole story.. but.. ps- insert your own story with "phones" instead of desktop software for another angle

mlloyd · 3 days ago
One thing I'd point out is that there are only so many ways to write a document or build a spreadsheet. There are a ton of business processes that are custom enough to that org that they have to decide to go custom, change their process, or deal with the inefficiency of not having a technical solution that accomplishes the goal easily.

Lotus Notes is an example of that custom software niche that took off and spawned a successful consulting ecosystem around it too.

nateglims · 3 days ago
I'm a little confused by this analysis. Are you saying that all enterprise software has been replaced with MS word and AWS?
sharemywin · 3 days ago
I think the big win with AI is being able to work around jargon. Don't know what a word means? Ask AI. Want the history on it? No problem. Don't understand a concept? Have it explained at a high-school reading level.
imiric · 3 days ago
I'm not swayed by appeals to authority, but this is a supremely bad take.

"AI" tools are most useful in the hands of experienced developers, not juniors. It's seniors who have the knowledge and capability to review the generated output, and decide whether the code will cause more issues when it's merged, or if it's usable if they tweak and adapt it in certain ways.

A junior developer has no such skills. Their only approach will be to run the code, test whether it fulfills the requirements, and, if they're thorough, try to understand and test it to the best of their abilities. Chances are that because they're pressured to deliver as quickly as possible to impress their colleagues and managers, they'll just accept whatever working solution the tool produces the first time.

This makes "AI" in the hands of junior developers risky and counterproductive. Companies that allow this type of development will quickly grind to a halt under the weight of technical debt, and a minefield of bugs they won't know how to maneuver around.

The unfortunate reality is that with "AI" there is no pathway for junior developers to become senior. Most people will gravitate towards using these tools as a crutch for quickly generating software, and not as a learning tool to improve their own skills. This should concern everyone vested in the future of this industry.

versteegen · 2 days ago
> A junior developer has no such skills. Their only approach will be to run the code, test whether it fulfills the requirements, and, if they're thorough, try to understand and test it to the best of their abilities.

This is also a supremely bad take... well, really it's mainly the way you worded it that's bad. Juniors have skills, natural aptitudes, as much intelligence on average as other programmers, and often even some experience; what they lack is work history. They sure as hell are capable of understanding code rather than just running it. Yes, of course experience is immensely useful, especially for understanding how to achieve a maintainable and reliable codebase in the long term, which is obviously of special importance, but long experience is not a hard requirement. You can reason about trade-offs, learn from advice, learn quickly, etc.

golly_ned · 3 days ago
> Instead of spending three hours figuring out which API to use, they spend twenty minutes evaluating options the AI surfaced

This really isn't the case from what I've seen. It's that they use Cursor or other code generation tools integrated into their development environment to generate code, and if it's functional and looks from a fuzzy distance like 'good' code (in the 'code in the small' sense), they send an oversized PR, and it's up to the reviewer to actually do the thinking.

simonw · 2 days ago
That's bad and those juniors should be taught to do better or be "managed out of the company".

Their job is to deliver code that they have proved to work.

This inspired me to write a longer form version of this: Your job is to deliver code you have proven to work https://simonwillison.net/2025/Dec/18/code-we-have-proven-to...

rustystump · 3 days ago
This. I have seen MRs with generated OpenCV LUT mapping in them because a junior didn't understand that what they needed was a simple interpolation function.

The crux is always that you don't know what you don't know. AI doesn't fix this.
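As an illustration (a hypothetical stand-in, since the actual MR isn't shown), the kind of "simple interpolation function" being contrasted with a generated OpenCV LUT pipeline could be as small as:

    def lerp(x, x0, y0, x1, y1):
        """Linearly interpolate a value for x between the points (x0, y0) and (x1, y1)."""
        if x1 == x0:
            return y0                    # degenerate interval: fall back to the left endpoint
        t = (x - x0) / (x1 - x0)         # position of x within [x0, x1], scaled to 0..1
        return y0 + t * (y1 - y0)

    print(lerp(0.5, 0.0, 10.0, 1.0, 20.0))  # 15.0

The point being that a reviewer can verify five lines like this at a glance, whereas a generated LUT pipeline hides the same idea behind machinery the junior can't explain.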

lanfeust6 · 3 days ago
Search is easily the best feature of AI/LLMs.
alpha_squared · 3 days ago
I kind of agree here. The mental model that works for me is "search results passed through a rock tumbler". Search results without attribution and mixed-and-matched across reputable and non-reputable sources, with a bias toward whatever source type is more common.
sublinear · 3 days ago
That's arguably all it ever was. Generating content using AI is just finding a point in latent space.
gosub100 · 3 days ago
Which was trained on a pre-AI internet. What's going to happen in the coming years when new tech comes out but perhaps isn't documented the same way anymore? It's not an unsolvable problem, but we could see unintended consequences, like, say, having to pay the AI provider to ingest your data - similar to buying ad space or AdSense or whatever they call it for search engines.

Deleted Comment

xp84 · 3 days ago
> the genie, used well, accelerates learning.

Ehh... 'used well' is doing some very heavy lifting there. And the incentive structure at 90% of companies does not optimize for 'using it well.'

The incentive is to ship quickly, meaning aim the AI-gun at the codebase for a few hours and grind out a "technically working" solution, with zero large-scale architecture thought and zero built-up knowledge of how the different parts of the application are intended to work together (because there was no "intention"). There will be tests, but they may not be sensible and may be very brittle.

Anyway, deploying a bunch of fresh grads armed not with good mentorship but with the ability to generate thousands of LOC a day is a recipe for accelerating the collapse I usually see in startup codebases about 6-8 years old. This is the point where the list of exceptions to every supposed pattern is longer than the list of things that follow the patterns, and where each bug, when properly pursued, leads to a long chain of past bad decisions, each of which would take days of effort to properly unwind (and that unwinding will also have a branching effect on other things). Also, coincidentally, this is the point where an AI agent is the most useless, because they really don't expect all the bizarre quirks in the codebase.

Am I saying AI is useless? No, it's great for prototyping and getting to PMF, and great in the hands of someone who can read its output with a critical eye, but I wouldn't combine it with inexperienced users who haven't had the opportunity to learn from all the many mistakes I've made over the years.

SkyPuncher · 3 days ago
*Some juniors have gotten better.

I hate to be so negative, but one of the biggest problems junior engineers face is that they don't know how to make sense of or prioritize the glut of new-to-them information in order to make decisions. It's not helpful to have an AI reduce the search space, because they still can't narrow down the last step effectively (or independently).

There are junior engineers who seem to inherently have this skill. They might still be poor at finding all the necessary information, but once they have it, they can make the final, critical decision. Now, with AI, they've largely eliminated the search problem, so they can focus more on the decision making.

The problem is that it's extremely hard to identify which type someone is. It's also something that senior-level devs have generally figured out.

zahlman · 3 days ago
Not to disagree with Kent Beck's insights on juniors using AI, but the effect of AI on his own writing is palpably negative. His older content is much more enjoyable to read. And so is his recent non-post "activity" on Substack. For example, compare a "note" preceding this article (https://substack.com/@kentbeck/note/c-188541464), on the same topic, to the actual content.
GeoAtreides · 3 days ago
>but because the genie, used well, accelerates learning.

This is "the kids will use the AI to learn and understand" level of cope

no, the kids will copy and paste the solution then go back to their preferred dopamine dispenser

CuriouslyC · 3 days ago
I've learned a lot of shit while getting AI to give me the answers, because I wanted to understand why it did what it did. It saves me a lot of time trying to fix things that would have never worked, so I can just spend time analyzing success.

There might be value in learning from failure, but my guess is that there's more value in learning from success, and if the LLM doesn't need me to succeed my time is better spent pushing into territory where it fails so I can add real value.

switchbak · 3 days ago
Some might (most might?), those aren't the ones we're interested in.

Just as some might pull the answers from the back of the textbook, the interesting ones are the kids who want to find out why certain solutions are the way they are.

Then again I could be wrong, I try hard to stay away from the shithose that is the modern social media tech landscape (TikTok, Insta, and friends) so I'm probably WAY out of touch (and I prefer it that way).

simonw · 3 days ago
Right, and they won't get hired beyond their internship.
ivape · 3 days ago
Don't confuse this with this person's ability to hide their instincts. He is redefining "senior" roles as junior, but words are meaningless in a world of numbers. The $$$ translation is that something that was worth $2 should now be worth $1.

Because that makes the most business sense.

amrocha · 3 days ago
I disagree. In my experience, AI does most of the work and the juniors' already poor skills atrophy. Then a senior engineer has to review AI slop and tell the junior to roll the AI dice again.
snarf21 · 3 days ago
Agreed, this is like AI doing your homework. A select few will use it to learn, but most will copy/pasta, let it create their PR, and slack off the rest of the day. But at least they are "trying", so they won't get fired. And it requires strong senior engineers to walk them through the changes they are trying to check in and understand why they chose them.
thinkingtoilet · 3 days ago
I've seen it go both ways. As usual, a good manager should be able to navigate this.
Yodel0914 · 3 days ago
I’m so sick of getting “but copilot said…” responses on PR comments.
ares623 · 3 days ago
The cynic in me sees it as using juniors as a vehicle for driving up AI metrics. The seniors will be less critical reviewing AI output with a human shield/messenger.
irishcoffee · 3 days ago
The amount of copium in the replies to this is just amazing. It’s amazing.
bgwalter · 3 days ago
How would a person who describes himself as a "full time content producer" know what is actually going on in the industry?

https://substack.com/@kentbeck

What software projects is he actively working on?

psunavy03 · 3 days ago
The dude literally invented Extreme Programming and was the first signer of the Agile Manifesto. He's forgotten more about software development than most people on this site ever knew.
helsinkiandrew · 3 days ago
orliesaurus · 3 days ago
Interesting take... I'm seeing a pattern... People think AI can do it all... BUT I see juniors often are the ones who actually understand AI tools better than seniors... That's what the AWS CEO points out... He said juniors are usually the most experienced with AI tools, so cutting them makes no sense... He also mentioned they are usually the least expensive, so there's little cost saving... AND he warned that without a talent pipeline you break the future of your org... As someone who mentors juniors, I've seen them use AI to accelerate their learning... They ask the right questions, iterate quickly, and share what they find with the rest of us... Seniors rely on old workflows and sometimes struggle to adopt new tools... HOWEVER the AI isn't writing your culture or understanding your product context... You still need people who grow into that... So I'm not worried about AI replacing juniors... I'm more worried about companies killing their own future talent pipeline... Let the genies help, but don't throw away your apprentices.
orliesaurus · 3 days ago
ON TOP OF IT ALL, juniors are the ones who bring novel tools to the table most of the time... e.g. I had no clue the Google IDE gave you free unlimited credits despite the terrible UI... but a young engineer told me about it!!
bluGill · 3 days ago
I've seen both seniors and juniors bring novel tools in. Seniors do it less often, perhaps - but only because we have seen this before under a different name and realize it isn't novel. (Sometimes the time has finally come; sometimes it fails again for the same reason it failed last time.)
al_borland · 3 days ago
I work at a place with lots of rules around what can and can't be used. When someone new starts, we end up spending a lot of time policing to make sure they aren't using stuff they shouldn't be.

A very basic example were the interns who constantly tried to use Google Docs for everything, their personal accounts no less. I had to stop them and point them back to MS Office at least a dozen times.

In other situations, people will try and use free tools that don’t scale well, because that’s what they used in college or as a hobby. It can take a lot of work to point them to the enterprise solution that is already approved and integrated with everything. A basic example of this would be someone using Ansible from their laptop when we have Ansible Automation Platform, which is better optimized for running jobs around the globe and automatically logs to Splunk to create an audit trail.

salawat · 3 days ago
I'm just shocked people aren't clueing into the fact that tech companies are trying to build developer dependence on these things to secure a "rent" revenue stream. But hey, what do I know. It's just cloud hyper scaling all over again. Don't buy and drive your own hardware. Rent ours! Look, we built the metering and everything!
CryptoBanker · 2 days ago
Are you talking about Antigravity, Firebase Studio, or something else?
codegeek · 3 days ago
"BUT I see juniors often are the ones who actually understand AI tools better than seniors"

Sorry, what does that mean exactly ? Are you claiming that a junior dev knows how to ask the right prompts better than a Senior dev ?

__s · 3 days ago
Their implication is that junior devs are more likely to have built up their workflow around AI tooling, likely because, being younger, they've had more plasticity in their process to adopt it.

Overall I don't quite agree. Personally this applies to me: I've been using vim for the last decade, so any AI tooling that wants me to run some Electron app is a non-starter. But many of my senior peers coming from VS Code have no such barriers.

francisofascii · 3 days ago
Speaking as a senior dev, anecdotally juniors may indeed understand AI tools better, because they spend more hours a day coding and working with the tools, and they need the tools to understand the codebase or to be productive. Seniors have more hours stuck in meetings, developing specs/tickets for the juniors, code reviewing, etc. Seniors are likely to not bother with a prompt for simple changes in codebases they already understand.
perfmode · 3 days ago
Some old dogs resist learning new tricks.
bongodongobob · 3 days ago
If AI is just prompts to you, you fall into the "don't know how to use it" group.
9rx · 3 days ago
> He said juniors are usually the most experienced with AI tools, so cutting them makes no sense.

While anyone is free to define words as they so please, most people consider those with the most experience to be seniors. I am pretty sure that has been the message around this all along: Do not cut the seniors. The label you choose isn't significant. Whether you want to call them juniors or seniors, it has always been considered to make no sense to cut those with the most experience.

dragonwriter · 3 days ago
No, he's saying that juniors, while having less experience in development in general, have more experience with AI tools. (This may be true broadly; certainly less experienced devs, IME, generally seem more enthusiastic about adopting and relying heavily on AI tooling.)
whazor · 3 days ago
Amazon has an internal platform for building software. The workflows are documented and have checks and balances. So the CEO wants more junior developers who are proficient with AI, and (proportionally) fewer senior developers. Also, product context comes from (product) managers and UX designers.

For medium or small companies, these guardrails or documentation can be missing. In that case you need experienced people to help out.

yieldcrv · 3 days ago
you're right but my opinion about this has changed

I would have agreed with you 100% one year ago. Basically, senior engineers are too complacent to look at AI tools, as well as ego-driven about them, all while corporate policy disincentivizes them from using anything at all, with maybe a forced Copilot subscription - while junior engineers will take the risk that the corporate monitoring of cloud AI tools isn't that robust.

But now, although many of those organizations are still the same - with more contrived Copilot subscriptions - I think senior engineers are skirting corporate policy too and becoming more familiar with the tools.

I'm also currently in an organization that is a total free for all with as many AI coding and usage tools as necessary to deliver faster. So I could be out of touch already.

Perhaps more complacent firms are the same as they were a year ago.

WestCoader · 3 days ago
Sorry but what the heck is up with all the ellipses in this comment?
Mountain_Skies · 3 days ago
It's a sort of stream of consciousness. That style of writing goes in and out of style from time to time but some people use it consistently.
raincole · 3 days ago
They have an emacs package that triples their . automatically!

debo_ · 3 days ago
They're trying really hard to make sure you know they didn't write their post with an LLM? /s
JKCalhoun · 3 days ago
> I'm seeing a pattern...

Me too. Fire your senior devs. (Ha ha, not ha ha.)

tech_tuna · a day ago
I love that the dominant narrative with modern AI is "figure out who we can fire" pronto. I don't see a clear pattern with juniors and seniors and AI. I know some younger engineers who are not embracing AI tools at all.

I'd say that AI tools make good engineers better and more productive, and make bad engineers appear more productive while ultimately making them shoot themselves in the foot more thoroughly and quickly, piling up more work for everyone else.

Ancalagon · 3 days ago
No no, fire them.

Cannot wait for the 'Oh dear god everything is on fire, where is the senior dev?' return pay packages.

kakacik · 3 days ago
Maybe, but you make it sound like juniors are more worthy to companies than seniors. Then fire most/all of the seniors and good luck with the resulting situation.

Coding in any sufficiently large organization is never the main part of a senior's time, unless it's some code sweatshop. Juniors can do little to none of all the remaining glue work that takes a project from a quick brainstorming meeting to a live, well-functioning, supported product.

So as for worth - companies can, in non-ideal fashion obviously, work without juniors. I can't imagine them working without seniors, unless it's some sort of quick churn of CRUD apps or e-shops built from templates.

Also, there is a topic that has been resonating recently across various research: knowledge gained quickly via LLMs is shallow, doesn't last that long, and doesn't go deep. One example out of many: any time I had to do some more sophisticated regex-based processing, I dived deep into the specs, the implementation, etc., and a few times pushed it to the limits (or realized the task was beyond what regex can do), instead of just being handed the result, copy-pasting it, and moving along because some basic test succeeded. Spread this approach across many other complex topics. That's also a view on the long-term future of companies.
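
To make that concrete, here is a minimal Python sketch of the kind of trap I mean (the function name and inputs are made up purely for illustration): a copy-pasted pattern passes a shallow test, yet silently breaks on nesting, which regular expressions fundamentally cannot track.

    import re

    # Naive snippet: grab whatever sits between a '(' and the next ')'.
    # The kind of thing you paste in because "some basic test succeeded".
    def extract_parenthesized(text):
        return re.findall(r"\(([^)]*)\)", text)

    # The shallow test that passes:
    assert extract_parenthesized("f(a) + g(b)") == ["a", "b"]

    # The case you only anticipate after digging into what regex can and
    # cannot do: nested parentheses. A regular expression cannot count
    # balanced parens, so the pattern stops at the first ')'.
    print(extract_parenthesized("f(g(x), y)"))  # ['g(x'], not ['g(x), y']

Knowing where that cliff is - and when to reach for a real parser instead - is exactly the kind of depth you skip when you just accept whatever the model hands you.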

I get what you say, and I agree partially, but it's a double-edged sword.

lvl155 · 3 days ago
7/10 senior devs (usually fellas 50+) will get mad at you for trying to use Claude Code. Me: “dude it writes better code than crap you write in your mush middle-age brain.” Also me: “I also have mush brain.”

I think LLMs are a reflection of human intelligence. If we humans become dumber as a result of LLMs, LLMs will also become dumber. I'd like to think that in some dystopian world, LLMs trained on pre-2023 data will be sought after.

thunky · 3 days ago
> 7/10 senior devs (usually fellas 50+) will get mad at you for trying to use Claude Code

Ironic because the junior has much more to lose. The 50+ can probably coast across the finish line.

ch2026 · 3 days ago
is this just a janky summary cause you added zero new viewpoints
pnathan · 3 days ago
I - senior - can patch an application in an unknown language and framework with the AI. I know enough to tell it to stop the wildly stupid ideas.

But I don't learn. That's not what I'm trying to do- I'm trying to fix the bug. Hmm.

I'm pretty sure AI is going to lead us to a deskilling crash.

Food for thought.

omnimus · 3 days ago
I think the temptation to use AI is so strong that it will be those who will keep learning who will be valuable in future. Maybe by asking the AI to explain/teach instead of asking for the solution directly. Or by not using AI at all.
thunky · 3 days ago
> I think the temptation to use AI is so strong that it will be those who will keep learning who will be valuable in future.

AI is an excellent teacher for someone that wants to learn.

zahlman · 3 days ago
> But I don't learn. That's not what I'm trying to do- I'm trying to fix the bug. Hmm. I'm pretty sure AI is going to lead us to a deskilling crash.

Nothing is preventing you from studying how the bugfix works once it's in place.

Nor is there any reason this use of AI should cause you to lose skills you already have.

golly_ned · 3 days ago
I haven't seen things work like this in practice, where heavy AI users end up generating a solution and then later grasping it and learning from it with any kind of effectiveness or deep understanding.

It's like reading the solution to a math problem instead of proving it yourself, or reading a summary of a book instead of the book itself. The effort of seeing the design space and choosing a particular solution doesn't exist; you only see the result, not the other ways it could've been. You don't get a feedback loop to learn from either, since that'll be AI-generated too.

It's true there's nothing stopping someone from going back and trying to solve it themselves to get the same kind of learning, but learning the bugfix (or whatever change) by studying it once it's in place just isn't the same.

And things don't work like that in practice, any more than things like "we'll add tests later" end up being followed through with any regularity. If you fix a bug, the next thing for you to do is to fix another bug, build another feature, write another doc, etc., not dwell on work that was already 'done'.

Karliss · 2 days ago
Often it's less about learning from the bugfix itself than from the journey: learning how various pieces of software operate and fit together, and learning the tools you tried for investigating and debugging the problem.
deepspace · 3 days ago
> I'm pretty sure AI is going to lead us to a deskilling crash.

That's my thought too. It's going to be a triple whammy:

1. Most developers (Junior and Senior) will be drawn in by the temptation of "let the AI do the work", leading to less experience in the workforce in the long term.

2. Students will be tempted to use AI to do their homework, resulting in new grads who don't know anything. I have observed this happen first hand.

3. AI-generated (slop) code will start to pollute GitHub and other sources used for future LLM training, resulting in a quality collapse.

I'm hoping that we can avoid the collapse somehow, but I don't see a way to stop it.

pphysch · 3 days ago
On the contrary, being able to access (largely/verifiably) correct solutions to tangible & relevant problems is a great way to learn by example.

It should probably be supplemented with some good old RTFM, but it does get us somewhat beyond the "blind leading the blind" StackOverflow paradigm of most software engineering.

JeremyNT · 3 days ago
I think seniors know enough to tell whether they need to learn or not. At least that's what I tell myself!

The thing with juniors is: those who are interested in how stuff works now have tools to help them learn in ways we never did.

And then it's the same as before: some hires will care and improve, others won't. I'm sure that many juniors will be happy to just churn out slop, but the stars will be motivated on their own to build deeper understanding.

BeFlatXIII · 3 days ago
On the other hand, if it's a one-off, you'll have forgotten what you learned by the time you'd need to use that skill again.
PaulStatezny · 3 days ago
But without AI, there are neural connections formed while determining the correct one-off solution.

The neural connections (or lack of them) have longer term comprehension-building implications.

neilv · 3 days ago
> “Number one, my experience is that many of the most junior folks are actually the most experienced with the AI tools. So they're actually most able to get the most out of them.”

Would that experience be from cheating on their homework? Are you sure that's the skill you want to prioritize?

> “Number two, they're usually the least expensive because they're right out of college, and they generally make less. So if you're thinking about cost optimization, they're not the only people you would want to optimize around.”

Hahaha. Sounds like a threat. Additional context for this: Amazon has a history of stack ranking and per-manager culling quotas, and not as much of a reputation for caring about its employees as Google once had.

> “Three, at some point, that whole thing explodes on itself. If you have no talent pipeline that you're building and no junior people that you're mentoring and bringing up through the company, we often find that that's where we get some of the best ideas.”

I thought the tech industry had given up on training and investing in juniors for the long term, since (the thinking goes) most of them will job-hop in 18 months, no matter how well you nurture them. Instead, most companies are hiring for the near-term productivity they can get, very transactionally.

Does AWS have good long-term retention of software engineers?

rossdavidh · 3 days ago
A big, and little-discussed, problem across many industries is that there is no "pipeline" inside any company. Since the 1980s, the idea that you develop your own talent has fallen by the wayside; you hire it from other companies. In software the issue may be bigger, but it exists in many other industries as well.
neilv · 3 days ago
Does AWS intend to have that pipeline within the company, starting with juniors, like this talk implies?
ygouzerh · 2 days ago
I feel the majority of junior job-hopping is due to the fact that they are often hired at a really low salary and then offered only an incremental raise after two years, whereas if they change companies they get a big jump.

At least, that's what I saw happening here in Hong Kong with the juniors I worked with; not sure about other areas.

byzantinegene · 2 days ago
if it isn't obvious already, his plan is to get other companies to train juniors so AWS can poach them when they become seniors
zelphirkalt · 2 days ago
And only seniors! No measly mid-levels here please!
frostiness · 3 days ago
I can't help but feel this is backpedaling after the AI hype led to people entering university avoiding computer science or those already in changing their major. Ultimately we might end up with a shortage of developers again, which would be amusing.
mjr00 · 3 days ago
I went to university 2005-2008 and I was advised by many people at the time to not go into computer science. The reasoning was that outsourced software developers in low-cost regions like India and SEA would destroy salaries, and software developers should not expect to make more than $50k/year due to the competition.

Even more recently we had this with radiologists, a profession that was supposed to be crushed by deep learning and neural networks. A quick Google search says an average radiologist in the US currently makes between $340,000 and $500,000 per year.

This might be the professional/career version of "buy when there's blood in the streets."

filoleg · 3 days ago
You nailed it on the head, down to the exact examples.

I was still in high school in 2010, and was told the same thing about outsourcing to India/SEA/etc. making a CS degree/career (in the US) a terrible choice. It wasn't just random people saying this either: I was reading about it in the news and online, and heard it from family acquaintances with alleged former software dev careers, etc. I didn't listen, and I am glad I didn't.

As I was graduating from college, and deep learning was becoming the new hot thing, I heard the same thing about radiologists, and how they would all be automated away within the next 5 years. I had no plans to go to med school, and I didn't know anyone at the time who went through it, so I didn't know much about the topic. On the surface, it seemed like a legitimate take, and I just stored it in my head as "sounds about right."

Cut to now: I know more than a few people who went through med school, and am in general more attuned to the market. Turns out, all of that was just another wave of genpop hype. Those news articles about "omg radiologists are all getting replaced by computers" stopped showing up on any of my news feeds, and not a single radiology-specialized med school graduate I know had any issues getting a job (one that paid significantly more than an entry-level position at a FAANG).

I have zero idea what point I was trying to make with this comment, but your examples mirror my personal experience with the topic really well.

avgDev · 3 days ago
I went for CS in my late 20s; I had always tinkered with computers but didn't get into programming earlier. My college advisor told me the same thing, and that he had gone for CS himself and it was worthless. This was 2012.

I had a job lined up before graduating. Now I make a high salary for the area, work remotely 98% of the time, and have a flexible schedule. I'm so glad I didn't listen to that guy.

sublinear · 3 days ago
Yup hearing big talk about competition and doom is a strong signal that there is plenty of demand.

You can either bet on the new unproven thing claiming to change everything overnight, or just do the existing thing that's working right now. Even if the new thing succeeds, an overnight success is even more unrealistic, and the insight you gain in the meantime positions you to take advantage of whatever that change brings. You win either way.

codegeek · 3 days ago
My take is that these are not binary issues. With outsourcing, it is true that you can hire someone cheaper in Asian countries, but it cannot kill all jobs locally. What happens is that the merely average/mediocre get replaced - by outsourcing then, and by AI now - while the top talent can still command a good salary because they are worth it.

So I think that a lot of juniors WILL get replaced by AI, not necessarily because they are junior, but because a lot of them won't be able to add much value compared to a default AI, and companies care about getting the best value from their workers. A junior who understands this and does more than the bare minimum will stand out, while the rest will get replaced.

VirusNewbie · 2 days ago
Haha, I was working in the industry around that time, though quite young and inexperienced, and had someone pull me aside to tell me I needed to get out of coding because the business PM types (like him) wouldn't need "guys like you" soon.

His two points were: one, 'no code' tools (they didn't call it that back then) - the idea that full-on business apps could be created by non-programmers just by tweaking some XML.

And two, he was convinced the rest would be done by cheap Indian and Chinese programmers.

hrimfaxi · 3 days ago
> Even more recently we had this with radiologists, a profession that was supposed to be crushed by deep learning and neural networks. A quick Google search says an average radiologist in the US currently makes between $340,000 and $500,000 per year.

At the end of the day, radiologists are still doctors.

ravenstine · 3 days ago
Can anyone really blame the students? If I were in their shoes, I probably wouldn't bother studying CS right now. From their perspective, it doesn't really matter whether AI is bullshit in any capacity; what matters is whether businesses that are buying the AI hype are going to hire them or not.

Hell, I should probably be studying how to be a carpenter given the level at which companies are pushing vibe coding on their engineers.

bonzini · 3 days ago
Three-four years is a lot of time for these companies to face the harsh reality.
simonw · 3 days ago
"after the AI hype led to people entering university avoiding computer science or those already in changing their major"

That's such a terrible trend.

Reminds me of my peers back in ~2001 who opted not to take a computer science degree even though they loved programming because they thought all the software engineering jobs would be outsourced to countries like India and there wouldn't be any career opportunities for them. A very expensive mistake!

roncesvalles · 3 days ago
Certainly, I even know of experienced devs switching out of tech entirely. I think the next couple of decades are going to be very good for software engineers. There will be an explosion of demand yet a contraction in supply. We are in 2010 again.
DiscourseFan · 3 days ago
There will be programmers of the old ways, but AI is basically code 2.0; there are now a lot of AI-specific things that those with traditional software development skills can't do.
Nextgrid · 3 days ago
It's backpedaling, but I don't think it's planning ahead to prevent a developer shortage - rather, it's pandering to the market's increasing skepticism around AI, and to the fact that the promised moonshot of AI obsoleting all knowledge work didn't actually arrive (at least not in the near future).

It's similar to all those people who were hyping up blockchain/crypto/NFTs/web3 as the future; now that it has all blown over, they've adapted to the next grift (currently it's AI). He is now toning down his messaging in preparation for a cooldown of the AI hype, so as to appear rational and relevant to whatever comes next.

seg_lol · 3 days ago
"We were against this all along"
andrewl-hn · 3 days ago
Perhaps their own hiring pipeline is suffering, too. With most companies out there cutting internships and the hiring of people with no experience "because AI will replace them" for the past 2-3 years, we probably have a large dip in the number of prospective candidates with 2-3 years of experience today.

Historically, these candidates have been the hiring sweet spot: less risky than brand-new engineers, still inexperienced enough to be efficiently molded into your bespoke tools and processes and turned into long-term employees, and still very cheap.

ay · 3 days ago
Reading this article is especially amusing since this bit just hit the news as well:

https://www.business-standard.com/amp/world-news/amazon-euro...

fullshark · 3 days ago
Or maybe they realize the AI needs humans in the loop for the foreseeable future for enterprise use cases and juniors (and people from LCL areas) are cheaper and make the economics make some sort of sense.
burningChrome · 3 days ago
Agreed.

Considering the talk around junior devs lately on HN - that there are way too many of them - it would indeed be amusing.

raincole · 3 days ago
> changing their major

To what?

ok123456 · 3 days ago
So he's saying we should be replacing the seniors with fresh grads who are good at using AI tools? Not a surprising take, given Amazon's turnover rate.

epolanski · 3 days ago
My experience is that juniors have an easier time ramping up, but never get better at proper engineering (analysis) and development processes (debugging). They also struggle to read and review code.

I fear that unless you heavily invest in them and follow them closely, they might be condemned to decades of junior-level experience.

PartiallyTyped · 3 days ago
I have the same experience.

In my view there are two parts to learning, creation and taste, and both need to be balanced to make progress. Creation is, in essence, the process of forming pathways that enable you to do things; developing taste is the process of pruning and refining those pathways so that you do things better.

You can't become a chef without cooking, and you can't become a great one without cultivating a taste (pun intended) for what works and what it means for something to be good.

From interactions with our interns and new grads, they lack the taste, and rely too much on the AI for generation. The consequence is that when you have conversations with them, they struggle to understand the concepts and tools they are using, because they lack the familiarity that comes with creation, and they lack the skills to refine the produced code into something good.

tayo42 · 3 days ago
> but never get better at proper engineering (analysis) and development processes (debug). They also struggle to read and review code.

You could describe pre-AI developers like that too. It's probably my biggest complaint about some of my co-workers.

epolanski · 3 days ago
To some extent you're right, but I'd still say that pre-AI you had, at some point, to write some notes, come up with a plan, and read more code.
veunes · 2 days ago
If a quick start with AI is inevitable, then mentorship and review programs need to be re-evaluated. Seniors shouldn't just check for functionality; they should actively ask juniors to explain why the AI suggested a particular solution, what the alternatives are, and what risks it entails. The focus must shift to understanding, not just generation.