How does one explain the drop starting January 2023 (esp. for things like Customer Service Rep, which is an NLP-heavy task) when most corporations didn't even start LLM/NLP pilots until mid/late 2023? I skimmed through the 100+ page paper but didn't see an explanation for this strange leading effect.
SWE figures dropped mid-2022 (almost magically in line with interest rate hikes) and LLM copilots weren't introduced for another year. The paper notes they did an adjustment for the end of ZIRP. I don't know enough econometrics to understand whether this adjustment was sufficient, but the chart doesn't make sense, since the labor effects seem to be leading the actual technology by over a year. From informal surveys, LLM-copilot usage didn't become widespread until late 2023 to mid-2024, certainly not widespread enough to cause macro labor effects in mid-2022.
The 2022 drop for SWE is easy for me to explain, and it's not on these analysts' list of factors (though I'm not an economic quant, and I don't know how you could really control for it): In 2017, a tax bill was passed that cut a particular tax incentive starting in 2022, in an effort to be counted as "revenue neutral" despite otherwise being a massive tax cut overall. The incentive in question was a write-off for "research and development". This means that in 2022, it got effectively much more expensive to hire anyone who falls under that category, including developers not directly necessary for the day-to-day function of a business (hell, one might argue they would have counted anyway) and scientists of most kinds. That this hit big firms first, which have a higher relative amount of R&D efforts going at a given time, makes a lot of sense.
For customer service, my explanation is that companies literally do not care about customer service. Automated phone trees, outsourced call centers whose reps have no real power to help a customer, and poorly made websites have been frustrating people for decades, but businesses never seem to try to compete on doing better at it. It's a cheap win with investors who want to hear about AI initiatives to lay off yet even more of this department, because it doesn't matter if the quality of service declines; there are no market or regulatory forces punishing this well enough to ever expect firms to stop breaking it, let alone fix it.
Love this note. For those interested, this is Section 174 of the Tax Cuts and Jobs Act (TCJA) of 2017.
For a software engineering business, the Tax Cuts and Jobs Act (TCJA) of 2017 significantly changed how software development costs are expensed under Section 174. Previously, research and experimental expenditures, including software development, could be deducted immediately. Starting with tax years beginning in 2022, the TCJA requires them to be capitalized and amortized over five years (fifteen for foreign research).
Under the amended Section 174, a company cannot immediately deduct the cost of software it develops for itself. This includes the direct costs for the engineers, programmers, and other personnel involved in the development process.
The report not addressing this elephant in the room is disappointing.
I was working in Europe for a big American company, which will remain nameless, and they started shutting down most, if not all, of their European operations.
A change in the US tax code made software development amortize over 5 years in the US and over 15 years overseas. It was later changed back to instant deduction in the US, but still 15 years for overseas. It no longer makes sense to outsource software development in many cases.
> It's a cheap win with investors who want to hear about AI initiatives to lay off yet even more of this department, because it doesn't matter if the quality of service declines, there are no market or regulatory forces that are punishing this well enough to ever expect firms to stop breaking it, let alone fix it
There's also some argument that, if people cannot get customer service to "help", they stop asking for help, driving that cost down.
And not having to remedy issues in the product = no repair/replace cost
And people are then left with only a few options, one of which... buy a replacement... which in a restricted market is a WIN because more money coming in...
And something to note: this cut has been reinstated as part of the Big Beautiful Bill, which has passed. I think the drop in jobs between now and a year from now can at least be separated as AI vs. just interest rates. There are fewer confounding variables.
I've heard this complaint/observation many times and I just don't buy it. For one thing, particularly for large companies, the deduction smooths out. Yes, you can only deduct 20% of the costs this year, but you're also deducting 20% from the previous year, 20% from the year before that, and so on.
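The smoothing argument can be put in numbers. A minimal sketch under a simplified flat 20%/year schedule (the actual Section 174 rules use a mid-year convention, so year one is only 10%): with constant spend, deductions ramp up and, by year five, equal the full annual spend again.

```python
# Simplified 20%/year amortization of a constant annual engineering spend.
spend = 100.0
deductions = []
for year in range(8):
    # Each year deducts 20% of this year's spend plus 20% of each of the
    # previous four years' spend that is still amortizing.
    active_years = min(year + 1, 5)
    deductions.append(0.20 * spend * active_years)
print(deductions)  # ramps 20, 40, 60, 80, then holds at 100 = full annual spend
```

So the cash-flow pain is concentrated in the transition years, which is exactly when the 2022 layoffs happened.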
Also, the 2017 tax cuts and the recent bill have provided substantial tax cuts to these corporations too.
Usually this subject comes up where people (at least on HN) are telling people to mail their Congresspeople and Senators to get a bill passed to "fix" this and my question is always this:
"What tax cuts are you going to give back to pay for this?"
If we want to end this ridiculous IP transfer to Ireland and the royalty payments that offshore profits to avoid taxes at the same time, I'm 100% on board with fixing the deductibility of engineering salaries.
You absolutely misunderstand Section 174 and you are spreading misinformation.
The only companies this affected are those right at the margins of becoming profitable. It doesn't affect new startups and it doesn't affect established businesses. And if you are at the margins of becoming profitable you have likely accumulated more than enough tax credits for all your losses.
The changes to Section 174 are not the explanation for why software engineering jobs were lost in 2022. They were lost because every company overhired from 2020-2022 and had to absorb it given the drop in activity once the pandemic was over.
I do consulting, and I'm constantly scouting clients. Right around November 2022 something very stark happened. I went from fighting off prospects with a stick to crickets, almost overnight. I deal mostly with startups and mid-size companies, nobody with insider knowledge or cutting-edge interests. I can tell you that GPT was not heavily on the radar of anyone I dealt with as an opportunity to reduce costs.
Some sort of cultural zeitgeist occurred, but in terms of symptoms I saw with my own eyes, I think ZIRP ending (projects getting axed) and layoffs starting (projects getting filled within ~24 hours) were huge drivers. I have no proof.
I had the same thoughts, there are clearly indicators that the weakness in the labor market started happening before LLMs and AI took over popular discourse.
All the more reason to believe that while correlated, LLMs are certainly not the largest contributor, or even the cause of the job market weakness for young people. The more likely and simple explanation is that there are cracks forming in the economy not just in the US but globally; youth employment is struggling virtually everywhere. Can only speculate on the reasons, but delayed effects from questionable monetary and fiscal policy choices, increasing wealth gaps, tariffs, geopolitics, etc. have certainly not helped.
>The paper notes they did an adjustment for the end of ZIRP. I don't know enough econometrics to understand whether this adjustment was sufficient
Looking at the paper [0], they attempted to do it by regressing the number of jobs y_{c,q,t} at company c, time t, and "AI exposure quintile" q, with separate parameters jointly controlling for company/quintile (a), company/time (b) and quintile/time (g). This is in Equation 4.1, page 15, which I have simplified here:
log(y_{c,q,t}) ~ a_{c,q} + b_{c,t} + g_{q,t}
Any time-dependent effects (e.g. end of ZIRP/Section 174) that would equally affect all jobs at the company irrespective of how much AI exposure they have should be absorbed into b.
They normalized g with respect to October 2022 and quintile 1 (least AI exposure), and plotted the results for each age group and quintile (Figure 9, page 20). There is a pronounced decline that only starts in mid-2024 for quintiles 3, 4, and 5 in the youngest age group. The plots shown in the article are misleading, and are likely primarily a reflection of ZIRP, as you say. The real meat of the paper is Figure 9.
A potential flaw of this method is that ZIRP/Section 174 may have disproportionately affected junior positions with high AI exposure, e.g. software engineers. This would not be accounted for in b and would thus be reflected in g. It would be interesting to repeat this analysis excluding software engineers and other employees subject to Section 174.
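The claim that company-wide shocks are absorbed can be checked numerically. A minimal sketch (not the paper's actual estimation code): simulate log employment as a_{c,q} + b_{c,t} with a company-wide shock in b and zero true AI effect g, then take the double difference relative to quintile 1 and a baseline period, which by construction cancels both a and b.

```python
import numpy as np

rng = np.random.default_rng(0)
C, Q, T = 5, 3, 8                   # companies, AI-exposure quintiles, time periods
a = rng.normal(size=(C, Q, 1))      # company x quintile effects
b = rng.normal(size=(C, 1, T))      # company x time effects
b[:, :, 4:] -= 1.0                  # company-wide shock (e.g. end of ZIRP) hits ALL quintiles
log_y = a + b                       # true quintile x time effect g is identically zero

# Double difference: relative to quintile 1 (index 0) and baseline period t0.
t0 = 0
dd = (log_y - log_y[:, :1, :]) - (log_y[:, :, t0:t0 + 1] - log_y[:, :1, t0:t0 + 1])
print(np.abs(dd).max())             # ~0: the shock never shows up as an "AI effect"
```

Algebraically, dd = (a_{c,q} - a_{c,1}) - (a_{c,q} - a_{c,1}) = 0, since b_{c,t} cancels within each time period. If instead the shock were loaded onto specific quintiles (the flaw described above), dd would be nonzero and would masquerade as g.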
Yeah my company started stepping up outsourcing in 2023. We also started some AI projects. The AI projects haven't made much progress but the outsourcing is at an extremely advanced stage.
I personally sat in meetings in 2022 where we adjusted staffing projections in anticipation of AI efficiency. Sure, some of it was "overhiring," but the reality was that those staffing goals were pre-AI. Once they were updated, that's when the layoffs started, because management didn't want anyone who didn't have an AI or big-data background.
Gen-AI was still extremely niche in 2022; ChatGPT didn't come out until the end of the year, on 30 November, and it was pretty much just a toy curiosity until GPT-4 came out in March 2023. I am very surprised that leadership at your company was seriously discussing the business impact of AI that early on.
In addition to this detail, I might add that I can't remember the last time I had a customer service call that took place with someone stateside. It's easy to point to AI when offshoring driven by interest rates is really the reason.
I think it's lipstick on a pig. We've seen tech companies collude before, and I'm guessing they're doing it again, trying to drive down the price of talent and make their employees less demanding.
I remember years of a no-backfill policy at Devon on promises of automation. Since 2017 at least. The "desirable job market" for young people has been challenging since well before LLMs became popular. Want a dead-end entry-level job in food service earning $20/hour? No problem.
The economy has been creating all the jobs ever since hunting and gathering. The buggy-whip jobs did eventually dry up, but the economy continues to create other jobs, paid for by ever-increasing surpluses.
> SWE figures dropped mid-2022 (almost magically in line with interest rate hikes) and LLM-copilots werent introduced for another year
It was pretty clear by late 2022 that AI assisted coding was going to transform how software development was done. I remember having conversations with colleagues at that time about how SWE might transform into an architecture and systems design role, with transformer models filling in implementations.
If it was clear to workers like us, it was pretty clear to the c-suite. Not that it was the only reason for mass layoffs, but it was a strong contributor to the rationale.
Many large companies were placing a bet that there were turbulent times ahead, and were lightening their load preemptively.
AI is barely a blip on why the job market is dead for entry level, and dying all the way up the ladder.
Every one of my engineer friends says the same thing: "My team is 80% Indians", and more than half are not qualified for the job they have.
The whole thing is a fucking scam for them, every company, top to bottom. Recruiters, hiring managers, referrals, CEOs. All with one thing in common.
I'll take my downvotes, I don't care, everyone here knows I'm right. And those with their head up their ass can enjoy getting replaced and spending years looking for another role.
It is possible that multiple trends are coalescing:
1. Layoffs after the web3 hiring spree
2. End of ZIRP
However, I think that now, in 2025, it is impossible to reasonably claim AI isn't making an impact on hiring. Those who disagree on here seem insistent on some notion that AI has no benefits whatsoever, and thus could never cause job loss.
Completely agree with you.
You can't be an economist and criticize current monetary policy. You will be labeled a crackpot, ostracized, and have difficulty with grants. Grants from whom? The same institution you'll be criticizing in the first place.
Exactly this. I've said it before and will say it again: new technologies emerge in response to trends, often to accelerate existing trends, and do not create them.
I see a few explanations for what you're saying, and those might be true, but I strongly believe part of it is that investment (particularly VC, less so PE) has hit diminishing returns in tech, which means less subsidized "disruption", which means less money to hire people. AI becoming hugely popular right when this was happening is not a coincidence. And it's not just startups; less investment in startups also means fewer clients for AWS and Azure. A16Z / Sand Hill switching to AI is not them just chasing the latest trend; it's a bid to reduce spending on people, the most expensive part of a tech company, as the only way to extend their unicorn-focused investment strategy.
This reminds me of the part of The Book of Why by Judea Pearl discussing how the do-calculus and the causal revolution came about from the simple insight that causes come before effects; the do-calculus was invented to keep track of that in the math, rather than obscuring it with statistical relations that work in either direction.
For SWE: I know of some workflow digitalisation projects that had been in the planning for a very long time, budgeted as multi-person-year efforts, that out of pandemic necessity were executed by a team of 3 over a long weekend in 2020. This did not go unnoticed, by customers or by SWE providers.
GPT-3 came out in 2020; even if the technology wasn't mature, the hype was there, informing investment and hiring decisions.
There were also other factors: covid booms, covid busts, overcorrections. Elon showed you can cut by 90% and still keep a product running (kind of), and with X taking the flak, other people followed suit without being as loud. There is a fairly major war in Europe ....
The drop started from mid-2022 and 2023 and there is a single cause: the Russian assets freeze. This led to governments around the world moving their assets out of West/Anglo-Saxon countries. This lack of liquidity put the West in a "hang in there" situation. Economically, it showed up as rising interest rates; politically, where things move slower, as the emergence of a new coalition.
It’s really as simple as that. But people would like to believe that West GDP is higher than global south GDP by xxx amounts and so all of this couldn’t be possible.
If you want an insight into their heads, there is a Biden speech after the assets freeze where he declares that the Russian economy/country will collapse in a few weeks under the measures. None of this materialized, and their bet failed, which is why Trump is trying to pull the US out of the mess.
Of course all of this is my personal opinion. So take it from the grain in my bag of salt.
I made a stupid simple model where hiring in all age brackets rose slowly until 2021 and then fell slowly. That produces very similar looking graphs, because the many engineers that were hired at the peak move up the demographic curve over time. Normalizing the graph to 2022 levels, as the paper seems to do, hides the fact that the actual hiring ratios didn't change at all.
I'm not sure I understand. Your model shows that different group buckets (eg 20-24yo vs 25-29yo) peak at different years (in your figure, 2022 vs 2024) despite being driven by the same dynamics. Is that expected? I (naively?) expected the same groups to rise, fall and have peaks at the same times.
One of the dynamics is that people get older so they move into different buckets.
We can make the model way simpler to make it clearer. Say in 2020 we hired 1000 20-24yo, 1000 25-29yo etc and then we didn't hire anyone since then. That was five years ago, so now we have 0 20-24yo, 1000 25-29yo, 1000 30-34yo etc and 1000 retirees who don't show up in the graph.
Each individual year we hired the exact same number of people in each age bracket, and yet we still end up with fewer young people total whenever hiring goes down, because all the people that got hired during the big hiring spike are now older.
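The toy model above can be run directly. A minimal sketch, assuming 200 hires per single year of age (1000 per 5-year bracket) in 2020 and zero hiring afterward:

```python
# Headcount by exact age, 20 through 64; 200 per age = 1000 per 5-year bracket.
head = [200.0] * 45

for _ in range(5):                  # 2021-2025: nobody is hired
    # Everyone ages one year; 64-year-olds retire off the end; no new 20-year-olds.
    head = [0.0] + head[:-1]

def bracket(lo):
    return sum(head[lo - 20 : lo - 20 + 5])

print(bracket(20), bracket(25), bracket(30))   # 0.0 1000.0 1000.0
```

Identical hiring in every bracket, yet after the hiring stop the youngest bracket empties first while the older brackets stay full: exactly the cohort-aging artifact described.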
Wow, that's hilarious. So essentially hiring could be identical across all age groups, but due to a glitch in the analysis (young people don't stay young, who knew?), it appears that younger people are losing jobs more than the rest.
I think not hiring juniors is a tragedy of the commons situation. It started before the AI boom, during COVID. It's not tax-related as people claim here, since this phenomenon is not US-only.
The ZIRP era made companies hire people as if there was no tomorrow, and companies started "poaching" engineers from others, including juniors. I saw some interns with 2 years of experience getting offers as seniors. I had friends being paid to attend boot camp.
Then everyone realized they were training junior engineers who would quickly get offers from other companies as "Senior" and leave. So companies stopped hiring them.
We need to reframe it. At this point, what we call "AI" is not a technology, but a subscription company.
A technology is a tool you can adopt in your toolchain to perform a task, even if in this case it's outsourcing cognitive load. For a subscription company, well, as long as the subscription is active, you get to outsource some of the cognitive load. When Anthropic's CEO says that white-collar jobs will disappear, he means that he is selling Enterprise subscriptions, and that companies will inevitably buy them.
It's only a matter of time before AI becomes a commodity. The open-source models are just a generation behind the proprietary ones, and they're almost good enough for most users. Even if we're somehow limited to using proprietary models, all of these "AIs" are based on natural language, so you can practically hot-swap the LLMs. Any company banking on selling any service other than AI hosting is going to be incredibly disappointed.
The only thing that could stop this commodification is some sort of vendor lock-in, but that looks to be technically challenging.
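Because the interface is just text in, text out, swapping providers really can be a one-line change. A toy sketch with hypothetical stand-in vendor classes (not any real SDK):

```python
from typing import Protocol

class ChatModel(Protocol):
    """Anything with a text-in, text-out completion method."""
    def complete(self, prompt: str) -> str: ...

class VendorA:                       # stand-in for one provider's client
    def complete(self, prompt: str) -> str:
        return f"A: {prompt}"

class VendorB:                       # stand-in for a competitor's client
    def complete(self, prompt: str) -> str:
        return f"B: {prompt}"

def summarize(model: ChatModel, text: str) -> str:
    # Application code depends only on the natural-language interface.
    return model.complete(f"Summarize: {text}")

print(summarize(VendorA(), "the quarterly report"))
print(summarize(VendorB(), "the quarterly report"))  # same call site, different backend
```

Nothing in the application layer names a vendor, which is what makes lock-in so hard to engineer.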
Windows gaming finally falling to desktop Linux, in terms of tech and mindshare among enthusiasts, proves to me that, given a sufficiently large timescale, open solutions tend to always win against proprietary ones.
Companies just can't seem to stay focused on their own core competencies after a few decades. On the other hand, even when open projects "die" they can be resurrected ad infinitum.
Just my gut feeling but it seems like closed wins out big in the short term, but open wins in the end. Not applicable to everything I know.
Any economic data from between 2020 and 2025 should be tossed in the garbage. We will have no idea what effect AI has or hasn't had until AI has been available outside of the extremely confounded current circumstances. Tell me how employment looks after the next recession, when the after-effects of the pandemic, rapid inflation, interest rate unpredictability, and tariff whiplash are hopefully all behind us.
And the data for the two decades before that should be tossed out because of the global housing crisis, the sovereign debt crisis, and all the reverberations from that.
And the data for the two decades before that obviously needs to be tossed out because of the one-off nature of the dot-com boom, the dot-com crash, 9/11, and so on.
And before that... point is, you don't get clean data in economics. There's always something big going on, there are no double-blind trials to run, etc. It's called the dismal science for a reason. But that doesn't make it useless.
If your only dataset is 2008 to 2010, yeah your conclusions are almost certainly garbage. Likewise if your dataset is 1999 to 2002, or just the year 1987.
There will always be something going on, but there won't always be this specific thing going on, and if 100% of your data comes from during one specific crisis (let alone multiple concurrent crises), then it is 100% useless. Even with longer periods of data, reconciliation with longer trends is technically challenging. We've had enough recessions to correct for them in our data, but until we've had a half dozen global pandemics, you can't use pandemic data in your analysis.
Lots of apparently unexplored alternative explanations. In times of uncertainty, you don't hire unless you need to. Another junior developer or customer service agent can be delayed, and youngsters in the labor force are most exposed to this. But if you need a home health aide, you probably don't have a lot of choice: somebody has to change grandma's diapers. Tariffs top my list of uncertainties for businesses, but interest rates are a close second.
"AI" dives in and disrupts and then it turns out that AI isn't too I. The disrupt phase where HR dumps staff based on dubious promises and directions from above takes a few months. The gradual re-hiring takes way longer than the dumping phase and will not trigger thresholds.
I've spent quite a while with "AI". LLMs do have a use but dumping staff is not one of the best ideas I've seen. I get that a management team are looking for trimmings but AI isn't the I they are looking for.
In my opinion (MD of a small IT focused company) LLMs are a better slide rule. I have several slide rules and calculators and obviously a shit load of computers. Mind you my slide rules can't access the internet, on the other hand my slide rules always work, without internets or power.
I started university (in Australia) in 2004, not long after the dot com crash. CS enrolment rates were low, kids were getting scared off due to perceived lack of jobs. As a result, there was a shortage of grad talent (companies were already ramping up hiring again by 2004). I got a grad job just fine, in 2008, and I've never been short of work since.
So my advice to high school kids of 2025: right now is the perfect time to enrol in CS. 5 years from now, the AI hype will be over, and employers will be short on grads.
> 5 years from now, the AI hype will be over, and employers will be short on grads
Alternative view: the AI hype is real, AI takes over, and no one has any jobs anyway.
Also a thought: in 5 years the boomers will be retiring in droves, as will the first wave of Gen X, and the market, in most fields, should be opening up anyway.
Experience. You learn to ignore what people _believe_ it does and what they _say_ does and look at what it _actually_ does.
It's fun to play with, but LLMs can't reason and are fundamentally unreliable. They cannot be made reliable.
The actual market uses for large quantities of human-like text are, like, autocomplete and spam.
I use copilot as fancy autocomplete and like it, but ~all claims that it will replace SWEs are by people selling AI or people who fundamentally do not understand what SWEs actually do.
It'll probably replace some offshoring, ironically lol
This time is different. A fact right now is that software engineers can orchestrate LLMs and agents to write software. The role of software engineers who do this is quality control, compliance, software architecture, and some out-of-the-box thinking for when LLMs do not cut it. What makes you think advances in AI won't take care of the tasks that LLMs do not do well currently? My point is that once these tasks are taken care of, a CS graduate won't be doing the tasks that they learnt to do in their degrees. What people need to learn is how to think of customers' needs in abstract ways, communicate this to AI, and judge the output the way someone judges a painting.
> CS graduate won't be doing tasks that they learnt to do in their degrees
How is that different from the previous decade(s)? How often do you invert a red-black tree in your daily programming/engineering job?
A CS degree is a degree for thinking computationally, using mathematics as a basis. It's got some science too (i.e., use evidence and falsifiability to work out truths and not rely on pure intuition). It's got some critical thinking attached, if your university is any good at making undergraduate courses.
A CS degree is not a boot camp, nor is it meant to make you ready for a job. While I did learn how to use git at uni, it was never required nor asked; it was purely my own curiosity, which a CS degree is meant to foster.
The Twitter layoffs perhaps?
Oct 2022 recorded the S&P 500's lowest point since COVID (till now).
COVID assistance was over. Vaccination reached a critical majority. In Sep 2022, Biden declared the "COVID-19 pandemic was over" [1].
Businesses got a reality check.
1: https://en.wikipedia.org/wiki/COVID-19_pandemic_in_the_Unite...
Interesting point. With Baby Boomers retiring everywhere and fertility falling everywhere, one would expect fierce competition for young workers.
[0] https://digitaleconomy.stanford.edu/wp-content/uploads/2025/...
American Express
And a large bank headquartered in Virginia
I think USAA but that was two years ago
I called bank of america’s credit card line and asked for a support agent. A friendly lady answered, she had a southern accent. :)
the economy actually creates all the jobs ever since hunt and gather. the buggy whip jobs did eventually dry up, but the economy continues to create other jobs, paid for by ever increasing surpluses.
It was pretty clear by late 2022 that AI assisted coding was going to transform how software development was done. I remember having conversations with colleagues at that time about how SWE might transform into an architecture and systems design role, with transformer models filling in implementations.
If it was clear to workers like us, it was pretty clear to the c-suite. Not that it was the only reason for mass layoffs, but it was a strong contributor to the rationale.
Many large companies were placing a bet that there were turbulent times ahead, and were lightening their load preemptively.
Every one of my engineer friends says the same thing: "My team is 80% Indians, and more than half are not qualified for the job they have."
The whole thing is a fucking scam for them, every company, top to bottom: recruiters, hiring managers, referrals, CEOs. All with one thing in common.
I'll take my downvotes, I don't care, everyone here knows I'm right. And those with their head up their ass can enjoy getting replaced and spending years looking for another role.
1. Layoffs after the web3 hiring spree
2. End of ZIRP
However, I think that now, in 2025, it is impossible to reasonably claim AI isn't making an impact on hiring. Those who disagree on here seem insistent on the notion that AI has no benefits whatsoever and thus could never cause job loss.
I sense some conflation of causation/correlation at hand.
I see a few explanations for what you're saying, and those might be true, but I strongly believe part of it is that investment (particularly VC, less so PE) has hit diminishing returns in tech, which means less subsidized "disruption", which means less money to hire people. AI becoming hugely popular right when this was happening is not a coincidence. And it's not just startups: less investment in startups also means fewer clients for AWS and Azure. A16Z / Sand Hill switching to AI is not them just chasing the latest trend; it's a bid to cut spending on people, the most expensive part of a tech company, as the only way to extend their unicorn-focused investment strategy.
They just supply what people want and follow the trends.
My understanding is the same thing recently happened to pharmacists.
There were also other factors: COVID booms, COVID busts, overcorrections. Elon showed you can cut by 90% and still keep a product running (kind of), and with X taking the flak, other people followed suit without being as loud. There is a fairly major war in Europe...
It's really as simple as that. But people would like to believe that Western GDP is higher than global-south GDP by xxx amounts, and so all of this couldn't be possible.
If you want an insight into their heads, there is a Biden speech after the asset freeze where he declares that the Russian economy/country will collapse in a few weeks under the measures. None of this materialized, and their bet has failed, which is why Trump is trying to pull the US out of the mess.
Of course all of this is my personal opinion. So take it from the grain in my bag of salt.
https://docs.google.com/spreadsheets/d/1z0l0rNebCTVWLk77_7HA...
We can make the model way simpler to make it clearer. Say in 2020 we hired 1000 20-24yo, 1000 25-29yo etc and then we didn't hire anyone since then. That was five years ago, so now we have 0 20-24yo, 1000 25-29yo, 1000 30-34yo etc and 1000 retirees who don't show up in the graph.
Each individual year we hired the exact same number of people in each age bracket, and yet we still end up with fewer young people total whenever hiring goes down, because all the people that got hired during the big hiring spike are now older.
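The mechanics can be sketched with a toy cohort simulation (the numbers are made up; brackets are treated as five-year pools with 1/5 of each aging up per year, a smoothed version of the example above):

```python
def age_cohorts(headcount, hires_per_year):
    """Advance 5-year age brackets one year at a time: each year 1/5 of a
    bracket ages into the next one; new hires enter the youngest bracket."""
    hc = list(headcount)
    for hires in hires_per_year:
        moved = [h / 5 for h in hc]  # fraction aging up (oldest ages out entirely)
        hc = [hc[i] - moved[i] + (moved[i - 1] if i else hires)
              for i in range(len(hc))]
    return hc

# Steady state of equal hiring, then a five-year freeze:
hc = age_cohorts([1000, 1000, 1000], [0] * 5)
print([round(x) for x in hc])  # [328, 737, 942] -- youngest bracket drains fastest
```

Even though nobody in any bracket was singled out, the youngest bracket shrinks by two thirds while the oldest barely moves, purely because it is the only bracket with no inflow.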
The ZIRP era made companies hire people as if there was no tomorrow, and companies started "poaching" engineers from others, including juniors. I saw some interns with 2 years of experience getting offers as seniors. I had friends being paid to attend boot camp.
Then everyone realized they were training junior engineers who would quickly get offers from other companies as "Senior" and leave. So companies stopped hiring them.
A technology is a tool you can adopt in your toolchain to perform a task, even if in this case the task is outsourcing cognitive load. For a subscription company, well, as long as the subscription is active, you get to outsource some of the cognitive load. When Anthropic's CEO says that white-collar jobs will disappear, he means that he is selling Enterprise subscriptions, and that companies will inevitably buy them.
The only thing that could stop this commodification is some sort of vendor lock-in, but that looks to be technically challenging.
Companies just can't seem to stay focused on their own core competencies after a few decades. On the other hand, even when open projects "die" they can be resurrected ad infinitum.
Just my gut feeling, but it seems like closed wins out big in the short term, while open wins in the end. Not applicable to everything, I know.
And the data for the two decades before that obviously needs to be tossed out because of the one-off nature of the dot-com boom, the dot-com crash, 9/11, and so on.
And before that... point is, you don't get clean data in economics. There's always something big going on, there are no double-blind trials to run, etc. It's called the dismal science for a reason. But that doesn't make it useless.
There will always be something going on, but there won't always be this specific thing going on, and if 100% of your data comes from during one specific crisis (let alone multiple concurrent crises), then it is 100% useless. Even with longer periods of data, reconciliation with longer-term trends is technically challenging. We've had enough recessions to correct for them in our data, but until we've had a half dozen global pandemics you can't use pandemic data in your analysis.
"AI" dives in and disrupts, and then it turns out that AI isn't too I. The disruption phase, where HR dumps staff based on dubious promises and directions from above, takes a few months. The gradual re-hiring takes way longer than the dumping phase and will not trigger thresholds.
I've spent quite a while with "AI". LLMs do have a use, but dumping staff is not one of the best ideas I've seen. I get that management teams are looking for trimmings, but AI isn't the I they are looking for.
In my opinion (MD of a small IT focused company) LLMs are a better slide rule. I have several slide rules and calculators and obviously a shit load of computers. Mind you my slide rules can't access the internet, on the other hand my slide rules always work, without internets or power.
So my advice to high school kids of 2025: right now is the perfect time to enrol in CS. 5 years from now, the AI hype will be over, and employers will be short on grads.
Alternative view: the AI hype is real, AI takes over, and no one has any jobs anyway.
Also a thought: in 5 years the boomers will be retiring in droves, as will the first wave of Gen X, and the market, in most fields, should be opening up anyway.
It's fun to play with, but LLMs can't reason and are fundamentally unreliable. They cannot be made reliable.
The actual market uses for large quantities of human-like text are, like, autocomplete and spam.
I use copilot as fancy autocomplete and like it, but ~all claims that it will replace SWEs are by people selling AI or people who fundamentally do not understand what SWEs actually do.
It'll probably replace some offshoring, ironically lol
How is that different from the previous decade(s)? How often do you invert a red-black tree in your daily programming/engineering job?
A CS degree is a degree for thinking computationally, using mathematics as a basis. It's got some science too (aka, use evidence and falsifiability to work out truths and not rely on pure intuition). It's got some critical thinking attached, if your university is any good at making undergraduate courses.
A CS degree is not a boot camp, nor is it meant to make you ready for a job. While I did learn how to use git at uni, it was never required nor asked for; it was purely my own curiosity, which a CS degree is meant to foster.