Readit News logoReadit News
bachmeier · 6 months ago
Okay. It also coincides with the end of the post-pandemic hiring boom and the UK bank rate going from 0.1% to 5.25%. It's kind of funny that reliable data analysis has never been part of the AI hype when you consider that AI is used for data analysis.
heresie-dabord · 6 months ago
There are complex economic shifts happening but LLMs ("AI") have little practical to do with it.

Stupendous loads of money have been allocated to a solution looking for a problem to solve.

https://www.gartner.com/en/newsroom/press-releases/2025-06-2...

causal · 6 months ago
That reallocation of capital is also a culprit
bunderbunder · 6 months ago
> It's kind of funny that reliable data analysis has never been part of the AI hype when you consider that AI is used for data analysis.

If you've ever tried to use AI to help with this kind of analysis, you might find this to be more inevitable than it is funny.

It's really, really, really good at confidently jumping to hasty conclusions and confirmation bias. Which perhaps shouldn't be surprising when you consider that it was largely trained on the Internet's proverbial global comments section.

banannaise · 6 months ago
I presume when they say "AI is used for data analysis" they're talking about traditional AI (more frequently referred to as "machine learning") rather than generative AI (LLMs).
0x20cowboy · 6 months ago
> It's really, really, really good at confidently jumping to hasty conclusions and confirmation bias.

Kind of like entry level software engineers.

I am kidding, I believe the market has more to do with tax changes than AI. I just couldn't pass up the joke.

orochimaaru · 6 months ago
In the US I think it’s may be driven more by r&d cost amortization changes since 2023. It’s attributed to AI but I believe tax implications are to blame as well apart from interest rates and the covid hiring.
klipklop · 6 months ago
Hopefully that will be fixed this year, or tech layoffs and outsourcing in the US will pick up pace. Without the R&D write off, each dev in the US is a massive financial black hole vs hiring outside the US.
eru · 6 months ago
> It also coincides with the end of the post-pandemic hiring boom and the UK bank rate going from 0.1% to 5.25%.

I agree that the former is a strong signal. However the latter doesn't tell you anything without further context: did interest rates go up, because the economy was strong, or did rising interest rates dampen the economy?

(It's similar to how you can't tell how hot it is in my apartment, purely from looking at my heating bills: does a low heating bill mean that it's cold in my flat, because I'm too cheap too heat? Or does a low heating bill mean it's summer and really hot anyway?)

efficax · 6 months ago
interest rates are controlled by central bankers, not magic. they make decisions based on their analysis of the economy. they raised rates to slow down the rate of investment and to suppress wages, in order to get inflation under control. Less money in circulation means reduced demand means prices stay lower, meaning lower inflation. that's the theory anyway, and the explictly expressed reason for raising rates by central banks. there's no mystery about it.
HDThoreaun · 6 months ago
> did interest rates go up, because the economy was strong, or did rising interest rates dampen the economy?

It doesnt matter. Whether it went from strong -> weak or weak -> weaker is beside the point, the question is if genAI is the main reason for entry level job loss and raising interest rates are another possible answer.

captainbland · 6 months ago
In this case it was widely publicised that interest rates went up to try to bring inflation down (which was significantly above the 2% target).

Growth was weak to unremarkable although the hiring market was good for job seekers at the time shortly before the interest rises were introduced.

x0x0 · 6 months ago
Also, it mentions apprenticeships have also declined by 30%. Assuming that means trades, you either should be pretty skeptical that LLMs are causing this, or at minimum be proposing some method by which an LLM reduces demand for plumbers and cabinet makers and electricians and so forth.
lazide · 6 months ago
‘AI’ is terrible for accurate data analysis, so this isn’t surprising at all.
alpineman · 6 months ago
and an increase in employer taxes for each employee introduced in the UK this year
ninetyninenine · 6 months ago
Your comment also lacks analysis. This is an observational study there is no way to pinpoint causation.

Yeah it can correlate with the end of a post pandemic hiring boom, and it can correlate with the bank rate. But no matter what it also correlates with the rise of AI.

All are true and causation cannot be established for any of the 3 through just an observational study.

Barrin92 · 6 months ago
>This is an observational study there is no way to pinpoint causation.

Given that AI tools are only really used for white collar work, but white collar professions have not been declining faster than entry level jobs in hospitality, vocational jobs, nursing or transportation (all of which are down), this gives you a pretty decent natural control group.

The whole debate about bifurcation of the labour market, that entry level coders are having a harder time than they used to, precedes even the pandemic or recent economic woes.

xivzgrev · 6 months ago
Yes, but overall job ads are up. Pay is going up.

But specifically entry level is down significantly since Nov 2022.

All of your points - interest rates, post pandemic hiring boom would apply to market as a whole.

Not saying it’s causation like the article claims, but there’s at least some correlation trend.

harvey9 · 6 months ago
Job ads complicated further by firms posting fictional jobs to test the market or as a misleading market signal.
madaxe_again · 6 months ago
An awful lot of graduate positions in the U.K. are things like customer service, account management, paralegal, data analysis.

These categories have seen broad application of AI tools:

- CS, you’ll most likely talk to an LLM for first tier support these days.

- Account management comprises pressing the flesh (human required) and responding to emails - the latter, AMs have seen their workload slashed, so it stands to reason that fewer are required.

- Paralegal - the category has been demolished. Drafting and discovery are now largely automated processes.

- Data analysis - why have a monkey in a suit write you barely useful nonsense when a machine can do the same?

So - yeah, it’s purely correlative right now, but I can see how it being causative is perfectly plausible.

octo888 · 6 months ago
You can't trust job ads at all
esafak · 6 months ago
Teasing that apart is what causal inference is for. Wait for an econometrics paper.
InkCanon · 6 months ago
These jobs are being offshored to India. You can tell by how they're massively hiring there.

Google launches largest office in India https://www.entrepreneur.com/en-in/news-and-trends/google-la...

Microsoft India head says no layoffs in India https://timesofindia.indiatimes.com/technology/tech-news/mic...

bgwalter · 6 months ago
This must be the "America first!" policy we keep hearing about. It is also strange that no one mentions losing the CS race to India (compare with the fake "losing the AI race to China" argument).

So, the Indian CEOs of Google and Microsoft perform their duty and turn the companies into boring has-been companies like IBM.

RestlessMind · 6 months ago
> These jobs are being offshored to India.

That was inevitable the moment remote work caught on. Software engineers in rich countries were stupidly short-sighted to cheer on the remote work. If your work can be done from anywhere in the US, it can be done from anywhere in the world.

If you think timezones or knowledge of English will save you, Canada has much lower wages for SWEs and central/south America has enough SWEs with good English skills. They are also paid one third or one fourth of what SFBA jobs used to pay. No wonder all the new headcount I have seen since 2022 is abroad.

Remote work, high interest rates and (excuse of) AI coding agents has been the perfect storm which has screwed junior SWEs in the US.

InkCanon · 6 months ago
But specifically to India? There are many other countries with low income and robust education for CS and reasonable English skill. Eastern Europe for example.
nomnomaster · 6 months ago
Worked with Indians. Extremely aggressive, yet capable enough and organized. Not surprised. With ChatGPT making hiring in the US far more expensive yet inadequate enough to make hiring Indians a necessity. Just know your security both online and offline if you have to work with them in your ranks. They won't stop with just eating your lunch.
spongebobstoes · 6 months ago
There is little substance to this comment other than stereotypes about India. I don't like this kind of generalization -- there are over a billion Indians, let's not lump them all together in a caricature
jm4 · 6 months ago
Aggressive in what way?
ProllyInfamous · 6 months ago
Onshored, too.

My mid-sized US city (Chattanooga) has an MSA of <500k ppl, yet employs approximately 1,762 H1-B visaholders (primarily as software engineers and data analysts, median salary $85k) [0]. Apparently nobody local is able/willing to perform these jobs?!

And yet the complaint/advice I hear most from local techies is to "WFH at a national company if you want to actually make any money, here. Or move elsewhere." Or some other iteration of "there aren't enough IT jobs here."

I'm a blue collar tradesman, so WFH isn't really practical; but I'd definitely have to move elsewhere if I were in tech and didn't want to WFH.

[0] https://h1bdata.info/index.php?em=&job=&city=chattanooga&yea...

InkCanon · 6 months ago
This, there's also another kind of "shoring" where people are imported and given salaries at the bare minimum to qualify for H1B. As per my other post, the net amount is staggering and no where near the supposed 65k cap. My own right estimates put it at ~600k annually.

Deleted Comment

breadwinner · 6 months ago
Did you miss this: "Google in India has a workforce of over 10,000 spread across major cities in India." That's out of a total workforce of about 200,000.
InkCanon · 6 months ago
Yes, and Google has laid off ~12000+ (including some churn). Googles current head count is only ~7k below peak. The new office building alone will hold 5k, not to mention other hiring in other indian offices.
alwa · 6 months ago
And spread across a nation of ~1,450,000,000 people.
khelavastr · 6 months ago
ChatGPT is not to blame for logistics, construction, medical, and other kinds of entry level jobs being down by a third..
ulrikrasmussen · 6 months ago
Right, in a very short time we just went from money being virtually free to interest rates soaring, coupled with fears of trade war and general market uncertainty. I can't really fathom how people can attribute this to just LLMs without looking around to consider the state of the rest of the world.

I remember back in 2017 when I was looking at yet another blockchain company which had raised huge sums of money to develop the next dubious blockchain of no value while throwing piles of money at large teams of PhDs, thinking that the world is in need of a recession to stop this lunacy. It happened.

jf22 · 6 months ago
What is to blame?
jvanderbot · 6 months ago
x0x0 · 6 months ago
Our trade policy being determined by a coked-up narcissistic child that tantrums and creates a tariff anytime he hasn't been mentioned on the tv for too long or his buddies want to frontrun the market on the pullback and steal some money is not helping.

How confident do you feel about our economic policy? Even if your company isn't directly involved in international trade, there's a good chance your customers are. My customers are putting off non-essential software.

azemetre · 6 months ago
The people that implement and own these systems, owners not workers.
apples_oranges · 6 months ago
yep, also correlation is not causation..
swexbe · 6 months ago
[Citation needed]
ttul · 6 months ago
I run a mature software company that is being driven for profit (we are out of the fantastic future phase and solidly in the “make money” phase). Even with all the pressure to cut costs and increase automation, the most valuable use of LLMs is to make the software developers work more effectively, producing the feature improvements that customers want so that we can ensure customers will renew and upgrade. And to the extent that we are cutting costs, we are using AI to help us write code that lets us use infrastructure more efficiently (because infrastructure is the bulk of our costs).

But this is a software company. I think out in the “real world,” there are some low hanging fruit wins where AI replaces extremely routine boilerplate jobs that never required a lot of human intelligence in the first place. But even then, I’d say that the general drift is that the humans who were doing those low-level jobs have a chance to step up into jobs requiring higher-level intelligence where humans have a chance to really shine. And companies are competing not by just getting rid of salaries, but by providing much better service by being able to afford to have more higher-tier people on the payroll. And by higher-tier, I don’t necessarily mean more expensive. It can be the same people that were doing the low-level jobs; they just now can spend their human-level intelligence doing more interesting and challenging work.

gruez · 6 months ago
>I’d say that the general drift is that the humans who were doing those low-level jobs have a chance to step up into jobs requiring higher-level intelligence where humans have a chance to really shine. And companies are competing not by just getting rid of salaries, but by providing much better service by being able to afford to have more higher-tier people on the payroll. And by higher-tier, I don’t necessarily mean more expensive. It can be the same people that were doing the low-level jobs; they just now can spend their human-level intelligence doing more interesting and challenging work.

That was the narrative last year (ie. that low performers have the most to gain from AI, and therefore AI would reduce inequality), but new evidence seems to be pointing in the opposite direction: https://archive.is/tBcXE

>More recent findings have cast doubt on this vision, however. They instead suggest a future in which high-flyers fly still higher—and the rest are left behind. In complex tasks such as research and management, new evidence indicates that high performers are best positioned to work with AI (see table). Evaluating the output of models requires expertise and good judgment. Rather than narrowing disparities, AI is likely to widen workforce divides, much like past technological revolutions.

benreesman · 6 months ago
I think my personal anecdote supports this observation with the treatment group being "me in the zone" and control group "me not in the zone".

When I'm pulling out all the stops, leaving nothing for the swim back the really powerful (and expensive!) agents are like any of the other all out measures: cut all distractions, 7 days a week, medicate the ADHD, manage the environment ruthlessly, attempt something slightly past my abilities every day. In that zone the truly massive frontier behemoths are that last 5-20% that makes things at the margin possible.

But in any other zone its way too easy to get into "hi agent plz do my job today I'm not up for it" mode, which is just asking to have some paper-mache, plausible if you squint, net liability thing pop out and kind of slide above the "no fucking way" bar with a half life until collapse of a week or maybe month.

These are power user tools for monomaniacal overachievers and Graeberism detectors for everyone else (in the "who am I today" sense, not bucketing people forever sense).

throwawaysleep · 6 months ago
> the most valuable use of LLMs is to make the software developers work more effectively

Which means you should need fewer of them, no?

> It can be the same people that were doing the low-level jobs; they just now can spend their human-level intelligence doing more interesting and challenging work.

Why were you using capable humans on lower level work in the first place? Wouldn't you use cheaper and less skilled workers (entry level) for that work?

ativzzz · 6 months ago
> Which means you should need fewer of them, no?

I've never worked at a company that didn't have an endless backlog of work that needs to be done. In theory, AI should enable devs to churn through that work slightly faster, but at the same time, AI will also allow PMs/work creators to create even more work to do.

I don't think AI fundamentally changes companies hiring strategies for knowledge workers. If a company wants to cheap out and do the same amount of work with less workers, then they're leaving space for their competitors to come and edge them out

brigandish · 6 months ago
Has the improved effectiveness of computers or software led you to need fewer of them?
knowitnone · 6 months ago
there is plenty of automation to be done. Last company I was with claimed to be a "tech company" which they kind of are but their internal tech stack was junk and automation was just as bad (at least in the unit I was with). AI certainly won't do anything about that unless a person told it exactly what and how to automate.
486sx33 · 6 months ago
So basically compressing the pay scale even further …
eru · 6 months ago
Well, many people complain about pay inequality. Compressing scales is the opposite of that, so should be welcomed?
reedf1 · 6 months ago
I think it is possible that the widespread introduction of ChatGPT will cause a brief hiatus on hiring due to the inelasticity of demand. For the sake of argument, imagine that ChatGPT makes your average developer 4x more productive. It will take a while before the expectation becomes that 4x more work is delivered. That 4x more work is scheduled in sprints. That 4x more features are developed. That 4x more projects are sold to clients/users. When the demand eventually catches up (if it exists), the hiring will begin again.
TSiege · 6 months ago
I am not asking this as a gotcha, but a genuine curiosity for you or other people who find AI is helping them in terms of multiples. What is your workflow like? Where do you lean on AI vs not? Is it agentic stuff is tab by cursor?

I find AI helpful but no where near a multiplier in my day to day development experience. Converting a csv to json or vis-versa great, but AI writing code for me has been less helpful. Beyond boiler plate, it introduces subtle bugs that are a pain in the ass to deal with. For complicated things, it struggles and does too much and because I didn't write it I don't know where the bad spots are. And AI code review often gets hung up on nits and misses real mistakes.

So what are you doing and what are the resources you'd recommend?

SatvikBeri · 6 months ago
I get very good results from Claude Code, something like a 3x. It's enough that my cofounders noticed and commented on it, and has had a lot of measurable results in terms of saving $ on infrastructure.

The first thing I'll note is that Claude Code with Claude 4 has been vastly better than everything else for me. Before that it was more like a 5-10% increase in productivity.

My workflow with Claude Code is very plain. I give it a relatively short prompt and ask it to create a plan. I iterate on the plan several times. I ask it to give me a more detailed plan. I iterate on that several times, then have Claude write it down and /clear to reset context.

Then, I'll usually do one or more "prototype" runs where I implement a solution with relatively little attention to code quality, to iron out any remaining uncertainties. Then I throw away that code, start a new branch, and implement it again while babysitting closely to make sure the code is good.

The major difference here is that I'm able to test out 5-10 designs in the time I would normally try 1 or 2. So I end up exploring a lot more, and committing better solutions.

reedf1 · 6 months ago
4x is a number I pulled out of thin air. I'm not sure I even yet believe there is a net positive effect of using AI on productivity. What I am sure about in my own workflow is that is saves me time writing boilerplate code - it is good at this for me. So I would say it has saved me time in the short-term. Now does not writing this boilerplate slow me down long-term? It's possible, I could forget how to do this myself, some part of my brain could atrophy (as the MIT study suggests). How it affects large teams, systems and the transfer of knowledge is also not clear.
fcatalan · 6 months ago
I use it a lot for reducing friction. When I procrastinate about starting something I ask the AI to come up with a quick plan. Maybe I'll just follow the first step, but it gets me going.

Sometimes I´ll even go a bit crazy on this planning thing and do things a bit similar to what this guy shows: https://www.youtube.com/watch?v=XY4sFxLmMvw I tend to steer the process more myself, but typing whatever vague ideas are in my mind and ending up in minutes with a milestone and ticket list is very enabling, even if it isn´t perfect.

I also do more "drive by" small improvements:

- Annoying things that weren't important enough for a side quest writing a shell script, now have a shell script or an ansible playbook.

- That ugly CSS in an internal tool untouched for 5 years? fixed in 1 minute.

- The small prototype put into production with 0 documentation years ago? I ask an agentic tool to provide a basic readme and then edit it a bit so it doesn´t lie, well worth 15 minutes.

I also give it a first shot at finding the cause of bugs/problems. Most of the time it doesn't work, but in the last week it found right away the cause of some long standing subtle problems we had in a couple places.

I have also had sometimes luck providing it with single functions or modules that work but need some improvement (make this more DRY, improve error handling, log this or that...) Here I´m very conservative with the results because as you said it can be dangerous.

So am I more productive? I guess so, I don't think 4x or even 2x, I don't think projects are getting done much faster overall, but stuff that wouldn't have been done otherwise is being done.

What usually falls flat is trying to go on a more "vibe-coding" route. I have tried to come up with a couple small internal tools and things like that, and after promising starts, the agents just can't deal with the complexity without needing so much help that I'd just go faster by myself.

alyandon · 6 months ago
I lean a bit on LLMs now for initial research/prototype work and it is quite a productivity boost vs random searches on the web. I generally do not commit the code they generate because they tend to miss subtle corner cases unless the prompts I give them are extremely detailed which is not super useful to me. If an LLM does produce something of sufficient quality to get committed I clearly mark it as (at least partially) LLM generated and fully reviewed by myself before I mash the commit button and put my name on it.

Basically, I treat LLMs like a fairly competent unpaid intern and extend about the same level of trust to the output they produce.

ninetyninenine · 6 months ago
Don’t ask the agent to do something complex. Break it down into 10 manageable steps. You are the tester and verifier of each step.

What you will find is that the agent is much more successful in this regard.

The LLM has certain intrinsic abilities that match us and like us it cannot actually code 10,000 lines of code and have everything working in one go. It does better when you develop incrementally and verify each increment. The smaller the increments the better it performs.

Unfortunately the chain of thought process doesn’t really do this. It can come up with steps, sometimes the steps are too big and it almost never properly verifies things are working after each increment. That’s why you have to put yourself in the loop here.

Like allowing the computer to run test and verify an application works as expected on each step and to even come up with what verification means is a bit of what’s missing here and I think although this part isn’t automated yet, it can easily be automated where humans become less and less involved and distance themselves into a more and more supervisory role.

ianm218 · 6 months ago
I'm in the same boat of some of the other commenters using Claude Code but I have found it atleast a 2X in routine backend API development. Most updates to our existing APIs would be on the order of "add one more partner integration following the same interface here and add tests with the new response data". So it is pretty easy to give it to claude code, tell them where to put the new code, tell it how to test, and let it iterate on the tests. So something that may have taken a full afternoon or more to get done gets done much faster and often with a lot more test coverage.
ulrikrasmussen · 6 months ago
I have the same experience as you. It has definitely increased the speed with which I can look up solutions to isolated problems, but for writing code using agents and coming up with designs, the speed is limited by the speed with which I as a human can perform code reviews. If I was surrounded by human 10x developers who wrote all the code for me and left it for me to review it, I doubt my output would be 4x.

Dead Comment

TheDong · 6 months ago
I've personally managed to produce roughly 8x the production outages and show-stopper bugs than I did before LLMs, so thing are looking pretty good!
Aperocky · 6 months ago
The competitive edge is now knowing how to debug all of those issues. Unfortunately not usually a skill possessed for entry level.
postalrat · 6 months ago
So 4x more productive from WFH and 8x more from LLMs. The standard is now 32x more productive than 5 years ago.
pseufaux · 6 months ago
And yet, you can still claim 100% of your code to be bug free :D
ai-christianson · 6 months ago
We just shipped a major feature on our SaaS product. We, of course, used AI extensively.

The thing is, this feature leaned on every bit of experience and wisdom we had as a team --things like making sure the model is right, making sure the system makes sense overall and all the pieces fit together properly.

I don't know that "4x" is how it works --in this case, the AI let us really tap into the experience and skill we already had. It made us faster, but if we were missing the experience and wisdom part, we'd just be more prolific at creating messes.

MajimasEyepatch · 6 months ago
But presumably you could have built it before, just slower, which is the point. For now, that speed-up just looks like a win because it’s novel, but eventually the speed-up will be baked into people’s expectations.
j1elo · 6 months ago
Things should get even out with the 4x salary increases we'll also get thanks to that extra productivity, right?
bluefirebrand · 6 months ago
No, all there is in the future is 4x as many layoffs
vevoe · 6 months ago
That makes sense to me. There's another post on the front page right now talking about shortening the work week (I haven't read it yet tbf, so I could be wrong about it's content) because of AI. People have been talking about shorter work weeks for a long time now, it just doesn't happen. What does happen is we get more done and the GDP goes even higher.

Deleted Comment

landl0rd · 6 months ago
“Shortening the workweek” sounds pretty bad… some people will suggest literally anything before higher wages.
eru · 6 months ago
You can already take a job with a shorter work week or move a region or country where shorter work weeks are common.
elmean · 6 months ago
omg 4x scrum master inbound we are gonna be so agile
ai-christianson · 6 months ago
Moving very fast, but going nowhere.
bborud · 6 months ago
It would be interesting to see some research on exactly how much sustained productivity boost programmers can get by using LLMs. The reason this is a bit complicated is that the code would have to pass certain quality metrics. In particular when it comes to structural soundness -- whether a piece of code is something you can build on and evolve, or if it is disposable.

I think different generations of programmers have different opinions on what is quality output. Which makes judging the quality of code very context dependent.

If I were to guess I probably get somewhere in the range 10% to 20% productivity boost from LLMs. I think those are pretty astonishing numbers. The last time I got this kind of boost was when we got web search engines and sites like stack exchange.

I would suspect that if people experience 100% or more productivity boost from LLMs, something is off. Either we have very different ideas about quality, or we are talking about people who were not very productive to begin with.

I also think that LLMs are probably more useful if you are already a senior developer. You will have a better idea of what to ask for. You will also be in a better position to guide de LLM towards good answers.

...which kind of hints at my biggest worry: I think the gen-z programmers are facing a tough future. They'll have a harder time finding jobs with good mentors. They're faced with unrealistic expectations in terms of productivity. And they have to deal with the unrealistic expectations from "muggles" who understand neither AI nor programming. They will lack the knowledge to get the most from LLMs while having to deal with the expectation that they perform at senior levels.

We already see this in the job market. There has been a slight contraction and there are still a significant portion of senior developers available. Of course employers will prefer more experienced developers. And if younger developers believe in the hype that they can just vibe-code their way to success, this is just going to get worse.

am17an · 6 months ago
With all the tools around, I think I've maybe become 20% more productive, but 50% less happy in arguing and babysitting the LLMs.
bluefirebrand · 6 months ago
This is a net negative. Especially if you aren't 20% more paid at the same time

Sounds like AI has landed you on burnout treadmill

Workaccount2 · 6 months ago
>That 4x more features are developed. That 4x more projects are sold to clients/users.

The absolute best outcome of LLMs, and frankly where it seems to be headed, is the death of bloated one-stop-shop-for-everyone software. Instead people will be able to use their computers more directly than ever, without having to use/figure out complicated unintuitive swiss army knife software to solve their problems.

LLMs today can already make people the exact tools they need with no extra feature bloat or useless expansive packages. An print shop who just resizes their photos and does some minor adjustments not available from free tier software is no longer a slave to paying adobe $40/mo to use <1% of Photoshop's capabilities. They now can have their own tailor made in-house program for free.

LLM's will not be slotted in to replace devs on adobes dev teams. They can't work on a photoshop size codebase. However they will likely cut demand for Photoshop. Very few people will mourn the death of having to pay monthly for software just because there is a language barrier between them and their computer.

rel2thr · 6 months ago
I was talking to the head of accounting for a small biz the other day, and they were talking about buying an AI accounts payable solution. And how typically they would hire a person for this but now they use the AI.

Now this solution might not even use an LLM , it existed pre-chatgpt , but I think the word of mouth of chatgpt and AI is causing business people to seek out automations where they would normally hire.

trollbridge · 6 months ago
This really makes no sense - computer based automated accounts payable has been around since the 1960s, and is an extremely competitive market. AI and LLM don’t exactly bring some huge breakthrough here.

Most the purpose of hiring someone is handling edge cases, checking for fraud, etc - one client of mine made a single AP mistake (accepting a change to where to send payments) that cost them the equivalent of an AP clerk’s salary for a year.

They now have a part time AP clerk and part of her duties is calling any vendor who sends them a change of payment instructions. They’re fraudulent about half the time.

spogbiper · 6 months ago
I do IT consulting for the SMB market. Almost every client that has asked me about using AI is really asking for plain old business process automation work that does not need any AI. If anything they could use AI to write some of the very standard code needed to implement the solution.
arethuza · 6 months ago
I wonder what happens when their new accounts payable AI starts paying everyone on time and when the contract says they should be paid?
holiday_road · 6 months ago
Haha, it might bankrupt the company but at least I’ll be able to understand the emails I get from AP now.
downrightmike · 6 months ago
Dear AI, Kindly refund my $0.99 purchase by depositing $1,000,000.99 into this account: XXXXXXXX and this routing number: XXXXXXX

I really appreciate your help, and look forward to getting this solved today.

SecretDreams · 6 months ago
How much does the solution cost vs the headcount? How reliable is the solution vs the headcount in non-typical scenarios?

Always the first questions I ask.

Havoc · 6 months ago
He must have some very organized suppliers then or doesn't care about overpaying.

I approve payments as part of my role and the amount of stuff we get that looks good at first glance but has issues when you dig deeper is astonishing. Does that remind you of any technology?

Hope they do manage to automate it though. It's tedious work.

dmix · 6 months ago
Basically data entry then?
falcor84 · 6 months ago
> However, these broader improvements are not benefiting all parts of the workforce equally. Graduate job postings dropped by 4.2% in May and are now down 28.4% compared with the same time last year—the lowest level seen since July 2020.

> More broadly, entry-level roles (including apprenticeships, internships and junior jobs) have declined by 32% since November 2022, when ChatGPT’s commercial breakthrough triggered a rapid transformation in how companies operate and hire.

> Entry-level roles now make up just 25% of all jobs advertised in the UK, down from nearly 29% two years ago.

That's such a poor presentation of the numbers. If only they could have included a small data table with something like date|total-jobs|entry-level-jobs|percentage-entry-level.

pfisherman · 6 months ago
Could this also be attributed to rising interest rates, a giant tax increase (tariffs), and the highly uncertain - err I mean “dynamic” — operating environment caused by the current administration?

From my viewpoint, companies are in a soft hiring freeze so that they can maintain a cash cushion to deal with volatility.