Paper:
https://nanda.media.mit.edu/ai_report_2025.pdf
https://web.archive.org/web/20250818145714/https://nanda.media.mit.edu/ai_report_2025.pdf
Oof
To be fair, the PowerPoint they were shown at that AI Synergies retreat probably was very slick.
It's almost like, and stay with me here, but it's almost like the vast majority of tech companies are now run by business graduates who do not understand tech AT ALL, have never written a single line of code in their lives, and only know how to optimize businesses by cutting costs and making the products worse until users revolt.
It is because they think it will 10x their chances of getting a really good engineer at 1/10th the cost.
At least that is my theory. Maybe I am wrong; I try to be charitable.
The initial years of adopting new tech have no net return because it's investment. The money saved is offset by the cost of setting up the new tech.
But then once the processes all get integrated and the cost of buying and building all the tech gets paid off, it turns into profit.
Also, some companies adopt new tech better than others. Some do it badly and go out of business. Some do it well and become a new market leader. Some show a net return much earlier than others because they're smarter about it.
No "oof" at all. This is how investing in new transformative business processes works.
Many new ideas came through promising to be "transformative" but never reached anywhere near the impact that people initially expected. Some examples: SOA, low-code/no-code, blockchain for anything other than cryptocurrency, IoT, NoSQL, the Semantic Web. Each of these has had some impact, but they've all plateaued, and there are very good reasons (including the results cited in TA) to think GenAI has also plateaued.
My bet: although GenAI has plateaued, new variants will appear that integrate or are inspired by "old AI" ideas[0] paired with modern genAI tech, and these will bring us significantly more intelligent AI systems.
[0] a few examples of "old AI": expert systems, genetic algorithms, constraint solving, theorem proving, S-expression manipulation.
> GenAI has been embedded in support, content creation, and analytics use cases, but few industries show the deep structural shifts associated with past general-purpose technologies such as new market leaders, disrupted business models, or measurable changes in customer behavior.
They are not seeing the structural "disruptions" that were present for previous technological shifts.
What are you talking about? The return on investment from computers was immediate and extremely identifiable. For crying out loud, "computers" are literally named after the people whose work they automated.
With Personal Computers the pitch is similarly immediate. It's trivial to point at what labour VisiCalc automated & improved. The gains are easy to measure and for every individual feature you can explain what it's useful for.
You can see where this falls apart in the Dotcom Bubble. There are very clear pitches; "Catalogue store but over the internet instead of a phone" has immediately identifiable improvements (Not needing to ship out catalogues, being able to update it quickly, not needing humans to answer the phones)
But the hype and failed infrastructure buildout? Sure, Cisco could give you an answer if you asked them what all the internet buildout was good for. Not a concrete one with specific revenue streams attached, and we all know how that ends.
The difference between Pets.com and Amazon is almost laughably poignant here. Both were ultimately attempts to make "catalogue store but on the computer" work, but Amazon focussed on broad inventory and UX. They had losses, but managed to contain them; Amazon's losses shrank as revenue grew, and it became profitable quickly (Q4 2001).
Pets.com's selling point was selling you stuff below cost. Good for growth, certainly, but this also means that their losses grew with their growth. The pitch is clearly and inherently flawed. "How are you going to turn profitable?" "We'll shift into selling less expensive goods." "How are you going to do that?" "Uhhh....."
...
The observant will note: This is the exact same operating model of the large AI companies. ChatGPT is sold below unit cost. Claude is sold below unit cost. Copilot is sold below unit cost.
What's the business pitch here? Even OpenAI struggles to explain what ChatGPT is actually useful for. Code assistants are the big concrete pitch, and even those crack at the edges as study after study shows the benefits appear to be psychosomatic. Even if Moore's law hangs on long enough to bring inference cost down (never mind per-task token usage skyrocketing, so even that appears moot), what's the pitch? Who's going to pay for this?
Who's going to pay for a Personal Computer? Your accountant.
It's pay big tech or fall behind.
It also improves brand reputation by actually paying attention to what customers are saying and responding in a timely manner, with expert-level knowledge, unlike typical customer service reps.
I've used LLMs to help me fix Windows issues using pretty advanced methods, where MS employees would have just told me to either re-install Windows or send them the laptop and pay hundreds of dollars.
I immediately hop on customer service chat to ask for a refund. I was surprised to be talking to an LLM rather than a human, but I go ahead and explain what happened and state that I want the transaction for the subscription canceled. It offers to cancel the subscription at the end of the 30-day period. I decline, noting I want a refund for the subscription I didn't intend to take. It repeats that it can cancel the subscription at the end of the 30-day period. I ask for a human. It repeats. I ask for a human again. It repeats. I disconnect.
Amazon knows what it's doing.
If Amazon wanted to give you the ability to get a refund for unused Prime benefits, it would allow the AI to do it, or even give you a button to do it yourself.
All my interactions with any AI support so far consist of repeatedly saying "call human" until it calls a human.
Customer support is when all the documentation already failed and you need a human.
99% seems like a pulled-out-of-your-butt number and hyperbolic, but, yes, there's clearly a non-trivial percentage of customer support that's absolutely terrible.
Please keep in mind, though, that a lot of customer support by monopolies is intended to be terrible.
AI seems like a dream for some of these companies to offer even worse customer service, though.
Where customer support is actually important or it's a competitive market, you tend to have relatively decent customer support - for example, my bank's support is far from perfect, but it's leaps and bounds better than AT&T or Comcast.
I don't agree. AI support is as useless as real customer support. But it is more polite and calm, with a clearer voice, etc. Much better, isn't it?
Cancel account - have them call someone.
Withdraw too much - make it a phone call.
Change their last name? - that would overwhelm our software, let’s have our operator do that after they call in.
Etc.
That doesn't make much sense. Either your system can handle it or it can't. Putting a support agent in front isn't going to change that.
That is because search is still mostly stuck in ~2003. But now ask the exact same thing of an LLM and it will generally be able to provide useful links. There's so much information out there, but search engines suck because they lack any sort of meaningful natural language parsing. LLMs provide that.
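Roughly the difference, as a toy sketch (hypothetical names, assuming you have some sentence-embedding model lying around; not how any particular engine actually works):

    # Keyword search: only matches when the literal terms overlap.
    def keyword_match(query, doc):
        return any(term in doc.lower() for term in query.lower().split())

    # "Semantic" search: compare dense vectors, so paraphrases still rank highly.
    # embed() is a stand-in for whatever sentence-embedding model you have on hand.
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
        return dot / norm if norm else 0.0

    def semantic_score(query, doc, embed):
        return cosine(embed(query), embed(doc))

    # keyword_match("laptop won't power on", "notebook fails to boot")  -> False
    # semantic_score(...) with a decent embedding model would still rank that doc highly.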
But I can’t imagine ever calling tech support for help unless it is more than troubleshooting and I need them to actually do something in their system or it’s a hardware problem where I need a replacement.
I.e. AI isn't allowed to offer me a refund because my order never arrived. For that, I have to spend 20 minutes on the phone with Mike from India.
Perhaps there is a group that isn't served by legacy UI discovery methods and it's great for them, but 100% of the chatbots I've interacted with have damaged brand reputation for me.
The trouble is when they gatekeep you from saying "I know what I'm doing, let me talk to someone"
I have yet to experience this. Unfortunately I fear it's the best I can hope for, and I worry for those in support positions.
AI is not better than a good customer service team, or even an above-average one. It is better than a broken customer service team, however. As others have noted, 99% is hyperbolic BS.
Then why hasn't it yet? In fact, some lower-wage countries, such as China, are at the forefront of industrial automation.
I think the bottom line is that many Western countries went out of their way to make manufacturing - automated or not - very expensive and time-consuming to get off the ground. Robots don't necessarily change that if you still need to buy land, get all the permits, if construction costs many times more, and if your ongoing costs (energy, materials, lawyers, etc) are high.
We might discover that AI capacity is easier to grow in these markets too.
Because the current companies are behind the curve. Most of finance still runs on Excel. A lot of other things, too. AI doesn't add much to that. But the new wave of tech-first companies now has the upper hand, since massive headcount is no longer such an advantage.
This is why Big Tech is doing layoffs. They are scared. But the traditional companies would need to redo the whole business, and that is unlikely to happen. Not with the MBAs and Boomers running the board. So they are doing the old stupid things they know, like cutting costs by offshoring everything they can and abusing visas. They end up losing knowledgeable people who could've turned the ship around, the remaining employees become apathetic/lazy, and brand loyalty sinks to the bottom. See how the S&P 500 minus its top 10 is flat or dumping.
If only because someone else has to build all the nuclear reactors that supply the data centers with electricity. /s
Jobs like customer/tech support aren't uniquely suited to outsourcing. (Quite the opposite: people rightfully complain about outsourced support being awful. Training outsourced workers on the fine details of your products/services and your own organisation, never mind empowering them to actually do things, is much harder.)
They're jobs that companies can neglect. Terrible customer support will hurt your business, but it's not business-critical in the way that outsourced development breaking your ability to put out new features and fixes is.
AI is a perfect substitute for terrible outsourced support. LLMs aren't capable of handling genuinely complex problems that need precision, nor can they be empowered to make configuration changes. (Consider: prompt injection leading to SIM hijacking and other such messes.)
But the LLM can tell meemaw to reset her dang router. If that's all you consider support to be (which is almost certainly the case if you outsource it), then you have nothing to lose from using AI.
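To make the "can't be empowered" part concrete, a minimal sketch (hypothetical tool names, assuming the model only proposes tool calls and your own code decides what actually runs):

    # Minimal sketch: the model can only trigger read-only, low-stakes actions.
    # Anything that changes account state gets routed to a human, which is the point:
    # a prompt injection can't talk the bot into a SIM swap it has no power to perform.
    SAFE_TOOLS = {
        "lookup_order_status": lambda args: f"Order {args['order_id']}: shipped",
        "send_router_reset_steps": lambda args: "Emailed the reset instructions",
    }

    def handle_tool_call(name, args):
        if name not in SAFE_TOOLS:
            return {"action": "escalate_to_human", "requested": name}
        return {"action": "done", "result": SAFE_TOOLS[name](args)}

    # handle_tool_call("swap_sim", {"msisdn": "..."})  -> escalates, never executes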
If we project long term, could this mean that countries with the most capital to invest in AI and robotics (like the U.S.) could take back manufacturing dominance from countries with low wages (like China)?
The idea that China is a low-wage country should just die. That was the case 10 years ago, not anymore.
Some parts of China have higher average salaries than some Eastern European countries.
The chance of a robotics industry in the US massively moving jobs back from China purely because of a pseudo-AI revolution replacing low-paid labour (without other external factors, e.g. tariffs or sanctions) is close to zero.
Now, if we speak about India and the low-skill IT jobs there, the story is completely different.
Tim Cook explains it better than I ever could:
https://www.youtube.com/watch?v=2wacXUrONUY
The reason US manufacturers aren't interested in taking small-volume, low-cost orders is that they have more than enough high-margin, high-quality orders to deal with. Even the small-ish machine shop out in the country, near the farm fields by some of my family's houses, has pivoted into precision work for a big corporation because it pays better than doing small jobs.
And the idea that China has low wages is outdated. Companies like Apple don't use China for its low wages; countries like Vietnam have lower wages. China's strength lies in its manufacturing expertise.
That would explain a lot, actually. If so, it'll be interesting to see what happens to the overall software economy when that revenue stream dries up. My wife grew up in a Mexican border town and told me that the nightclubs in her town were amazing; when she moved to the US, she was disappointed by how drab the nightclubs here were. Later she found out that the border town nightclubs were so extravagant because they were laundering drug money. When the authorities cracked down on the money laundering, the nightclubs reverted to their natural "drab" state of relying on actual customers to pay the bills.
It may just be incompetence in large organisations though. Things get outsourced because nobody wants to manage them.
Original title "AI is already displacing these jobs" tweaked using context from first paragraph to be less clickbaity.
At my job, thanks to AI, we managed to replace one of the boxed vendor tools we were dissatisfied with by rewriting it as an in-house solution.
I'm sure the company we were ordering from misses the revenue. The SaaS industry is full of products whose value proposition is 'it's cheaper to buy the product from us than to hire a guy who handles it in house'.
There are projects I lead now for which I would previously have needed at least one or two junior devs to do the grunt work after I had very carefully specified requirements (which I would have to do anyway) and diagrams; now ChatGPT can do that work for me.
That's never been the case before. I've personally gone from programming in assembly, to C, to higher-level languages; and on the hardware side, from personally managing the build-out of a data center that had an entire room dedicated to a SAN with a whopping 3TB of storage, to being able to do the same with a YAML/HCL file.
I remember Bill Gates once saying (sometime in the 2000s) that his biggest gripe was that, over his decades in the software industry, despite dramatic improvements in computing power and software tools, there had only been a modest increase in productivity.
I started out programming in C for DOS, and once you got used to how things were done, you were just as productive.
The stuff frameworks and other tooling help with is 50% of the job at most, which means that, by Amdahl's law, productivity can at most double.
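Back-of-the-envelope, assuming the tooling only speeds up the half of the job it actually touches:

    # Amdahl's law: overall speedup when only a fraction p of the work gets faster.
    def overall_speedup(p, s):
        # p: fraction of the job the tooling helps with (here 0.5)
        # s: how much faster that fraction becomes
        return 1.0 / ((1.0 - p) + p / s)

    print(overall_speedup(0.5, 10))    # ~1.82x
    print(overall_speedup(0.5, 1e9))   # -> 2.0x: the ceiling if half the job is untouched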
In fact, I'd argue productivity actually got reduced (comparing my output now, vs back then). I blame this on 2 factors:
- Distractions: it's so easy to d*ck around on the internet instead of doing what you need to do. I have a ton of my old SVN/CVS repos, and the amount of progress I made back then was quite respectable, even though I recall being quite lazy.
- Tooling actually got worse in many ways. I used to write programs that ran on the PC: you could debug them with breakpoints, read the logs as plain text, and deployment consisted of zipping up the exe or uploading the firmware to the uC. Nowadays you work with CI/CD, cloud, and all sorts of infra stuff, and debugging consists of adding log statements and reading logs. I'm sure I'm not really more productive.
One example I mentioned is SaaS whose value proposition is that it's cheaper than hiring a dedicated guy to do it - if AI can do it, then that software has no reason to exist anymore.
You might well see more software profit if costs go down, but less revenue. Depends on Jevons paradox, really.
It worked out pretty well. Who knows how the software engineering landscape will change in 10 to 20 years?
I enjoyed Andrej Karpathy's talk about software in the era of AI.
https://www.youtube.com/watch?v=LCEmiRjPEtQ
But it does make sense on a superficial level at least: why pay a six-pack of nobodies halfway 'round the world to... use AI tools on your behalf? Just hire a mid/senior developer locally and have them do it.