AI isn't about jobs and efficiency; it's about gaining a stronger position over labor. AI is useful, and in most cases it makes existing labor better. The number of jobs it can actually automate is much lower than people perceive. But the narrative is what matters. The temporary displacement and uncertainty are what matter: they put labor in a weaker position.
This is an excellent point. It should also be noted that the people who want a better position over labor largely write mediocre memos, make bad decisions, barely listen in meetings, and slap together powerpoints for said meetings. As it turns out, AI can also automate the work of thousands of shitty execs and upper management. If only we could apply that pressure in their direction...
I mean, there might be a market for an AI model that plugs into Redmine (or any project-management tool) and directs developers to work on this or that task, automatically splits tasks into smaller ones, asks stakeholders to better explain vague requests, and so on.
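To make that concrete, here's a minimal sketch of what such a triage bot could look like in Python. The GET /issues.json and PUT /issues/{id}.json endpoints are Redmine's actual REST API; the ask_llm helper, the URL, the API key, and the prompt are all placeholders you'd supply yourself:

    # Hypothetical sketch: LLM triage of open Redmine tickets.
    # Assumes Redmine's REST API is enabled; ask_llm() is a stand-in
    # for whatever chat-completion client you actually use.
    import requests

    REDMINE = "https://redmine.example.com"          # illustrative URL
    HEADERS = {"X-Redmine-API-Key": "YOUR_API_KEY"}  # Redmine's auth header

    def ask_llm(prompt: str) -> str:
        raise NotImplementedError("plug in your LLM provider here")

    def triage_open_issues() -> None:
        resp = requests.get(f"{REDMINE}/issues.json",
                            params={"status_id": "open"},
                            headers=HEADERS)
        for issue in resp.json()["issues"]:
            verdict = ask_llm(
                "Answer VAGUE or ACTIONABLE for this ticket:\n"
                f"{issue['subject']}\n{issue.get('description', '')}")
            if verdict.strip().upper().startswith("VAGUE"):
                # Adding a "notes" field posts a journal comment asking
                # the stakeholder to clarify the request.
                requests.put(f"{REDMINE}/issues/{issue['id']}.json",
                             headers=HEADERS,
                             json={"issue": {"notes":
                                   "Please clarify the acceptance "
                                   "criteria for this ticket."}})

    triage_open_issues()

Splitting oversized tickets would follow the same pattern: ask the model for subtasks, then create them via POST /issues.json, which (if I remember the API right) accepts a parent_issue_id field.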
Total wages in the US are ~$11T per year. Depress the labor force a little, push unemployment up a little, and halt wage growth, and the bosses are looking at hundreds of billions a year, compounding into trillions, in the US alone.
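Back-of-the-envelope, with the suppression rate and the horizon as pure assumptions:

    # Suppress wage growth by an assumed 3 pp/yr on an ~$11T base.
    base = 11e12         # total US annual wages, ~$11T
    suppression = 0.03   # assumed annual wage growth forgone
    cumulative = 0.0
    for year in range(1, 6):
        gap = base * ((1 + suppression) ** year - 1)  # wages not paid
        cumulative += gap
        print(f"year {year}: ${gap/1e9:,.0f}B forgone, "
              f"${cumulative/1e12:.1f}T cumulative")
    # ~$330B in year one, roughly $5T cumulative after five years.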
It doesn't need to entirely automate a job to replace jobs, though. If it makes one worker 10 times as productive, the company can hire 9 fewer people (depending on how the productivity of the position scales).
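That parenthetical is doing a lot of work. If the tool only accelerates part of the job, Amdahl's law caps the headcount math; a quick sketch, with the fractions made up:

    # Amdahl-style check on "10x productivity => 9 fewer hires".
    def effective_speedup(f: float, factor: float) -> float:
        # f = assumed fraction of the job the tool accelerates
        return 1 / ((1 - f) + f / factor)

    print(effective_speedup(1.0, 10))  # 10.0  -> the full "9 fewer people"
    print(effective_speedup(0.4, 10))  # ~1.56 -> a 10-person team shrinks
                                       #          to ~6-7, not to 1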
I haven’t seen this. There are more smaller companies than larger ones. At my company, as a manager, I encourage the use of AI because it appears to make developers about 10% more efficient, helps kickstart new projects, and improves job satisfaction by automating away some of the boring parts of development.
Perhaps at call centers and such you are correct, but your comment is as disingenuous as saying the compiler was about getting a stronger position over labor, or the expansion of standard libraries, or faster microprocessors, or modern IDEs before AI. The march towards automation and efficiency in engineering never stops.
Every so often there is a massive leap which results in significant job losses, but that doesn’t mean it’s about labor. Was the release of AWS about labor? It destroyed many Silicon Valley companies as you could now do with $5k what previously took $200k.
There is some intense FOMO right now. I work for a large SaaS company, and our guidelines went from no AI to "use AI for everything, everywhere". This doesn't come from a position of understanding (the people in charge are the same); it comes from a deep fear that we could fall behind. It's not rooted in tangible metrics.
How many of these jobs will stay gone? I feel like I've seen this over and over: a new advancement causes job losses, then management realizes it wasn't all it was cracked up to be and needs to hire again.
From what I can see in software space, any software-primary company that thinks AI means they can fire 3/4ths of their staff in the near future will get stomped in the market by their competitor who decided to take the 4x productivity improvement instead.
It may make working for a non-software-primary company as a programmer more risky. But then, "be a value provider, where execs can draw a fairly straight line from your contribution to revenue, and not a value consumer" is not new career advice.
I tried ChatGPT 5 and Claude on some medium, non-trivial coding tasks in Rust, and most of the time the code doesn't even compile. It may work better for other programming languages. Still, this makes me believe that unless you only want basic small functions (like converting an integer to a string) done by AI, betting on it to replace SDEs is still risky as of today. On the other hand, AI tech is progressing quickly.
It's not replacing jobs. It's deferring their creation until the executives finally realize they fucked up and you can't replace a human with a glorified search engine that spews complete bullshit 30 percent of the time.
I had a rather unsettling thought occur to me the other day.
There are many people in this world, to whom truth and reality are fungible, malleable, unimportant things in the face of their agendas. These same people are overrepresented at the highest rungs of the corporate ladder, in the highest echelons of the halls of power in our world.
For them, right 70% of the time is more than good enough, because that which is real has always taken a back seat to that which is expedient.
So what if the Plagiarism Engine spews Confidently Stated Bullshit 30% of the time? They themselves do that at least as often, and at far greater cost than what LLMs charge per call. Besides, they're so used to "perception = reality" that they figure, for the 30% that is Confidently Stated Bullshit, they can just paper over it, bully, disconcert, and repeat until reality matches their expedient bullshit.
The sad thing is, so much of our economy and society is based on bullshit and scams now, they may not be wrong. In fact, they're probably right, for at least a significant percentage of the populace who believes chemtrails are making the frogs gay etc.
I suspect some very hard lessons will need to be learned in disciplines where rigor is required 100% of the time, else lives are lost. And lives will be lost, because the business idiots will keep shoehorning this garbage in everywhere they possibly can, to sustain the hype cycle they're all cashing in on. Only once many lives have been lost, and the link is conclusively drawn to the garbage spewed by these statistical word-cloud predictive-autocorrect machines, might we see a modicum of forceful pushback. But I ain't holding my breath.
Is this worth >$1 trillion in capex?
For reference, less than $100 billion has been invested in fusion energy since 1950.
What does fusion energy have to do with labor cost?
Obviously AI is technology that has its own value.
But keeping labor costs down by just a few percent (and handing that money to company profits) has tremendous leverage.
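Rough numbers against the ~$11T wage base mentioned upthread (the 2% capture rate is an assumption, not a forecast):

    # Does >$1T of capex pencil out against a few percent of wages?
    capex = 1.0e12       # AI build-out, >$1T
    wages = 11e12        # US annual wage base, ~$11T (from upthread)
    captured = 0.02      # assumed share of wages shifted to profits
    annual = wages * captured
    print(f"annual savings: ${annual/1e9:,.0f}B")       # $220B
    print(f"simple payback: {capex/annual:.1f} years")  # ~4.5 years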
Let's take the blame a bit further back.
Instead of the person writing the article, what about the companies laying people off? It looks much better to say "cost savings from moving to AI" than "economic uncertainty in future orders".
This is marketing.
I can't tell how credible the claim is that "increased adoption of generative AI technologies by private employers led to more than 10,000 lost jobs".
How is that "replacing" jobs?
See the jobless recovery after 2008 and how long it took the economy to get back on track.