The article sounds like a cliché. The progression was always happening; nothing was sudden. Just like the continuous movement of tectonic plates beneath earthquakes: when tension between the plates reaches a threshold, a rupture happens. But it is not the rupture causing the tectonic movement; it is the opposite.
Things like electricity, computers, the internet, smartphones, and AI are the earthquakes caused by the tectonic movement toward the dominance of the machine.
The goal of human progress was to make everything easier. Tools emerged to augment human abilities, both physical and mental, so that humans could free themselves from all the hard work of physical labor and thinking.
We go to the gym and play sports because the body needs some fake activity to fool it into believing that we still need all that muscle strength. We might get some gym and sports for the mind too, to give it some fake activity.
Remember, the goal is to preserve ourselves as physical beings while not really doing any hard work.
I know it might be an offensive way to put it, but I honestly believe if AI ends up making people no longer need to use their brains as much, it's a great thing.
Think about it: would we rather live in a world where heavy labor is a necessity to make a living, or a world where we go to the gym to maintain our physique?
If mental labor isn't (as) necessary and people just play Scrabble or build weird Rube Goldberg machines in Minecraft to keep their minds somewhat fit, is this future really that bleak?
It is quite bleak to me. Thinking has always been an important part of what makes us human, much more so than physical labor.
Craftsmanship and tool usage are physical activities that also define us as a species and you will find no shortage of people lamenting our loss of those skills, too. Both those and thinking are categorically different than water carrying, ditch digging, and other basic heavy labor.
> I know it might be an offensive way to put it, but I honestly believe if AI ends up making people no longer need to use their brains as much, it's a great thing.
I'd say it is a constant fight against laziness. Sure, it is convenient to drive everywhere with a car, but at some point you might understand that it makes more sense to walk somewhere once in a while or regularly. Sure, escalators are convenient, but better to take the stairs so you don't need to go to the gym, and save some money. If you ask me, we should all do more physical labour, and the same goes for mental labor. If we give that up as well, the future is really bleak, to answer your question.
The analogy here is probably physical exercise. Lack of exertion sounds great until your body falls apart and destroys itself without frequent exercise.
It is paramount to not ignore the state of the world. Poverty, wars, inequality in the distribution of resources, accelerated natural disasters, political instability… Those aren’t going to be solved by a machine thoughtlessly regurgitating words from random text sources.
Even if a world where people don’t use their brains were desirable (that’s a humongous if), the present is definitely not the time to start. If anything, we’re in dire need of the exact opposite: people using their brains to not be conned by all the bullshit being constantly streamed into our eyes and ears.
And in your world, what happens when a natural disaster which wasn’t predicted takes out the AI and no one knows how to fix it? Or when the AI is blatantly and dangerously wrong but no one questions it?
The study referenced shows a sudden and dramatic drop in brain activity of people using AI to write an essay when compared to people writing an essay themselves.
Writing teaches us to organize our thinking. Failing to learn to organize our thinking makes us dumb.
When questioned about the contents of the essay, those who used ChatGPT to assist were not able to answer any questions about the paper. As though it had never happened.
Imagine waking up one day and realizing that your senior year of college was over and you literally could not remember a single thing you learned that year, as though it never happened.
That’s the idiocratic shift we’re seeing. AI is literally causing us to turn off our brains. A whole generation will learn nothing in school.
Ask an educator how it’s going. I’ve heard from half a dozen. The consensus is “it’s a shit show”
It's always the people who have no clue what they're talking about, whether in physiology or philosophy, who insist on speaking so confidently on matters of physical activity and the meaning of life. No, your reductionist argument about the purpose of something like the gym should not be "remembered". People partake in labor and even seek hardship for many individual reasons, most of which do not involve "faking" anything at all. Keep inhaling the data-ist technobro Kool-Aid though; it's awesome to be always online.
Are we living in a golden age of stupidity? That depends on what we mean by "stupidity." If we mean:
- information overload, not matched with critical thinking
- short attention spans, driven by algorithmic content
- a decline in deep reading, writing, and manual creativity
…then yes. There’s a legitimate case that we’re in a period of widespread mental passivity rather than active curiosity. Of course this isn’t a new phenomenon. Every generation feels the next is losing touch with something essential. What is different today is the scale and speed of digital influence.
> Gerlich recently conducted a study, involving 666 people of various ages, and found those who used AI more frequently scored lower on critical thinking. (As he notes, to date his work only provides evidence for a correlation between the two: it’s possible that people with lower critical thinking abilities are more likely to trust AI, for example.)
Key point. The top use case for "Artificial Intelligence" is lack of natural intelligence.
If ChatGPT generated the text, participants weren't encoding it into memory through the cognitive processes normally involved in writing; they were essentially passive recipients of AI output. Isn't it a trivial finding then, that participants could barely recall a text they didn't write? Also, the study is small (54 participants), not peer-reviewed, and conflates two different issues: the cognitive effort during task completion versus memory retention afterwards.
Doesn't the decline of IQ rather correlate with smartphone ubiquity, particularly after 2010, and the steepest declines appear in 18-22 year-olds—the heaviest smartphone users? Multiple studies link smartphone addiction specifically to reduced cognitive abilities, not technology broadly.
I get your point, but I confess I have sometimes had to pause for some time to decide if I was holding a recyclable, a compostable or landfill — looking at the little pictures in fact, hoping I can find the thing I am holding.
Yeah, but otherwise, the whole MIT Media Lab thing is increasingly tasting a little bitter, not the glamorous, enviable place it seemed like in decades past.
Rather than looking for the next internet-connected wearable, for some reason, increasingly, I keep thinking about Bruce Dern's character in the film Silent Running.
It's much worse in South Korea; I think there were at least 5 different bins with different signs. Most things you bought had a label on them, and you could try to match the characters to what was on the bin. Except it wasn't perfectly matched up, and some packages were labeled with categories that didn't match the signs on any of our bins.
I eventually gave up and only ate to avoid having to deal with it.
This is one place where I think our friends the magic robots might actually be useful (though it's more CV than LLM stuff); people are really _amazingly_ bad at this, and will happily ignore a printed sign, and even quite low accuracy would probably be better than what happens now, which isn't that far off random.
Yes but…
As an example: in some cities, the signs specifying whether parking is allowed can be impossible to decipher. It sometimes feels like an AI would be needed to tell you “can I park this particular vehicle here right now, and for how long?”
Not that I’d trust an AI to get it right - but people already don’t.
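The rule-evaluation half of that question is the easy, mechanical part; a minimal sketch, where the parsed-rule format and every name are hypothetical (the genuinely hard part, reading the sign itself, is left out):

```python
from datetime import datetime

# Hypothetical parsed sign rules:
# (vehicle class, allowed weekdays (Mon=0), start hour, end hour, max minutes or None)
RULES = [
    ("car",   {0, 1, 2, 3, 4}, 8, 18, 120),   # weekdays 08:00-18:00, 2 h max
    ("car",   {5, 6},          0, 24, None),   # weekends, unlimited
    ("truck", set(),           0, 0,  0),      # never allowed
]

def can_park(vehicle, when):
    """Return (allowed, max_minutes) for a vehicle class at a given datetime."""
    for cls, days, start, end, limit in RULES:
        if cls == vehicle and when.weekday() in days and start <= when.hour < end:
            return True, limit
    return False, 0

print(can_park("car", datetime(2024, 6, 3, 9, 30)))  # Monday morning -> (True, 120)
```

The point of the sketch: once the rules are structured data, the "right now, this vehicle" answer is trivial. Getting from a weathered, stacked, ambiguous physical sign to that structured data is where both humans and AI fail.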
> The fundamental issue, Kosmyna says, is that as soon as a technology becomes available that makes our lives easier, we’re evolutionarily primed to use it.
> “Our brains love shortcuts, it’s in our nature. But your brain needs friction to learn. It needs to have a challenge.”
"Our brains needs friction to learn" is a good way of summarizing the fundamental problem.
Yes, shortcuts can be great, but they also obviously stop you from actually learning. The question then becomes: Is that a bad thing? Or is the net result positive?
My guess is that "it depends" on the tasks and the missed learning. But losing things like critical thinking, the ability to learn and concentrate could be catastrophic for society and bad for individuals. And maybe we're already seeing the problems this creates in society.
This sounds nice, but what I've run into is that the model fails to write changes if the code has changed under it. A better tool, where it takes a snapshot at the start of each non-interactive segment, and then resolves merge conflicts with my manual changes automatically, would make this much easier.
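The "snapshot, then reconcile" idea boils down to a three-way merge: the snapshot taken before the model ran is the base, and the model's output and your manual edits are the two sides. A toy per-line version (all names are my own illustration, and it assumes the three versions have the same line count; a real tool would use a proper diff3):

```python
class MergeConflict(Exception):
    """Raised when both sides changed the same line differently."""

def three_way_merge(snapshot, model_output, manual_edit):
    """Merge two sets of edits, using the pre-model snapshot as the common base."""
    if not (len(snapshot) == len(model_output) == len(manual_edit)):
        raise MergeConflict("line counts differ; fall back to a real diff3")
    merged = []
    for base, ours, theirs in zip(snapshot, manual_edit, model_output):
        if ours == theirs:
            merged.append(ours)       # both sides agree
        elif ours == base:
            merged.append(theirs)     # only the model changed this line
        elif theirs == base:
            merged.append(ours)       # only the human changed this line
        else:
            raise MergeConflict(f"both sides edited: {base!r}")
    return merged

merged = three_way_merge(
    snapshot=["a = 1", "b = 2", "c = 3"],
    model_output=["a = 1", "b = 20", "c = 3"],   # model tweaked b
    manual_edit=["a = 10", "b = 2", "c = 3"],    # human tweaked a
)
print(merged)  # ['a = 10', 'b = 20', 'c = 3'] -- both changes survive
```

Only genuinely overlapping edits surface as conflicts; everything else is resolved automatically, which is exactly the behavior the hypothetical better tool would need.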
The worst part is when you find out your vibe coded stuff didn't actually work properly in production and you introduced a bug while being lazy. It's really easy to do.
That is both a sweeping generalization and plainly wrong. The "much earlier" days of programming had blazing fast compilers, like Turbo Pascal. The "earlier" days had C compilers that were plenty fast. Only languages like C++ had this kind of problem.
Worst offenders like Rust are "today", not "earlier".
I used to work on a project that could take 30 mins+ to compile the entire project.
Nearly every time, your problems were detected _early_ in the process. Because build systems exist, they don't take 30 minutes on average. They focus on what's changed and you'll see problems instantly.
It's _WAY_ more efficient for human attentional flow than waiting for AI to reason about some change while I tap my fingers.
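The "focus on what's changed" behavior those build systems rely on is, at its core, a timestamp comparison; a toy make-style version (file names in the comment are made up):

```python
import os

def needs_rebuild(target, sources):
    """Rebuild only if the target is missing or any source is newer than it.

    E.g. needs_rebuild("main.o", ["main.c", "main.h"]) in a make-style loop.
    """
    if not os.path.exists(target):
        return True
    target_mtime = os.path.getmtime(target)
    return any(os.path.getmtime(src) > target_mtime for src in sources)
```

That single check is why incremental builds finish in seconds instead of the full 30 minutes: unchanged targets are skipped entirely, and only the edited file's chain gets recompiled.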
If you need an attention sink, try chess! Pick a time control if it's over 2 minutes of waiting, and do puzzles if it's under. I find that there's not much of a context switch when I get back to work.
I'm having the same problem. LLMs really take me out of the task mentally. It feels like studying as a kid. I need to really make a concerted effort; the task is no longer engaging on its own.
As someone with focus problems, I find it more productive to have a conversation with ChatGPT (or Claude) about code. And avoid letting it make major changes. And hand code a lot with Copilot.
The slowness of the AI's responses provides some opportunity to work on two tasks at once: investigate a bug, think through the implementation of something larger, or edit code that experience tells you would take just as much or more typing to have the LLM do.
It's less cool than having a future robot do it for you while you relax, but if you enjoy programming it brings some of the joy back.
They're not that slow! You want me to believe we've gone from programmers being so fragile that disturbing their 'flow state' will lose them hours of productivity, to programmers being the ultimate multitaskers who can think and code while their LLM takes 10 seconds to respond? /s
Until recently, every technological advancement replaced manual work, as in agriculture, transportation, and industry. Even the tiniest car amenity, like electric windows, hydraulic brakes, or touch-screen entertainment, aims to replace a limb movement. With AI, for the first time the tech directly offloads cognitive tasks, leading inevitably to mental atrophy. The hopeful scenario is to repurpose the brain for new activities rather than let it rot, just as replacing manual labor creates the opportunity for sports instead of getting fat.
> We might get some gym and sports for the mind too to give it some fake activity.
The Factorio devs are ahead of the curve on that front I guess.
> If mental labor isn't (as) necessary and people just play Scrabble or build weird Rube Goldberg machines in Minecraft to keep their minds somewhat fit, is this future really that bleak?
This _might, arguably_ be true, if the output was as good as their output would have been. Very limited evidence right now, but what there is is largely not promising, eg https://www.theregister.com/2025/09/04/m365_copilot_uk_gover...
If we end up with a society of people not thinking and producing nonsense, that seems fairly concerning.
> The top use case for "Artificial Intelligence" is lack of natural intelligence.
PS Cute choice of sample size.
Maybe both are correct, because most people are not using AI to generate their next SaaS passive-income whatever.
>top use case for "Artificial Intelligence" is lack of natural intelligence.
Also true if you think about a situation where there is just not enough natural intelligence to accomplish something within its scope.
Maybe there never was enough natural intelligence for something or other, or maybe not enough any more.
It could be a lot more acceptable to settle for artificial intelligence in those cases, more so than on average, especially if there is a dire need.
But first you have to admit the dire lack of natural intelligence :\
So all we need is a ban on every other programmer's employment of it.
I'll wait :)
A printed sign can do the same.
Try harder, A"I".
> Not that I’d trust an AI to get it right - but people already don’t.
In the UK, it works as designed... to maximise penalty earnings.
I'm left wondering whether I should have just hand-coded what I was doing, a bit slower, but kept my attention focused on the task.
I like to fire the model off to do exploratory implementations as I refine the existing work.
The earlier days of programming had more "blocking", since compilation was quite slow. So the issue obviously isn't that "blocking", but social media.
"What was I doing again!?" is a big problem
That’ll likely degenerate into “I want my AI to do dishes and laundry so I can code, not code so I can do my dishes and laundry”