This is the key part. I'm doing a part-time graduate degree at a major university right now, and it's fascinating to watch the week-to-week pressure AI is putting on the education establishment. When your job as a student is to read case studies and think about them, but Google Drive says "here's an automatic summary of the key points" before you even open the file, it takes a very determined student to ignore that and actually read the material. And if no one reads the original material, the class discussion is a complete waste of time, with everyone bringing up the same trite points, and the whole exercise becomes a facade.
Schools are struggling to figure out how to let students use AI tools to be more productive while still learning how to think. Students (especially undergrads) are incredibly good at doing as little work as possible. And until you get to the end-of-PhD level, there's basically nothing you encounter in your learning journey that ChatGPT can't summarize and analyze in a second, removing any requirement for you to do the work yourself.
This isn't even about AI being "good" or "bad". We still teach children how to add numbers before we give them calculators because it's a useful skill. But now these AI thinking-calculators are injecting themselves into every text box and screen, making them impossible to avoid. If the answer pops up in the sidebar before you even ask the question, what kind of masochist is going to bother learning how to read and think?
In my first year of college, my calculus teacher said something that stuck with me: "you learn calculus by getting cramps in your wrists." Yes, AI can help you remember things and accelerate learning, but if you don't put in the work to understand things, you'll always be behind people who have at least a bird's-eye view of what's happening.
Depends. You might end up going quite far without ever opening the hood of a car, even if you drive it every day and depend on it for your livelihood.
If you're the kind who likes to argue for a good laugh, you might say "well, I don't need to know how my car works as long as the engineer who designed it does, or the mechanic who fixes it does" - and that's accurate, but it's also true that not everyone ended up being the engineer or the mechanic. And it's not the case that, if it turned out to be extremely valuable for you to learn how the car worked, you wouldn't put in the effort and succeed at it.
All this talk about "you should learn something deeply so you can bank on it when you need it" seems a bit like a hoarding disorder.
Given the right materials, support, and direction, most smart and motivated people can learn to become competent at something they previously had no clue about.
When it comes to smart and motivated people, the best drop out of education because they find it unproductive and pedantic.
My argument is that when you have at least a basic knowledge of how things work (be it as a musician, a mechanical engineer, or a scientist), you are in a much better place to know what you want and need.
That said, smart and motivated people thrive when they are given the conditions to thrive. I also believe that physical interfaces have far less friction than digital ones: turning a knob is much less work than clicking through a bunch of menus to set up a slider.
If I were to summarize what I think about AI, it would be something like: "Let it help you. Do not let it think for you."
My issue is not with people using AI as a tool, but with people delegating to AI anything that would demand any kind of effort.