Readit News
keiferski · 2 years ago
I generally find LLMs to be useful, but I think these kinds of proclamations are a bit overblown. Many students do not want the kind of introspective, solo learning experience that a chatbot on a laptop provides. They want a social experience or a physically active experience, not one that requires them to sit quietly in a chair for 8 hours a day. And honestly, a classroom of students staring at screens talking to AI bots sounds like a dystopia to me.

All of the best educational experiences I've had in high school and college came from a passionate teacher who conveyed excitement to the students. Or (especially in college-level philosophy courses) they were small discussion groups that went on for hours. Neither of these things is going to be replicated by talking to a chatbot, and most of these applications are just further extensions of a (in my opinion) broken industrial educational model.

That aside, I do think LLMs could be useful as a supplementary tutor, especially when your teacher isn't up to par.

zozbot234 · 2 years ago
And yet, individual tutoring provides a two-sigma improvement in outcomes compared to a mass lecture. Besides, language generation can be just as helpful in group settings. It just becomes "let's see what the robot says and talk about it" instead of everyone talking to the robot on their own.
keiferski · 2 years ago
Huge assumption that talking to a chatbot = individual tutoring. People keep making this analogy and I don't buy it at all. Having a physical human being in front of you is a lot different than talking to your computer.
wesapien · 2 years ago
It's the class sizes. When I was growing up we would have close to 50 kids in class.
syntheweave · 2 years ago
Well, then just have the class communicate with the chatbot in study groups, taking turns to type, if "social" is what you're after. The way class time is currently used reflects the need to keep that grade level engaged with appropriate activities, and only as you get to the older ages does it converge on that kind of solo self-study. But it has to, eventually. You don't do research without figuring out how to do it. The flaws in leaning on LLMs are just another form of "precision vs accuracy" - GPT is always precise and not always accurate, just in the verbal domain instead of the numeric one. But we do have many tools like that in the numeric domain. The limitations can be learned.

If the LLM gives a solid three-star "fast food" education, that is actually considerably better than letting it all fall on the shoulders of the babysitters that currently serve in many classrooms.

niceice · 2 years ago
> not one that requires them to sit quietly in a chair for 8 hours a day

Sounds like the public school I went to... Sit in your assigned chair, don't talk, little physical activity. Prison for kids.

I'll take the AI tutor that can be tailored to each kid and in any environment.

jncfhnb · 2 years ago
> And honestly, a classroom of students staring at screens talking to AI bots sounds like a dystopia to me.

Would it be less dystopian if it were replaced with a hologram of a historical figure / scientist speaking as though an actual primary source on the matter but with the same LLM behind it?

keiferski · 2 years ago
Not really, because I don't think kids are going to find a piece of software to be a compelling teacher. Having a physical presence is important for physical beings like humans. My guess is, they'd just ignore or make fun of a hologram. The "appeal" of a historical figure doing the teaching is also not really compelling to anyone that isn't already interested in history.
sebastiennight · 2 years ago
Same-same. The problem with the ChatGPT-tutor future is not "having to read".

Also, I believe that having tens of thousands of teachers with their unions and quirks has its issues, but is a much more resilient system at the nation level than a centralized tutor system. (The centralized tutor is too easy to game, bias, get seeded with native ads, etc.)

jprete · 2 years ago
I get the idea, but the dystopian part is the students being continually isolated in their educational time with their only contact being an impersonal computer program (or something that fakes being personal, but is not actually free to be personal).

The computer dressing up in human form doesn't really change it.

colinrand · 2 years ago
I always get very skeptical with putting more technology in the classrooms (at least here in the US). The primary problem is funding and too many kids for a single teacher. Educational innovation comes in with a bang and out with a whimper when the study turns out to be flawed, often quite severely.

I'm sure LLMs can augment learning in some settings, esp in higher ed, but I hope adding more computer time for kids learning the basics (I mean K-8, mostly) is handled more carefully than things like Quizlet...

lainga · 2 years ago
One of my high school teachers had a quote she was fond of repeating:

"Education requires three things: a teacher, a student, and a stump. Unwilling to change the first, unable to change the second, educators spend all their money on the third."

visarga · 2 years ago
> I'm sure LLMs can augment learning in some settings, esp in higher ed

LLMs are great explainers of known theory. They can adapt to the level of the user and illustrate with concrete examples. They have infinite patience, there is no pressure.

eropple · 2 years ago
> LLMs are great explainers of known theory.

When they don't lie. When they don't make up citations that don't exist.

I think GPT-4 is moderately useful for software development, and I use it daily, but I have to check even very minor things, and having ~25 years of programming experience means I can do that intuitively. A kid doesn't have that.

obscurette · 2 years ago
I don't think it can. I have some experience in teaching, and one of the worst issues I initially had was that you can't believe how messed up the knowledge in kids' heads can be. They literally mix up everything – topics, words, culture, personal experience. Everything. To teach kids effectively you have to have a lot of background knowledge – from their family background to what the teacher in the previous class covered – and you have to adapt constantly.

People talking about automating education are grown-up people who think that everyone thinks structurally and just lacks the specific knowledge. Kids actually don't think that way.

huytersd · 2 years ago
Honestly, I think the primary problem in the US is parents completely offloading the responsibility of educating their kids onto the teacher. You need to have the appropriate push and environment at home for a successful education. You would never see that in South or East Asia.
colinrand · 2 years ago
This is a very economically driven situation in the US. Upper-middle-class families load their kids up non-stop with educational stuff outside of the schools, but the lower middle on down can't afford this (it's really expensive), and school often functions primarily as child care.
ktaube · 2 years ago
One nice thing with LLMs is that there are no "stupid questions". You can throw whatever is on your mind at it and get some insights.
Someone1234 · 2 years ago
You're right; but it should be noted that that is a design choice not an inherent property of the technology.

Currently, the design goal isn't to make LLMs feel "lifelike." It is likely that lifelike LLMs will be released in the future, which could result in sarcastic or dismissive replies to poor questions or missing information.

jncfhnb · 2 years ago
Backwards. LLMs naturally adopt the lifelike patterns of their training data and are at least easy to prompt into behaving that way.

We tune them to specifically not do that.

lallysingh · 2 years ago
As this technology becomes cheaper and more specialized, this is really exciting. Imagine being able to ask your textbook specific questions about what you're trying to understand.
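(The "ask your textbook" idea is usually implemented as retrieval plus an LLM: find the most relevant passage, then hand it to the model as context. Here is a minimal, illustrative sketch; the word-overlap scoring and the sample textbook text are placeholder assumptions – a real system would use embeddings and an actual LLM call, which is omitted here.)

```python
def best_passage(textbook: str, question: str) -> str:
    """Return the paragraph sharing the most words with the question.

    Naive word-overlap retrieval; a production system would use
    embedding similarity instead.
    """
    q_words = set(question.lower().split())
    paragraphs = [p for p in textbook.split("\n\n") if p.strip()]
    return max(paragraphs, key=lambda p: len(q_words & set(p.lower().split())))

def build_prompt(textbook: str, question: str) -> str:
    """Assemble the prompt an LLM would receive (the LLM call itself is omitted)."""
    context = best_passage(textbook, question)
    return f"Using only this passage:\n{context}\n\nAnswer the question: {question}"

# Placeholder "textbook" content for illustration.
textbook = (
    "Photosynthesis converts light energy into chemical energy in plants.\n\n"
    "Mitochondria are the site of cellular respiration in eukaryotes."
)

print(build_prompt(textbook, "Where does cellular respiration happen?"))
```

The retrieval step is what would make the answers cheap and specialized: only the relevant slice of the book goes to the model, rather than the whole text.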
papichulo2023 · 2 years ago
Youtube was for many years the best place to find educational material. Sadly, every year it gets harder to search through, with all those "you may also like", "people who searched this also searched this other irrelevant thing", "shorts".
warner25 · 2 years ago
I'm a PhD student, mostly off on my own doing dissertation research and writing at this point, and typically getting one hour per week one-on-one with my advisor to ask questions and get guidance. So I'm already in the habit of reading and experimenting as much as I can to figure things out for myself, leaving only the toughest questions for those meetings with my advisor.

Since he turned me on to using ChatGPT, however, I've found that I have fewer and fewer of those questions. ChatGPT seems to be able to answer most of them for me, instantly instead of waiting until next Monday afternoon, and probably better than my advisor could answer them off the top of his head.

With that said, most kids are not mature PhD students who have already "learned how to learn", know what they need to learn about, know when and how to verify things by consulting other sources, and have an intrinsic motivation to attain a deep understanding of things.

ImHereToVote · 2 years ago
I think the elephant in the room is that if LLMs improve at the same pace as diffusion models, then there won't be much to educate towards. What does a decade of agentic multimodal model progress look like?

https://rocketdash.io/midjourney-and-the-progress-of-ai-in-2...

danpalmer · 2 years ago
I disagree, I think very little of what is taught in school would be made unnecessary by LLMs.

We teach maths, but calculators didn't stop this. Even if LLMs can take a less structured question and call a calculator, it just means maths moves to a higher level, and to understand the goals students would still need a foundational understanding anyway.

We teach language for communication, and we'll still need to communicate. LLMs can write emails for us, but only if we can communicate what we want written. We need to be able to understand the output, decide what we want to change about it, and voice that back to the LLM. They're a time-saver, but they don't really remove the need to develop those skills.

We teach languages for communication with others. Translation apps haven't stopped this. Being able to speak to someone is still worth a lot. If this does get solved in the future it'll be by UX improvements not AI improvements.

We teach creative skills, but we know most students don't go into creative jobs. These skills are taught for pleasure, for cultural reasons, and to create well-rounded individuals.

We teach history, geography, and science, not so that students can regurgitate facts (even if that is what ultimately gets tested) but because we as a society believe it is important to understand and learn from what came before us and how the world around us works, and be able to ask the right questions about the world.

LLMs, if they continue to improve, might change how we assess students, and they might change the focus somewhat, just like the introduction of the ubiquitous home PC or the internet did, but I don't think they're going to eliminate most of what is taught in school.

Workaccount2 · 2 years ago
I think the fundamental question to be answered is "Will humans be needed to move society forward in the future?"

People stand behind "Sure, AIs can write emails, but they won't know what to write emails about". But to me that doesn't make sense. The holy grail of AI, what we are racing towards at a breakneck pace, what every company really wants, is the AI that knows what to write the email about...

OpenAI, Meta, Google, homebrew. None of them are trying to build a calculator. They are all trying to build a calculator user. It's a very big difference and their progress has been nothing short of remarkable and unexpectedly fast.

Telemakhos · 2 years ago
Early education especially is not something that AI can handle, because it isn't just about learning facts or processes. Take small motor skill acquisition, for example: you have kids play with play-dough, finger-paint, fold origami, play musical instruments, and learn various handwriting styles like block and cursive and calligraphic styles, all to teach kids how to move their hands in complex ways to achieve what their brains want. AI isn't going to do that for kids, and tablets seem to have made motor-skills acquisition worse since the range of movements needed for a tablet is much smaller than that needed for 3d manipulation of objects in the real world.

I don't think AI is going to excel in language education, either: I get bored chatting with a bot very quickly. Language education is all about keeping students engaged and communicating about tasks they actually care about. I don't see chatbots like ChatGPT doing that.

whywhywhywhy · 2 years ago
> We teach... we teach... we teach...

Reading this, I'm not sure about anyone else's experience, but very little written here reflects how any of those subjects were actually approached. There's just the assumption that because you have something called a "teacher" in a class called "x", the "teacher" actually taught "x". Because, yeah, from my experience a lot of it was just about regurgitating facts: creativity was never really taught and was often punished, actual speaking and actual communication were never really taught, only judged, and no way did "learn from what came before us and how the world around us works" ever occur within the walls of any classroom I was unfortunate enough to sit within.

Wouldn't ever risk my kids' future by throwing them into that system; honestly, I feel extremely lucky it didn't completely screw mine up.

jncfhnb · 2 years ago
Is it true that calculators did not reduce math education on tasks that the calculator automated? I find that dubious.
hashtag-til · 2 years ago
This really could be a plot for "Idiocracy 2", showing humanity hundreds of years in the future, where all education was given up in favour of LLMs and a bunch of undergrads/startups.
ozr · 2 years ago
I'm not sure how this connects. Diffusion models are great for certain tasks, not for others. They have sort of a niche use in audio and are the SOTA for image generation, but there is a lot of education (almost all) that doesn't apply to those domains.

When we're talking about multimodal ML systems, the core is generally a transformer-based LLM with no diffusion component.

IanCal · 2 years ago
I think the point is that diffusion models improved at a very fast rate; how do we think things will look if LLMs (in whatever form) see the same rate of improvement?
xnx · 2 years ago
Related, Sal Khan of Khan Academy, "How AI could save (not destroy) education" https://www.ted.com/talks/sal_khan_how_ai_could_save_not_des...