Neither are most AI-related terms. People talk about learning, intelligence, and thinking. If it had remained a technical term and everyone understood "not really learning...", it would be fine, but it gets out into the wild, and man, it's not learning; it's doing word math, stringing words and bits of words together.
I enjoy using it for programming; I read what it gives me and I test it, but a lot of the AI terms are wonky. AI clearly doesn't "know" what I'm saying, or even what it spits back at me.
I suspect, like all things marketing and hype, that once the hype train leaves the station with a bunch of terms, nobody wants to be the guy who says "uh, no, we're not changing the world; we're doing something useful within the limits of the technology".
The question is how they get a reasonable return on half a trillion dollars of economic activity when the main products are... well, what they are. If the real cost of running this ship is actually passed on rather than heavily subsidized, who will bother to use it?
It must suck when an intentionally vague marketing term that you make up gets used in ways that don't give you the marketing benefit you were shooting for.
- Super Intelligence: an AI that is as smart as the smartest person on Earth.
- AGI: an AI that is smarter than the entire population.
Do these definitions still stand? Based on the article and the release of GPT-5, I don't think so.