Posted by u/irdc 2 years ago
Ask HN: When LLMs make stuff up, call it 'confabulating', not 'hallucinating'
In humans, a hallucination is formally defined as a sensory experience that occurs without an external stimulus. LLMs have no sensors to experience the world with (other than their text input) and (probably) don't even have a subjective experience in the same way humans do.

A more suitable term would be confabulation, which is what humans do when, due to a memory error (e.g. in Korsakoff syndrome), we produce distorted memories of ourselves. These can sometimes sound very believable to outsiders; the comparison with LLMs making stuff up is rather apt!

So please call it confabulating instead of hallucinating when LLMs make stuff up.

BaculumMeumEst · 2 years ago
Similar suggestions have been made over and over, and they've never stuck. "Hallucinating" has already hit mainstream media. It's probably going to stay that way out of sheer inertia, because nobody outside of a small group of nerds on Hacker News is going to sit down and think about whether "hallucinating" or "confabulating" more accurately describes the nature of the error. The existing term already captures the general idea well enough.
m0rissette · 2 years ago
And here I thought us nerds ruled the world.
quantified · 2 years ago
The millennium parties were all on New Year's Eve of 2000, not New Year's Eve of 2001. Some nerds had to be worrying about Y2K date/time bugs instead of partying, too.
faichai · 2 years ago
Hard disagree. I'm browsing the internet, not surfing the Information Super Highway. Mainstream media needs to rely less on allegory once a technology becomes mainstream.
krapp · 2 years ago
LLMs aren't mainstream yet. When they are, it will simply be common cultural knowledge that they can (lie, hallucinate, confabulate, whatever) and metaphors won't be necessary, the way the internet became mainstream once it was common knowledge not to "feed trolls" or "click on spam."

At the moment, most people still think LLMs are basically like the computers from Star Trek: rational, sentient, and always correct. Like those lawyers who used ChatGPT to generate legal arguments - it didn't even occur to them that AI could fabricate data; they assumed it was just a search engine you could talk to like a person.

This is why we still need metaphors to spread cultural knowledge. To that end I think it's less important to be technically accurate than to impart an idea clearly. "Hallucinate" and "confabulate" get the same point across, but the former is more widely understood.

Even "confabulate" isn't great, since it carries the connotation of either deception or senility/mental illness. But the "confabulation" of LLMs is inherent to their design. They aren't intended to discern truth, or accuracy, but statistical similarity to a stream of language tokens.

And humans don't really do that, so we have no language fit to describe how LLMs operate without resorting to anthropomorphism and metaphors borrowed from human behavior.

dtech · 2 years ago
But hallucinating is not an allegory... just like "browsing", which we're still using.
chriswarbo · 2 years ago
> I'm browsing the internet, not surfing the Information Super Highway.

Hard disagree. You're browsing the Web, not exploring the Internet.

unklefolk · 2 years ago
When educating the general public about the risks and limitations of LLMs, I think "hallucinating" is a useful term - it is something people can understand, and it conveys the idea of LLMs being somewhat random and unreliable in their responses. I'm not sure "confabulating" is so easily understood or accessible.
snordgren · 2 years ago
Hallucinating also gets the point across that the LLM will sometimes be 100% sensible and 100% confident in its claims, while being 100% wrong.
andyjohnson0 · 2 years ago
I agree that confabulate is more accurate and descriptive. But in general usage "hallucinate" is a commonly understood word and "confabulate" is not. Using the latter runs the risk of sounding off-puttingly technical or obscure.
gwd · 2 years ago
This is exactly the problem: I'd heard the word "confabulate" before, but it has a weird 19th-century feel to it, and I didn't know that it was specifically about memory reproduction errors.

I think that humans actually confabulate in minor ways a lot more than we realize; just ask your parents to compare stories with their siblings about something that happened to them growing up -- in my experience you often get completely different versions of events. Given that, this might actually be a good opportunity (as the sibling comment suggests) to introduce the word into everyday language.

mrob · 2 years ago
The standard meaning of "confabulate" is just "chat"; the "memory reproduction error" sense is psychology jargon. For this reason, I support continued use of "hallucinate" for LLM errors.
Kye · 2 years ago
My mind immediately goes to the episode of Star Trek: TNG with Mark Twain when I hear confabulate.
baq · 2 years ago
Good opportunity to change that!
lopatin · 2 years ago
It's too late. People already call it hallucination, and confabulating sounds weird. The idea that you think you can reverse the trend with an HN post is interesting, though.
baq · 2 years ago
Language changes all the time, nothing is set in stone, and HN is one of the better places where this change can start happening. Media copycats will follow.
danielbln · 2 years ago
The question is, why? Yeah, confabulation might capture the phenomenon better, but no one knows the word; it is not commonly used, compared to hallucination. And that holds across languages:

    Russian:
        Hallucination: Галлюцинация (Gallyutsinatsiya)
        Confabulation: Конфабуляция (Konfabulyatsiya)

    German:
        Hallucination: Halluzination
        Confabulation: Konfabulation

    French:
        Hallucination: Hallucination
        Confabulation: Confabulation

    Spanish:
        Hallucination: Alucinación
        Confabulation: Confabulación

    Italian:
        Hallucination: Allucinazione
        Confabulation: Confabulazione

    Portuguese:
        Hallucination: Alucinação
        Confabulation: Confabulação

If hallucination isn't a perfect fit but everyone and their dog knows roughly what it indicates in relation to LLMs, then that's still better than forcing a word no one ever uses to describe something slightly different but overall rather similar.

alpaca128 · 2 years ago
> Language changes all the time

...towards less complexity and redundancy. Scientifically accurate terms are called that precisely because they're usually not part of everyday language. Hallucination is a much more common word and thus more self-explanatory to most people. For a random person, "AI confabulation" sounds about as intuitive as "flux capacitor".

cr3ative · 2 years ago
I don't see this as a more suitable term; I had to check the definition of "confabulate" and the first hit in the macOS dictionary is "formal; engage in conversation" and the second is "psychiatry; fabricate imaginary experiences".

The fact that I had to look it up to make sure, and that it isn't the primary definition, makes this a bad alternative to a well-understood word which has already become established.

gvx · 2 years ago
In terms of communicating with laypeople, I don't think this is a hill worth dying on. "Confabulating" is not a term many people outside the field of psychology are familiar with in the first place.

If you do want to use another term for laypeople, I think "bullshitting" or "BSing" would have connotations that are more relevant than "hallucinating".

tsmarsh · 2 years ago
I find the easiest way of explaining LLMs to laypeople is "Bulls*t Engine". If tuned well, they're going to answer like a salesperson or internet troll: if they don't know, they will BS rather than not answer. It's not hallucinating or confabulation; it's BS.

That's not to say they're not useful. The ability to BS is well regarded among humans, as long as you, as a consumer, have a decent BS detector. And, like a good BS artist, they can be really useful if you stay within their area of expertise. It's when you ask them something that they should know, or almost know, that they start to be full of s**.

jfengel · 2 years ago
To me, "confabulating" implies deliberately attempting to mislead. I like "bullshit" better -- bullshitting might be lying, but it can also mean simply trying your best in the attempt to please. To the degree that an LLM "wants" something, it's to give you an answer that makes you happy, even if it doesn't know the truth.

The fact that BS is also used for deliberate lying is a strike against that word. Using "bullshit" and "hallucinate" in conjunction somewhat paints a picture of the quality of the answer you get and the "motivation" used to get there.

madeofpalk · 2 years ago
The problem with describing factual errors as "hallucinating" is that they're no more or less hallucinations than correct output. The entire point of these LLMs is that they synthesise content not in the original input. It's always "hallucinating"; it's just that sometimes it gets it right and sometimes it gets it wrong.
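
A minimal sketch of that point, with made-up logits rather than numbers from any real model: the sampling step that produces a "right" answer is the very same step that produces a "wrong" one.

    import math
    import random

    # Hypothetical next-token scores after some prompt; a real model
    # assigns scores to its entire (~100k-token) vocabulary.
    logits = {"Canberra": 2.3, "Sydney": 2.1, "Melbourne": 0.4}

    # Softmax turns the scores into a probability distribution.
    total = sum(math.exp(v) for v in logits.values())
    probs = {tok: math.exp(v) / total for tok, v in logits.items()}

    # Generation is a single draw from that distribution. The mechanism
    # is identical whether the sampled token happens to be true or false;
    # nothing here marks "Sydney" as wrong or "Canberra" as right.
    token = random.choices(list(probs), weights=list(probs.values()))[0]
    print(token, probs)

Truth never enters the computation; correctness is something the reader judges after the fact.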