While I agree that we need a word for this type of behavior, hallucinate is the wrong choice IMO.

Hallucinations are already associated with a type of behavior, which is (roughly defined) "subjectively seeing/hearing things which aren't there". This is an input-level error, not the right umbrella term for the majority of errors happening with LLMs, many of which are at the output level.

I don't know what would be a better term, but we should distinguish between different semantic errors, such as:

- confabulating, i.e., recalling distorted or misinterpreted memories;

- lying, i.e., intentionally misrepresenting an event or memory;

- bullshitting, i.e., presenting a version without regard for the truth or provenance; etc.

I'm sure someone has already made a better taxonomy, and hallucination is OK for ordinary public discussion, but I'm not sure why these distinctions aren't made in supposedly more serious works.



I mean, I think you're right that confabulation is probably a more correct technical term, but we all use hallucinate now, so it doesn't really matter. It might have been useful to argue about it 4 or 5 years ago, but that ship has long since sailed. [1]

And I think we already distinguish between types of errors -- LLMs effectively don't lie, AFAIK, unless you're asking them to engage in role-play or something. They mostly either hallucinate/confabulate, in the sense of inventing knowledge they don't have, or they just make "mistakes", e.g. in arithmetic or in attempting to copy large amounts of code verbatim.

And when you're interested in mistakes, you're generally interested in a specific category of mistakes, like arithmetic, logic, or copying mistakes, and we refer to them as such -- arithmetic errors, logic errors, etc.

So I don't think hallucination is taking away from any kind of specificity. To the contrary, it is providing specificity, because we don't call arithmetic errors hallucinations. And we use the word hallucination precisely to distinguish it from these run-of-the-mill mistakes.

[1] https://trends.google.com/explore?q=hallucination&date=all&g...
