Hacker News

Absolutely, LLMs are not clones of human brains in digital form. There are differences. I'm not denying that. But you don't have to simulate the human brain to be intelligent, any more than a plane needs feathers and flapping wings to fly.


No, but that doesn't mean there's no difference between being intelligent and giving a good impression of intelligence. Subject matter experts quickly determine that LLMs are always confident-sounding but often incorrect in a way that humans would not be (experts don't confidently state something they are uncertain about or don't know at all). I'm a believer in the strong AI hypothesis - machines/AI can and probably will be actually intelligent at some point - but LLMs are definitely not that.


There really is no difference. Either there is utility or there isn't. "Fake" intelligence that produces real results is a notion that just doesn't make any sense.

LLMs may be worse, but humans confidently state things they are uncertain about all the time, lol. Maybe not experts in general, but then that's still comparing LLMs to a small percentage of humanity.

At any rate, unlike many seem to think, the issue is not in fact a lack of ability to distinguish truth from fiction. It turns out that being able to distinguish the two and having the incentive to communicate that are two different things.

GPT-4 logit calibration pre-RLHF - https://imgur.com/a/3gYel9r

Just Ask for Calibration: Strategies for Eliciting Calibrated Confidence Scores from Language Models Fine-Tuned with Human Feedback - https://arxiv.org/abs/2305.14975

Teaching Models to Express Their Uncertainty in Words - https://arxiv.org/abs/2205.14334

Language Models (Mostly) Know What They Know - https://arxiv.org/abs/2207.05221

The Geometry of Truth: Emergent Linear Structure in Large Language Model Representations of True/False Datasets - https://arxiv.org/abs/2310.06824
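The papers above measure how well a model's stated confidence tracks its actual accuracy. A minimal sketch of the standard metric for this, expected calibration error (ECE), is below; the confidence/outcome numbers are made up for illustration, not taken from any of the linked results.

```python
# Minimal sketch of expected calibration error (ECE), the metric behind
# "calibration" claims in the linked papers. All numbers are hypothetical.

def expected_calibration_error(confidences, correct, n_bins=10):
    """Bin predictions by stated confidence; ECE is the weighted average
    gap between each bin's mean confidence and its empirical accuracy."""
    n = len(confidences)
    ece = 0.0
    for b in range(n_bins):
        lo, hi = b / n_bins, (b + 1) / n_bins
        idx = [i for i, c in enumerate(confidences) if lo < c <= hi]
        if not idx:
            continue
        avg_conf = sum(confidences[i] for i in idx) / len(idx)
        accuracy = sum(correct[i] for i in idx) / len(idx)
        ece += (len(idx) / n) * abs(avg_conf - accuracy)
    return ece

# A perfectly calibrated model saying "95%" would be right 95% of the time.
# Here it is right only 3 of 4 times at 95%, and 1 of 2 times at 55%.
confs = [0.95, 0.95, 0.95, 0.95, 0.55, 0.55]
hits  = [1, 1, 1, 0, 1, 0]  # hypothetical graded outcomes
print(round(expected_calibration_error(confs, hits), 3))  # → 0.15
```

Lower ECE means the model's stated probabilities are more trustworthy; the imgur plot above is the reliability-diagram view of the same idea.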


There really is a difference, and I just told you what it is. You're fitting the facts to suit your belief rather than vice versa.


"Hallucinations" aren't a difference, sorry, given how often people do the same thing.




