
What does schizophrenia have to do with it? You might be getting confused by the use of the term "hallucination" in the LLM space; it doesn't mean what it is conventionally used to mean.

Also, look at a definition of AGI from before LLMs, and they do a good job fitting the bill. They're not quite going to replace all humans everywhere forever, but that was never part of the definition. They match early definitions, especially multimodal models; there's no denying they match what people envisioned AGI to be. You can ask a frontier model in plain language to do a task, and it can break down the task, formulate a plan of attack, research if need be, synthesise knowledge, and apply the solution. If that is all "just word prediction", then so are we.

Just because we understand how it is built doesn't make it any less impressive. More importantly, we simply don't understand LLMs yet. How they're built, we know; why they work so stupidly well, we do not.

Importantly, we can't just change definitions to move the goalposts because we feel uncomfortable.

AGI definition:

https://www.gartner.com/en/information-technology/glossary/a...
