
The thing I always tell people who heavily trust its output is to ask it about something they already know the answer to, or a topic they're something of an expert in; the flaws become much more evident.

The old joke is that you can get away with anything with a hi-vis vest and enough confidence, and LLMs pretty much work on that principle.



The heavy overconfidence of any LLM is what confuses a lot of people.



