>A good tip for system design is to assume the human operator goes off the rails at some point and does something absurd and nonsensical.

And your assertion is that when this happens, it's because the human operator is hallucinating?
No. It's that humans will sometimes do the same thing that we've decided to (mis-)label as hallucinating when an AI does it.