
An LLM is a tool. If the tool is not supposed to do something yet does it anyway, then the tool is broken. That is radically different from, say, a soldier refusing an illegal order, because a soldier, being human, possesses free will and agency.

