Hacker Newsnew | past | comments | ask | show | jobs | submitlogin

Goalpost movement alert. The claim was that "AI can be told not to output something". It cannot. It can be told to not output something sometimes, and that might stick, sometimes. This is true. Original statement is not.


If you insist on maximum pedantry, an AI can be told not to output something as this claim says nothing about how the AI responds to this command.


You are correct and you win. I concede. You outpedanted me. Upvoted




Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: