
They become the “real words” later. This is the way all trust & safety works. It’s an evolution over time. Adding some friction does improve things, but some people will always try to get around the filters. Doesn’t mean it’s simply performative or one shouldn’t try.


Why do you think it's an improvement for an AI to pretend that things like suicide don't happen (and that nothing is happening in Palestine)?



