
> Why would you think I was talking about anything else?

Because this thread shows CSAM being confused with other material, e.g. simple child pornography.

And even the source of the quote doesn't help: clicking the https://www.iwf.org.uk/ link "Find out why we use the term ‘child sexual abuse’ instead of ‘child pornography’." gives "403 - Forbidden: Access is denied."

Fortunately a good explanation of the difference can be found here: https://www.thorn.org/blog/ai-generated-child-sexual-abuse-t...

> This is newsworthy because non-consensual undressing of images of a minor, even by an AI

That's not the usage in question. The usage at issue is "generate realistic pictures of undressed minors". Undressing images of real people is prohibited.




