If we're talking about undressing, there is no aggregate. Some people want something; others want them not to have it. Simple.

What the former want is not illegal. So the fact that they are a minority is irrelevant. Minorities have rights too.

If we're talking about genuine CSAM, that's very different and not even limited to undressing.

> If we're talking about genuine CSAM, that's very different and not even limited to undressing.

Why would you think I was talking about anything else?

Also, "subset" != "very different"

> What the former want is not illegal. So the fact that they are a minority is irrelevant. Minorities have rights too.

This is newsworthy because non-consensual undressing of images of a minor, even by an AI, already crosses the requisite threshold both in law and in broad social agreement.

This is not a protected minority.


> Why would you think I was talking about anything else?

Because this thread shows CSAM being confused with other material, e.g. simple child pornography.

And even the source of the quote isn't helping: clicking its https://www.iwf.org.uk/ link, "Find out why we use the term ‘child sexual abuse’ instead of ‘child pornography’.", gives 403 - Forbidden: Access is denied.

Fortunately a good explanation of the difference can be found here: https://www.thorn.org/blog/ai-generated-child-sexual-abuse-t...

> This is newsworthy because non-consensual undressing of images of a minor, even by an AI

That's not the usage in question. The usage here is to "generate realistic pictures of undressed minors"; undressing images of real people is already prohibited.
