> If we're talking about genuine CSAM, that's very different and not even limited to undressing.
Why would you think I was talking about anything else?
Also, "subset" != "very different"
> What the former want is not illegal. So the fact that they are a minority is irrelevant. Minorities have rights too.
This is newsworthy because non-consensual undressing of images of a minor, even by an AI, already passes the requisite threshold, both in law and by broad social agreement.
> Why would you think I was talking about anything else?
Because this thread shows CSAM being conflated with other terms, e.g. plain "child pornography".
And even the source of the quote isn't helping: clicking its https://www.iwf.org.uk/ link "Find out why we use the term ‘child sexual abuse’ instead of ‘child pornography’" gives 403 - Forbidden: Access is denied.