I can't agree with the photorealistic AI images because they're indistinguishable from an actual photograph.
Everything else I do agree with you on, though.
The problem is, prosecutors are just looking for easier ways to jail people for things they might do, based on what the prosecutors personally believe. (E.g. "Manga causes child abuse")
The United States already considers artwork that resembles a real minor to be outside the First Amendment and hence illegal. Even like, cartoon artwork. If you're fapping to naked Bart Simpson, that's one thing, but if it's a drawing of a real child, you are using that child's image as a sexual object. That can be profoundly traumatizing, and it is seen to cross the threshold of "actually abusing a child" that justifies not applying the First Amendment. People's likenesses in general are subject to strong protection in the United States, and you can face serious penalties for misusing them even if porn is not involved; consider White v. Samsung.
You show a jury a picture of the minor and a reproduction of the image in question. It may not matter much whether the creator intended the image to resemble that actual minor; only that they intended to create a sexually explicit image of a minor, and it happens to resemble a real one. The marginal cases won't be prosecuted.
But some (a lot of) idiots will do the prosecution's hard work for them. They'll share torrents of "&lt;insert underage celebrity&gt; Nude.mpv" that have been AI-generated, or post an erotic drawing of their favorite streamer's young daughter to a chan board, labelling it as such. The law lets the prosecution nail those cases to the wall.
I do find it quite interesting how people support this idea (because a warrant was obtained) yet are vehemently against the idea of backdooring encryption.