This will be very unpopular with the HN crowd, but please bear with me.
I really like e2e-encrypted communication. I also think that in the past decades, in general, most policies have erred too far toward surveillance and too little toward privacy.
BUT
I also think that CSAM on e2ee channels is a real problem.
If we, as the pro-privacy tech community, don't come up with solutions, we'll eventually lose the battle.
Eventually there will be a case of child exploitation or trafficking, somehow related to encrypted chats, so horrific that the public will be swayed to action. And then what are the options? Is there any option besides client-side scanning or the end of e2ee?
Maybe you should stop thinking of this as a technical problem to be solved. It is a problem with society that these people exist at all.
You could just as well ban sharp knives so that no one can be stabbed anymore. Afterward you would still live in a world where people get killed, and we would all be cutting our bread with butter knives.
Hungary is next in line for the presidency and wants Chat Control pretty badly. Who says these scanners can only be used to look for CSAM? I'm pretty sure the Hungarian government is also interested in people sending pride flags. Let's see how that works out…
CSAM existed long before the internet, and it will continue to exist no matter what laws or "technical solutions" you come up with. E2EE is simply another way sick people try to avoid detection.
Just as most people on this site would be outraged if the police or other government authorities decided they had the right to search your home or person at any time without cause, so should they be outraged at the idea that governments should have the right to go through any of your communications whenever they want.
If you want to solve the problem, the focus should be on figuring out why people make and share CSAM, how to get them the treatment they need, and how to ensure they're not put in a position where they have the opportunity to produce this type of material.
What happened to "the law holds that it is better that 10 guilty persons escape, than that 1 innocent suffer"? For a small minority of criminals, should literally everyone suffer? When it comes to private information at the hands of power, the suffering occurs as "with two lines of a man's handwriting, an accusation could be made against the most innocent, because the business can be interpreted in such a way, that one can easily find what one wishes."[0]
My main concern is that the actual criminals will always find a way around those measures. They're more motivated than anyone to gain the technical expertise to do so. And I use the term expertise loosely here.
For example, from what I've read, this proposal was about scanning images and URLs, not text (for now). But it's not hard to split a URL into several pieces so that the regex no longer recognises the set of strings as a URL; it'll just be plaintext, which won't get scanned. And it's also not hard to send pictures as base64-encoded text, which again would fall outside the scanning scope.
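To illustrate the point, here is a minimal sketch of how trivially such scanning is defeated. The regex below is a hypothetical stand-in for whatever pattern a scanner might use; real scanners are more sophisticated, but the same idea applies to any matcher that expects an intact URL or raw image bytes.

```python
import base64
import re

# Hypothetical URL matcher, standing in for a scanner's detection pattern.
URL_RE = re.compile(r"https?://\S+")

intact = "look at https://example.com/page"
# Same URL, split across tokens so the pieces no longer form a match.
split_up = "look at https :// example\n.com/page"
# Same URL as an opaque base64 blob: just alphanumeric "plaintext".
encoded = base64.b64encode(b"https://example.com/page").decode()

print(bool(URL_RE.search(intact)))    # detected
print(bool(URL_RE.search(split_up)))  # slips past the regex
print(bool(URL_RE.search(encoded)))   # slips past the regex
```

The recipient only needs to rejoin the pieces or run the blob through a base64 decoder, which is exactly the kind of low-effort workaround a motivated criminal would use.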
In my opinion, as with many of these measures, you end up hurting the innocent, the criminals will be fine with (semi-)technical workarounds, and we end up on that way-too-slippery slope of mass-deployed surveillance for nothing.
Technical measures can and will be circumvented, always, by those motivated to do so. And if you're a CSAM criminal and your prospect is going to prison branded a child molester, I imagine your motivation is as high as it gets.
I don't think there's a technical solution to this.
More and more child porn is generated synthetically, without any real child suffering. It is much safer for the producers to use AI instead.
Most countries in the world consider AI-generated child porn illegal, but some don't, and the difference is worth rethinking. Once there is no human victim whose rights were violated, a major reason for banning such images disappears, and only vague reasons remain: either that it could stimulate consumers to act in reality (which is a wobbly theory), or that it is disgusting.