
I don’t keep child porn on my phone (or anywhere else). I also don’t keep smallpox in my freezer or nuclear weapons in my basement. Noninvasive scans for these things probably make the world a better place in some ways.

The part I object to is having my life disrupted by having my personal accounts unilaterally deleted by a fucking buggy bot. Our phones are too important to us to just have them shut off without notice. That’s bullshit.



>Noninvasive scans

Scans on a local device feel quite invasive to me. Phones have become an extension of our person, let's not kid ourselves.


Is a picture of a naked kid going to trigger this algorithm?

In a few places taking pictures of your child naked while swimming is considered child pornography. Other places having children run around naked on the beach is the norm. (I have a few pictures of me, at 3y/o, naked on the beach)

In the world of remote medicine, can a parent take pictures of their naked child to send to a doctor?

How are they going to map these cultural specifics onto a single worldwide policy?

Knowing how Facebook dealt with this, they are going to apply the strictest rules everywhere, so no naked child photos will be allowed on your iPhone anymore. For no good reason.
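For context on the "will my own photos trigger it" question: as Apple publicly described it, the system doesn't classify photo content at all; it compares a perceptual hash of each photo against a fixed database of hashes of already-known images. A minimal sketch of that idea, with made-up 64-bit hashes and an assumed distance threshold (Apple's actual NeuralHash and matching protocol are far more involved):

```python
# Simplified illustration of known-hash matching. The hash values and the
# threshold below are invented for the example; only the overall idea
# (match against known-image hashes, not classify new content) is Apple's.

KNOWN_HASHES = {0xDEADBEEFCAFEF00D, 0x0123456789ABCDEF}  # stand-in blocklist
MATCH_THRESHOLD = 4  # max Hamming distance still counted as a match (assumed)

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

def matches_known(photo_hash: int) -> bool:
    """True only if the photo's hash is near some known-image hash."""
    return any(hamming(photo_hash, h) <= MATCH_THRESHOLD for h in KNOWN_HASHES)

# A novel photo (e.g. your own kid at the beach) has an unrelated hash:
print(matches_known(0x1111111111111111))  # → False
# A near-duplicate of a known image differs in only a bit or two:
print(matches_known(0xDEADBEEFCAFEF00F))  # → True
```

So in this scheme a brand-new photo shouldn't match, but perceptual hashes are deliberately fuzzy, which is exactly where the false-positive worry in this thread comes from.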


Apple has said multiple times that accounts are not shut off with any automated system, that portion of the process is handled by humans.


Who do I call (and how do I call them) if the person agrees with the automated false positive and disables my account/phone and reports me to the police? Just being accused of having CSAM is ruinous. What's my recourse if there's an innocent bug in their system that reports me to the police?

I was at Apple for more than a decade and a half. I saw an untold number of Michael Bolton bugs [0]. It's one thing when a bug causes a dropped frame in a video or a menu takes a tenth of a second too long to appear. It's another when it ruins your life and bankrupts you defending yourself.

[0] https://www.youtube.com/watch?v=NnPBSy5FsOc&t=02m24s


Good! Because I trust the humans even less. And once you're banned, no one answers on any of the communication channels, Google style.


So... they will have humans looking at pictures to determine whether they're child pornography? Well... that's a dream job for a pedo.



