
It's only an attack vector in the minds of people who haven't given it more than 10 seconds of thought.

Apple knows the sync dates of all of the photos that are uploaded. So unless someone has hacked your account and has been directly trickle feeding CSAM for years (without you noticing) then it's going to look suspicious. A big dump of lots of CSAM at one particular timestamp is a pretty easy thing to spot.

And then in this case they aren't hacking the phone but the account which means Apple is going to notice a set of photos coming from an IP address they haven't seen used from that account before.



Do you think that Apple is going to decide whether a big dump of CSAM was uploaded by that user or a hacker and act differently based on that investigation, or just send it to LEO and let them sort it out?

Seems like there could be some legal ramifications from the choice to bypass law enforcement under certain circumstances


Depends on if they think the public will buy their claim of "we just let law enforcement sort it out." If they think the public will blame them for the false accusation, they are incentivized to avoid letting it happen.


> A big dump of lots of CSAM at one particular timestamp is a pretty easy thing to spot.

Only if that system / heuristic has actually been built. The same could have been said about Apple’s systems for identifying bulk account hijacks, but Apple evidently never built those, which I suppose is the point of this story.

And companies aren’t allowed to just inspect content once they identify CSAM. It is kryptonite for criminal liability. Companies are required to turn it over to the feds quickly and to try not to disturb metadata.

I suspect your line of thought would work given full ability to inspect (and some assumptions about what an IP change actually proves), but in practice Apple still hasn’t gotten the basics around account hijacks/fraud sorted out, so I’m hesitant to cheer them on as they try to quickly jump into the deep end screaming “think of the children!”.


Are they jumping into the dark or tossing their users over the edge and listening for a splash? Or maybe a splat.


This comment assumes that Apple does a lot of heavy lifting to exonerate individuals who are found with CSAM beyond just reporting them to law enforcement.

Of course metadata could exonerate someone who is a victim in a case like this. The question is will it ever see the light of day?


The negative PR from a false accusation would be expensive, on top of the judgment itself. And you know Apple has deep enough pockets that someone will be looking for a big score.


"A lot of heavy lifting"

Also known as a 20 line script which checks the last modified date for a bunch of recently uploaded files and validates the IP address against the recently known list.
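That hypothetical script might look something like this sketch: flag an account when a burst of files shares one modification timestamp and arrives from an address the account has never used. All names, thresholds, and data shapes here are invented for illustration; nothing reflects Apple's actual systems or data.

```python
from collections import Counter

def looks_like_hijack(uploads, known_ips, burst_threshold=50):
    """uploads: list of (modified_timestamp, source_ip) tuples.

    Returns True when the uploads look like a bulk dump from an
    unfamiliar address rather than organic use of the account.
    """
    # A large cluster of files sharing one modification timestamp
    # suggests a scripted dump, not photos taken over time.
    ts_counts = Counter(ts for ts, _ in uploads)
    bulk_dump = any(n >= burst_threshold for n in ts_counts.values())

    # Any upload from an IP the account has never been seen on before.
    new_ip = any(ip not in known_ips for _, ip in uploads)

    return bulk_dump and new_ip

# Example: 60 files all stamped at the same second, from an unknown IP.
uploads = [(1700000000, "203.0.113.7")] * 60
print(looks_like_hijack(uploads, known_ips={"198.51.100.2"}))  # prints True
```

The real work, of course, isn't these twenty lines; it's having the upload metadata and per-account IP history plumbed somewhere this check can run.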


The code to extract metadata is easy. I’m talking more about whether or not there is a deliberate process in place to actually write the code, run the checks and provide all available metadata and context to law enforcement. Apple has not indicated that process exists, thus far.


Not giving it 10 seconds of thought seems common in most HN reactions to the whole CSAM thing.



