
I feel this is a good place to add something...

I recall, about half a decade back, discussion of the quit rate among employees (at Facebook, I believe) due to literal mental trauma from having to look at and validate images flagged as child abuse.

Understand that there is pedophilia, and then there's horribly violent, next-level abusive pedophilia.

I used to work in a department where, adjacently, the RCMP were doing the same kind of review work. They couldn't handle it, and were constantly resigning. The violence associated with some of the videos and images is what really got to them.

The worst part is, the more empathetic you are, the more it hurts to work in this area.

It seems to me that unless this sad and damaging problem is fixed, monitoring chats won't help much.

How many good people will we burden with trauma, literally waking up screaming at night? It's why the RCMP officers were resigning.

I can't imagine being a jury member at such a case.



Because of this issue, many departments put in much stricter protocols for dealing with this kind of material. Only certain people would be exposed to it in order to classify and tag it, and those people would only hold that post for a limited period of time. The burden on them doesn't change, but it can be diluted to mitigate it somewhat.

It's a real and sad problem, but not one that I think can be fixed with technology. Too much is on the line to allow a false positive from a hallucinating robot to destroy a person's life.


I read about that here: https://erinkissane.com/meta-in-myanmar-part-i-the-setup

This remains one of the best things I've found on HN.



