One of the most gruesome forms of illegally disseminated media is material depicting the sexual abuse and exploitation of children. While the number of crimes related to child sexual abuse is on the rise, there is currently no consensus on how best to counter this issue. In the context of IT systems, all major tech companies have implemented automatic scanning and reporting for Child Sexual Abuse Material (CSAM). However, due to the increasing use of end-to-end encryption (E2EE), classical server-side scanning of content is becoming increasingly ineffective, creating a need for new detection approaches. As different stakeholders such as law enforcement, policymakers, privacy advocates, and child-safety organisations have differing or even opposing requirements for technical solutions, a trade-off is needed between the benefits of privacy and security provided by (E2E-)encryption in general and its negative implications for preventing and prosecuting CSAM. The currently most prominent candidate for such a balance is to move the scanning technology that companies already use in their services (on the server side) onto users' devices (client-side scanning, CSS). CSS allows scanning to happen before encryption takes place, triggering action only when harmful media is found. CSS is typically based on perceptual hashing; hence we call this approach Perceptual Hash-based Targeted Content Scanning (PHTCS). This paper applies the structured analytic technique Analysis of Competing Hypotheses (ACH) to assess the responsibility of the PHTCS proposal as a compromise between privacy and child safety. We aim to move the discussion forward; however, we do not consider our recommendation the 'right' answer or a final result in this debate. Rather, we present the discussion guided by the ACH methodology to enable a (more) unbiased debate that can support every interested party in reaching its own, documented conclusions about PHTCS.
Furthermore, ACH may be applied to controversial topics beyond client-side scanning against the distribution of illegal content, thus supporting the search for balanced, documented solutions.
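To make the matching principle behind PHTCS concrete, the following is a minimal, purely illustrative sketch of perceptual hashing using a simple "average hash" over toy grayscale data. Deployed systems (such as PhotoDNA) use far more robust, proprietary algorithms; the function names, threshold, and pixel values below are illustrative assumptions, not part of any real scanning system.

```python
# Illustrative average-hash sketch: one bit per pixel, set when the pixel
# is brighter than the image mean. Two images "match" when the Hamming
# distance between their hashes falls below a threshold.

def average_hash(pixels):
    """Compute a toy perceptual hash for a 2D grayscale pixel grid."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming_distance(h1, h2):
    """Number of differing bits; small values mean perceptually similar."""
    return sum(a != b for a, b in zip(h1, h2))

# Two synthetic 4x4 "images": the second is a uniformly brightened copy,
# mimicking a re-encoded or lightly edited version of the same picture.
original = [[10, 20, 200, 210],
            [15, 25, 205, 215],
            [12, 22, 202, 212],
            [18, 28, 208, 218]]
edited = [[p + 5 for p in row] for row in original]

h_orig = average_hash(original)
h_edit = average_hash(edited)

# A match is declared when the distance stays below a (hypothetical) threshold.
THRESHOLD = 3
is_match = hamming_distance(h_orig, h_edit) <= THRESHOLD
```

Because the brightness change shifts every pixel and the mean by the same amount, both hashes are identical here, which is exactly the robustness to small edits that distinguishes perceptual from cryptographic hashing.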