Client-side scanning. What's at issue is not that Apple is seeking to detect CSAM, but how they are doing it. Privacy and security researchers argued that by creating a client-side scanning system (a way to examine content stored on a user's device, rather than in the cloud), Apple risked building a system that could be subverted for other purposes.
Among today's major technology companies, Apple stands out for its relatively laissez-faire approach to CSAM detection. In 2020, Apple made a mere 265 referrals to the National Center for Missing and Exploited Children.
The decision to scan for CSAM is one with profound privacy and security trade-offs, especially for a company like Apple that operates around the globe in both democratic and authoritarian states.
The software updates Apple had planned to put in place would have compared photos on any user device with iCloud Photos enabled to a hash database of known CSAM.
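The core mechanism described above is membership testing: hash each photo on the device and check whether the hash appears in a database of known material. The sketch below illustrates that idea only; Apple's actual design used a perceptual hash (NeuralHash) and a private set intersection protocol, neither of which is reproduced here. The `KNOWN_HASHES` values, function names, and the use of SHA-256 as a stand-in are all illustrative assumptions.

```python
import hashlib

# Hypothetical database of known hashes (placeholder values for illustration).
# Apple's real design used NeuralHash perceptual hashes plus private set
# intersection; a plain SHA-256 digest stands in here for simplicity.
KNOWN_HASHES = {
    "d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26",
}

def photo_hash(photo_bytes: bytes) -> str:
    """Digest a photo's raw bytes (stand-in for a perceptual hash)."""
    return hashlib.sha256(photo_bytes).hexdigest()

def matches_known_database(photo_bytes: bytes) -> bool:
    """Client-side check: does this photo's hash appear in the database?"""
    return photo_hash(photo_bytes) in KNOWN_HASHES

print(matches_known_database(b"example photo bytes"))  # unrecognized photo -> False
```

Note that a cryptographic hash like SHA-256 only matches byte-identical files; a perceptual hash is used in practice precisely so that resized or re-encoded copies of the same image still match, which is also what makes the trade-offs discussed above harder to reason about.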