Starting later this year, Apple will be able to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, enabling Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies.
Apple says the method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, Apple said the system will perform on-device matching against a database of known CSAM image hashes provided by the NCMEC and other child safety organizations. Apple will transform this database into an unreadable set of hashes that is securely stored on users' devices.
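At its core, the on-device step described above is a membership check: hash the image, then test whether that hash appears in the locally stored set of known hashes. The sketch below illustrates only that idea; it is not Apple's implementation. Apple's actual system uses a perceptual hash (NeuralHash) and cryptographic blinding so the database is unreadable, whereas this example substitutes a plain SHA-256 digest and an ordinary Python set, and all names in it are hypothetical.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; here simply SHA-256 of the raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical on-device database of known image hashes. In the real design
# this would be a blinded, unreadable set derived from NCMEC-provided hashes.
known_hashes = {image_hash(b"example-known-image")}

def matches_known(image_bytes: bytes) -> bool:
    """On-device check: does this image's hash appear in the stored set?"""
    return image_hash(image_bytes) in known_hashes

print(matches_known(b"example-known-image"))  # True
print(matches_known(b"some-other-image"))     # False
```

Note that a real perceptual hash is designed so that visually similar images (resized, recompressed) produce the same hash, which a cryptographic digest like SHA-256 deliberately does not do; the example only demonstrates the lookup structure.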