CSAM detection
Does this mean Apple is going to scan all the photos stored on my iPhone?
No. By design, this feature only applies to photos that the user chooses to upload to iCloud
Photos, and even then Apple only learns about accounts that are storing collections of known
CSAM images, and only the images that match to known CSAM. The system does not work for
users who have iCloud Photos disabled. This feature does not work on your private iPhone photo library on the device.
Will this download CSAM images to my iPhone to compare against my photos?
No. CSAM images are not stored on or sent to the device. Instead of actual images, Apple uses
unreadable hashes that are stored on device. These hashes are strings of numbers that represent known CSAM images, but it isn’t possible to read or convert those hashes into the CSAM
images they are based on. This set of image hashes is based on images acquired and validated
to be CSAM by at least two child safety organizations. Using new applications of cryptography,
Apple is able to use these hashes to learn only about iCloud Photos accounts that are storing
collections of photos that match to these known CSAM images, and is then only able to learn
about photos that are known CSAM, without learning about or seeing any other photos.
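To make the hash-matching idea concrete, here is a minimal Swift sketch of checking a photo against a set of known-image hashes. It is illustrative only: Apple’s system derives hashes with NeuralHash and compares them against a blinded hash set using private set intersection, so the device never learns the outcome of any comparison. The SHA-256 stand-in, the sample data, and the helper names below are hypothetical and exist only so the sketch runs.

import CryptoKit
import Foundation

// Opaque value standing in for a known-image hash. The real system derives a
// NeuralHash; a SHA-256 digest is used here only so the sketch compiles and runs.
typealias ImageHash = Data

func hashOfImage(_ imageData: Data) -> ImageHash {
    Data(SHA256.hash(data: imageData))
}

// Hypothetical on-device database: only hashes are present, never the images.
let knownHashes: Set<ImageHash> = [
    hashOfImage(Data("example-known-image-a".utf8)),
    hashOfImage(Data("example-known-image-b".utf8)),
]

// A photo headed for iCloud Photos is checked against the hash set; a hash
// reveals nothing about the content of non-matching photos.
func matchesKnownHash(_ photoData: Data) -> Bool {
    knownHashes.contains(hashOfImage(photoData))
}

print(matchesKnownHash(Data("example-known-image-a".utf8))) // true
print(matchesKnownHash(Data("my-vacation-photo".utf8)))     // false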
Why is Apple doing this now?
One of the significant challenges in this space is protecting children while also preserving the
privacy of users. With this new technology, Apple will learn about known CSAM photos being
stored in iCloud Photos where the account is storing a collection of known CSAM. Apple will not
learn anything about other data stored solely on device.
Existing techniques as implemented by other companies scan all user photos stored in the
cloud. This creates privacy risk for all users. CSAM detection in iCloud Photos provides significant privacy benefits over those techniques by preventing Apple from learning about photos
unless they both match to known CSAM images and are included in an iCloud Photos account
that includes a collection of known CSAM.
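The account-level rule described above can be sketched as a simple threshold check. This too is only an illustration: in the actual design the threshold is enforced cryptographically with threshold secret sharing, so nothing about individual matches is learnable below the threshold, and the value used here is a placeholder rather than Apple’s parameter.

// Placeholder threshold; not Apple's published parameter.
let matchThreshold = 30

// An account is surfaced for review only once its number of matched photos
// reaches the threshold; below it, no individual match is visible.
func accountExceedsThreshold(matchCount: Int, threshold: Int = matchThreshold) -> Bool {
    matchCount >= threshold
}

print(accountExceedsThreshold(matchCount: 3))   // false
print(accountExceedsThreshold(matchCount: 31))  // true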
How will CSAM detection in iCloud Photos handle photos of my kids in the bathtub, or other innocent images that involve child nudity?
CSAM detection for iCloud Photos is designed to find matches to known CSAM images. The
system uses image hashes that are based on images acquired and validated to be CSAM by at
least two child safety organizations. It is not designed for images that contain child nudity that
are not known CSAM images.