Apple Wiki
Diagram of Apple's CSAM detection process.

NeuralHash is a perceptual hashing algorithm that was to be implemented by Apple on hardware running iOS 15, iPadOS 15, macOS Monterey, or watchOS 8. It was designed to detect the presence of Child Sexual Abuse Material (CSAM). On September 3, 2021, Apple stated that it would postpone the launch "over the coming months to collect input and make improvements".[1]

Functionality

NeuralHash collision test image: this image was used in a NeuralHash test to generate an artificial false positive.

The algorithm does not record image data, but generates a numerical hash from content that is uploaded to iCloud. The hashes are compared against a database of known CSAM hashes from the National Center for Missing and Exploited Children (NCMEC). Flagged matches are manually reviewed, and users are given an opportunity to appeal false positives. Apple does not store hashes that are not flagged, and previously estimated a one-in-one-trillion chance per year of incorrectly flagging a given account. Users confirmed to possess CSAM above a match threshold would have their iCloud accounts disabled and be reported to NCMEC and law enforcement.[2]
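
The matching and threshold logic described above can be sketched as follows. This is an illustrative sketch only: NeuralHash itself is a neural-network-based perceptual hash, and the hash values, database, and threshold below are hypothetical stand-ins rather than Apple's implementation.

```python
def count_matches(upload_hashes, known_hashes):
    """Count how many uploaded-image hashes appear in the known-CSAM database."""
    known = set(known_hashes)
    return sum(1 for h in upload_hashes if h in known)

def should_flag(upload_hashes, known_hashes, threshold=30):
    """Flag an account only once matches reach a threshold (value assumed here).

    Below the threshold, individual matches are not acted on, which is what
    keeps the per-account false-positive probability low.
    """
    return count_matches(upload_hashes, known_hashes) >= threshold
```

Note that in Apple's actual design the comparison runs under a private set intersection protocol, so the device never learns which of its images matched; the sketch above omits that cryptographic layer entirely.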

Researchers artificially manipulated sample image files from ImageNet to generate false-positive "collisions". Apple reportedly found three false-positive collisions in a test of 100 million images.[3]

History

Before Apple announced the feature, iOS 14.3 was released in December 2020 with an early version of the algorithm, which tolerated image resizing, compression, and watermarking, but not cropping or rotation.[3][4]

Reception

The feature was praised by lawmakers and child safety advocates, but privacy and cybersecurity experts criticized Apple's initiative.[5] Security experts claimed that a modified implementation of the algorithm could be used by oppressive governments to stifle political opposition.[6] In an interview with The Wall Street Journal, Apple senior VP Craig Federighi admitted that the messaging about the feature had been "jumbled pretty badly". He stated that the NCMEC hash database is stored on the device itself, and that the same CSAM detection would apply uniformly across all territories where the feature is made available. Federighi also said that outside auditors would be allowed to verify the process.[7]

References

External links
