NeuralHash

NeuralHash is a perceptual hashing algorithm that was to be implemented by Apple on hardware running iOS 15, iPadOS 15, macOS Monterey, or watchOS 8. It is designed to detect the presence of Child Sexual Abuse Material (CSAM). On September 3, 2021, Apple stated that it would postpone the launch "over the coming months to collect input and make improvements".

Functionality
The algorithm does not record image data, but generates a hash from content that is uploaded to iCloud. The hashes are compared against a database of known CSAM hashes from the National Center for Missing & Exploited Children (NCMEC). Flagged matches are manually reviewed, and the user is provided an opportunity to appeal. Apple does not store hashes that are not flagged, and previously estimated a one-in-one-trillion chance per year of incorrectly flagging an account. Users confirmed to be in possession of a threshold amount of CSAM would have their iCloud account disabled and then be reported to NCMEC and law enforcement.
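Apple has not published NeuralHash's internals, but the general hash-and-compare scheme can be illustrated with a generic "average hash" and Hamming-distance matching. This is a minimal sketch only, assuming a simple perceptual hash; the function names and the distance threshold are hypothetical, and this is not Apple's actual neural-network-based algorithm.

```python
# Illustrative sketch of perceptual hashing and database matching.
# NOT Apple's NeuralHash; names and threshold are hypothetical.

def average_hash(pixels):
    """Hash a grayscale image (2D list of 0-255 ints) to a bit list.

    Each bit records whether a pixel is brighter than the image mean,
    so mild brightness or compression changes leave most bits intact.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Number of differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

def matches(img_hash, database, max_distance=4):
    """True if the hash is near any known hash in the database."""
    return any(hamming(img_hash, known) <= max_distance for known in database)

# A 4x4 toy "image" and a mildly brightened copy of it.
img = [[10, 200, 30, 220],
       [240, 20, 210, 40],
       [15, 190, 25, 230],
       [250, 35, 205, 45]]
noisy = [[p + 5 for p in row] for row in img]

h1, h2 = average_hash(img), average_hash(noisy)
print(hamming(h1, h2))    # 0: the mild shift flips no bits
print(matches(h2, [h1]))  # True under the toy threshold
```

Near-match tolerance is the point of a perceptual hash: unlike a cryptographic hash, visually similar images are meant to produce similar hashes.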

Sample image files have been artificially manipulated to generate false positive "collisions". Apple reportedly found 3 false positive collisions in a test of 100 million images.

History
Before Apple announced the feature's implementation, iOS 14.3 was released in December 2020 with an early version of the algorithm, which was tolerant to image resizing, compression and watermarks, but not cropping and rotation.
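The robustness described above can be demonstrated with the same kind of generic average hash (again, a hypothetical stand-in, not the extracted NeuralHash model): nearest-neighbor resizing preserves the hash of a simple image, while a 90-degree rotation scrambles it.

```python
# Illustrative sketch: why a perceptual hash can tolerate resizing but
# not rotation. Toy average hash on 2D lists; NOT Apple's NeuralHash.

def average_hash(pixels):
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def resize_nearest(pixels, size):
    """Nearest-neighbor resize of a square image to size x size."""
    n = len(pixels)
    return [[pixels[i * n // size][j * n // size] for j in range(size)]
            for i in range(size)]

def rotate90(pixels):
    """Rotate a square image 90 degrees clockwise."""
    n = len(pixels)
    return [[pixels[n - 1 - j][i] for j in range(n)] for i in range(n)]

# 8x8 toy image: dark left half, bright right half.
img = [[0] * 4 + [255] * 4 for _ in range(8)]

# Hashes are always computed on a canonical 4x4 grid.
base = average_hash(resize_nearest(img, 4))
shrunk = average_hash(resize_nearest(resize_nearest(img, 6), 4))
rotated = average_hash(resize_nearest(rotate90(img), 4))

print(hamming(base, shrunk))   # 0: resizing preserved the hash
print(hamming(base, rotated))  # large: rotation rearranged the bits
```

Rotation and cropping move content to different grid positions, so the bit pattern changes even though the image content is the same, which is consistent with the reported weakness of the early algorithm.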

Reception
The feature was praised by lawmakers and child safety advocates, but privacy experts criticized Apple's initiative. Security experts claimed that a modified implementation of the algorithm could be used by oppressive governments to stifle political opposition. In an interview with The Wall Street Journal, Apple senior VP Craig Federighi admitted that the messaging about the feature had been "jumbled pretty badly" and stated that the NCMEC hash database is stored on the device itself and that the same CSAM detection would apply uniformly across all territories where the feature is made available. Federighi said that outside auditors would be allowed to verify the process.