NeuralHash

NeuralHash is a perceptual hashing algorithm announced by Apple in 2021, to be implemented on hardware running iOS 15, iPadOS 15, macOS Monterey, or watchOS 8. It is designed to detect the presence of Child Sexual Abuse Material (CSAM).

Functionality
The algorithm does not record image data, but generates a perceptual hash from content that is uploaded to iCloud. The hashes are compared against a database of known CSAM hashes provided by the National Center for Missing & Exploited Children (NCMEC). Flagged matches are manually reviewed, and the user is given an opportunity to appeal the decision. Apple does not store hashes that are not flagged, and estimates a one-in-one-trillion chance per year of falsely flagging a given account. Users confirmed to be in possession of a threshold amount of CSAM will have their iCloud account disabled and will then be reported to NCMEC and law enforcement.
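The matching scheme described above can be illustrated with a toy example. The sketch below is not Apple's NeuralHash (which uses a neural network to produce hashes); it substitutes a simple average hash and a Hamming-distance threshold purely to show the general idea of flagging an image when its hash falls close to an entry in a known-hash database. All names and the threshold value are illustrative assumptions.

```python
# Illustrative sketch only: a toy 64-bit "average hash" with threshold
# matching. This is NOT Apple's NeuralHash; it merely demonstrates the
# pattern of comparing a perceptual hash against a database of known hashes.

def average_hash(pixels):
    """Hash an 8x8 grayscale image (list of 8 rows of 8 ints 0-255) to 64 bits.

    Each bit is 1 if the corresponding pixel is at or above the image mean.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

def matches_database(image_hash, known_hashes, threshold=10):
    """Flag the image if any database hash is within the bit-distance threshold."""
    return any(hamming(image_hash, h) <= threshold for h in known_hashes)
```

A perceptual hash differs from a cryptographic one in exactly this respect: visually similar images are meant to produce nearby hashes, so matching uses a distance threshold rather than exact equality.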

History
Before Apple announced the feature's implementation, iOS 14.3 was released in December 2020 with an early version of the algorithm, which was tolerant to image resizing and compression, but not to cropping and rotation.
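That asymmetry, being stable under resizing but not under rotation, can be demonstrated with a simple average hash. Again, this is a stand-in, not the early NeuralHash itself: downsampling an image leaves a box-filter hash essentially unchanged, while rotating the image rearranges which pixels exceed the mean.

```python
# Toy demonstration (not Apple's algorithm): a simple average hash is stable
# under resizing (downsampling) but changes substantially under rotation.

def downsample(pixels, size=8):
    """Box-filter a square grayscale image (list of rows) down to size x size."""
    n = len(pixels)
    step = n // size
    out = []
    for r in range(size):
        row = []
        for c in range(size):
            block = [pixels[r * step + i][c * step + j]
                     for i in range(step) for j in range(step)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

def average_hash(pixels):
    """64-bit hash: each bit is 1 if the pixel is at or above the image mean."""
    small = downsample(pixels, 8)
    flat = [p for row in small for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def rotate90(pixels):
    """Rotate a square image 90 degrees clockwise."""
    return [list(row) for row in zip(*pixels[::-1])]

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# A 16x16 horizontal gradient: dark on the left, bright on the right.
image = [[col * 16 for col in range(16)] for _ in range(16)]
resized = downsample(image, 8)          # simulate a resize to 8x8

h_orig = average_hash(image)
h_resized = average_hash(resized)       # identical bits: resize-tolerant
h_rotated = average_hash(rotate90(image))  # many bits flip: rotation-fragile
```

For this gradient the resized image hashes to exactly the same 64 bits, while the rotated copy differs in half of them, which is why early adversarial research focused on geometric transformations.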

Reception
The feature was praised by lawmakers and child safety advocates, but privacy experts criticized Apple's initiative. Security experts claimed that a modified implementation of the algorithm could be used by oppressive governments to stifle political opposition. In an interview with The Wall Street Journal, Apple senior VP Craig Federighi admitted that the messaging about the feature had been "jumbled pretty badly" and stated that the NCMEC hash database is stored on the device and that the same CSAM detection would apply uniformly across all territories where the feature is made available. Federighi said that outside auditors would be allowed to verify the process.