
Apple's new NeuralHash technology will scan iCloud photos for child abuse content


Apple is planning to introduce a new technology that will detect and report known child sexual abuse material to law enforcement. At the same time, the Cupertino giant promises to preserve user privacy.

The new child sexual abuse material (CSAM) detection technology, NeuralHash, aims to better protect children who use Apple's services from online harm. Related features include blocking sexually explicit content sent or received through a child's iMessage account and intervening when users search for CSAM-related content through Siri or Search.

Until now, Apple has refused to scan users' files uploaded to iCloud and has offered the option to encrypt data before it reaches its servers. This stance has put the company under pressure from governments and law enforcement agencies that were denied backdoor access when investigating crimes.

News of NeuralHash first surfaced through a series of tweets by Matthew Green, a cryptography professor at Johns Hopkins University. A backlash from privacy advocates and security experts followed over Apple's alleged invasion of user privacy, something its users consider a gold standard compared to other companies.

Apple, however, confirmed that NeuralHash instead works on the user's device. It can identify whether a user is uploading known child abuse content to iCloud without decrypting the images until a threshold is met and a series of checks has been passed.
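
This gating behavior can be pictured with a toy sketch. All names here, such as make_voucher and THRESHOLD, are illustrative stand-ins rather than Apple's actual API, and the voucher is modeled as a plain flag where the real design uses cryptography so the server cannot read individual vouchers:

```python
# Toy sketch of the on-device gating described above (illustrative only).
THRESHOLD = 10  # hypothetical number of matches before review is possible

class Device:
    def __init__(self, known_hashes: set[str]):
        self.known_hashes = known_hashes

    def make_voucher(self, photo_hash: str) -> dict:
        # Attached to every upload; records whether this photo matched.
        # In the real system the server cannot read this flag on its own.
        return {"matched": photo_hash in self.known_hashes}

class Server:
    def __init__(self):
        self.vouchers: list[dict] = []

    def receive(self, voucher: dict) -> None:
        self.vouchers.append(voucher)

    def can_review(self) -> bool:
        # Decryption and manual review only become possible past the threshold.
        return sum(v["matched"] for v in self.vouchers) >= THRESHOLD
```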

Photos on a user's iPhone or Mac are converted into a unique string of letters and numbers known as a hash. Before the images are uploaded to iCloud, the hashes are matched against a database of known hashes of child abuse imagery provided by child protection organizations. The matching uses a cryptographic technique called private set intersection, which can detect a match without revealing the image or alerting the user.
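
As a rough illustration of the matching step, the sketch below uses SHA-256 of the raw bytes as a stand-in for NeuralHash, which is a proprietary perceptual hash of image features; the set lookup stands in for the private set intersection protocol:

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for the perceptual hash: NeuralHash is designed so that
    # visually similar images produce the same hash, which SHA-256 does not.
    return hashlib.sha256(image_bytes).hexdigest()

# Placeholder database of known hashes supplied by child protection bodies.
known_hashes = {image_hash(b"known-image-a"), image_hash(b"known-image-b")}

def is_known(image_bytes: bytes) -> bool:
    # Shown as a plain set lookup; in the real system this comparison runs
    # through private set intersection, so a non-matching query reveals
    # nothing about the database and the device is not alerted to a match.
    return image_hash(image_bytes) in known_hashes

assert is_known(b"known-image-a")
assert not is_known(b"vacation-photo")
```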

Next, Apple uses another cryptographic principle called threshold secret sharing, which allows the matched content to be decrypted only after a user crosses a threshold of known child abuse imagery in their iCloud Photos. "If a secret is split into a thousand pieces and the threshold is ten images of child abuse content, the secret can be reconstructed from any of those ten images," Apple explained.
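
The "thousand pieces, threshold of ten" idea corresponds to the textbook construction known as Shamir's secret sharing, sketched below as a generic toy over a prime field. Apple's concrete scheme and parameters are not spelled out here, so this illustrates only the general principle:

```python
import random

# Toy Shamir secret sharing: any `threshold` shares reconstruct the secret,
# fewer reveal nothing about it.
PRIME = 2**61 - 1  # a Mersenne prime, large enough for a demo

def split_secret(secret: int, shares: int, threshold: int):
    # Random polynomial of degree threshold-1 with the secret as constant term.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, shares + 1)]

def reconstruct(points):
    # Lagrange interpolation at x = 0 recovers the constant term (the secret).
    secret = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

shares = split_secret(secret=123456789, shares=1000, threshold=10)
print(reconstruct(shares[:10]) == 123456789)  # True: any 10 shares suffice
print(reconstruct(shares[:9]) == 123456789)   # False: 9 shares are not enough
```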

At this point, Apple decrypts the matching images, verifies them manually, and reports them to the concerned law enforcement agencies and child protection organizations. Apple has published further technical details on how NeuralHash works on its website.

NeuralHash is expected to roll out in the next two or three months as part of iOS 15 and macOS Monterey. Apple has said the technology will first be introduced in the U.S. and has not announced an international launch as of now.
