Apple’s new feature to scan for child exploitation content on its devices

■ WHAT IS THE NEW FEATURE?

APPLE has announced that software updates later this year will bring new features to help protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of Child Sexual Abuse Material (CSAM). The features include the use of new technology to limit the spread of CSAM online, especially via Apple’s platforms. There will be on-device protection to keep children from sending or receiving sensitive content, with a mechanism to alert parents if the user is below the age of 13. Apple will also intervene when Siri or Search is used to look up CSAM-related content.

■ WHAT IS THE TECH APPLE IS USING?

In a blog post, Apple said it will use applications of cryptography in iOS and iPadOS to match images stored in iCloud Photos against known CSAM images. The technology will match images on a user’s iCloud against a database of known images provided by child safety organisations. This can be done without actually seeing the images, only by looking for what amounts to a fingerprint match. If the number of matches crosses a threshold, Apple will report these instances to the National Center for Missing and Exploited Children (NCMEC).
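To see how fingerprint-style matching works in principle, here is a minimal sketch in Python. It uses a toy “average hash” over an 8x8 grayscale image; Apple’s actual system relies on its NeuralHash perceptual hash and a more elaborate cryptographic protocol, so all names and values below are illustrative only.

# Illustrative sketch only: a toy "average hash" fingerprint and a set lookup.
# Apple's real pipeline uses NeuralHash and on-device cryptography; nothing
# here is Apple's actual code.

def average_hash(pixels):
    """Compute a 64-bit fingerprint from an 8x8 grayscale image (values 0-255)."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def matches_known_database(image_pixels, known_hashes):
    """Return True if the image's fingerprint appears in the known hash set."""
    return average_hash(image_pixels) in known_hashes

# Hypothetical usage: the set stands in for the database supplied by child
# safety organisations; image contents themselves are never compared.
known_hashes = {0xFFFFFFFF00000000}
photo = [[200] * 8] * 4 + [[20] * 8] * 4   # toy 8x8 grayscale "image"
print(matches_known_database(photo, known_hashes))   # True: fingerprints match

A perceptual hash of this kind (or Apple’s far more robust NeuralHash) gives near-identical images the same fingerprint, which is what allows matching without ever inspecting the picture itself.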

Apple clarified that its technology keeps user privacy in mind, and hence the database is transformed into ‘an unreadable set of hashes that is securely stored on user’s device’. The technology will determine a match without revealing the result.
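As a rough illustration of what an “unreadable set of hashes” means, the sketch below runs each known fingerprint through a one-way hash before it is stored, so the stored set can be checked against but not read back. Apple’s real construction goes further, using blinded hashes so that even the match result stays hidden from the device; the function names here are hypothetical.

import hashlib

# Sketch of storing the known-image database as an "unreadable set of hashes":
# each raw fingerprint is passed through a one-way hash before being shipped
# to the device, so the device can test membership but cannot recover the
# original fingerprints. Function names are hypothetical, not Apple's API.

def blind(fingerprint: bytes) -> bytes:
    """One-way transform applied to every known fingerprint before storage."""
    return hashlib.sha256(fingerprint).digest()

# Prepared server-side and shipped to the device.
known_fingerprints = [b"fingerprint-A", b"fingerprint-B"]
on_device_set = {blind(fp) for fp in known_fingerprints}

def is_match(local_fingerprint: bytes) -> bool:
    """On-device check: only blinded values are compared, never raw entries."""
    return blind(local_fingerprint) in on_device_set

print(is_match(b"fingerprint-A"))   # True
print(is_match(b"holiday-photo"))   # False
# In Apple's actual design the match result itself is also hidden from the
# device until the server-side threshold check; that step is omitted here.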

At this point, the device creates a “cryptographic safety voucher” with the match result and other encrypted data, and saves it to iCloud along with the image. Threshold secret sharing technology ensures that these vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content.
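The idea behind threshold secret sharing can be shown with Shamir’s classic scheme: each voucher effectively carries one share of a secret, and only when the number of shares reaches the threshold can the secret be reconstructed. The parameters below are illustrative, not Apple’s actual protocol.

import random

# Minimal sketch of threshold secret sharing (Shamir's scheme). Each safety
# voucher can be thought of as carrying one share of a decryption secret;
# below the threshold the shares reveal nothing, at or above it the secret
# can be reconstructed. Parameters are illustrative, not Apple's protocol.

PRIME = 2**127 - 1   # a Mersenne prime used as the field modulus

def make_shares(secret, threshold, count):
    """Split `secret` into `count` shares; any `threshold` of them recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, count + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 reconstructs the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

voucher_secret = 123456789                      # stands in for the decryption key
shares = make_shares(voucher_secret, threshold=3, count=5)
print(recover(shares[:3]) == voucher_secret)    # True: threshold reached
print(recover(shares[:2]) == voucher_secret)    # False: below threshold

The design choice matters: a single matching image produces a voucher that is useless on its own, so the content only becomes interpretable once an account crosses the stated threshold of known CSAM matches.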
