Apple FAQ: Our CSAM Scanner Isn’t The Evil You’re Thinking It Is


For the past week, Apple has been in the spotlight over its upcoming child safety photo-scanning features. For those who don’t know, Apple aims to scan images uploaded to iCloud Photos for CSAM (Child Sexual Abuse Material), alongside a separate safety feature for iMessage. However, security researchers are not too happy about it.

Edward Snowden has called it a privacy threat, and many others have echoed that criticism, warning that it could create a backdoor for accessing your data. We’ve laid out our own thoughts in this article. Today, Apple released a document that tries to explain, in the form of FAQs, how the feature works and what it does behind the scenes.

Apple explains CSAM detection on iPhone

According to Apple, “CSAM detection in iCloud Photos is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images.”

Apple will scan iPhones for CSAM only if iCloud Photos is enabled

“This feature only impacts users who have chosen to use iCloud Photos to store their photos. It does not impact users who have not chosen to use iCloud Photos. There is no impact on any other on-device data. This feature does not apply to Messages.”

This is something Apple had not made clear when the feature was first announced. According to the company, CSAM detection is separate from the parental control feature coming to iMessage, which can warn children about sexually explicit images and, for younger children, notify their parents.

“Our feature respects your privacy”

A lot of people are also asking, “Why is Apple doing this now?”

The document says, “Existing techniques as implemented by other companies scan all user photos stored in the cloud. This creates privacy risks for all users. CSAM detection in iCloud Photos provides significant privacy benefits over those techniques by preventing Apple from learning about photos unless they both match to known CSAM images and are included in an iCloud Photos account that includes a collection of known CSAM.”

Will the scanning system detect other things?

Apple says its process is designed to “prevent” this from happening. The system will only work with CSAM image hashes provided by NCMEC and other child safety organizations.

For those wondering what image hashing even means, it is the process of assigning a hash value (a fixed-length numeric value that uniquely identifies data) to an image. If an image is duplicated, the copy carries the same hash value as the original.
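To make the idea concrete, here is a minimal sketch of duplicate detection via hashing in Python. It uses an ordinary SHA-256 digest rather than Apple’s NeuralHash (which is a perceptual hash and is not reproduced here), and the file names are hypothetical.

```python
# Minimal sketch (not Apple's NeuralHash): hashing an image file's raw bytes
# with SHA-256 to show that an exact duplicate yields the identical hash value.
# NeuralHash is a perceptual hash, so it also matches visually similar images;
# that behavior is not reproduced here.
import hashlib

def image_hash(path: str) -> str:
    """Return a fixed-length hex digest identifying the file's contents."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Hypothetical file names: photo_copy.jpg is a byte-for-byte copy of photo.jpg.
original = image_hash("photo.jpg")
duplicate = image_hash("photo_copy.jpg")
print(original == duplicate)  # True: identical bytes produce identical hashes
```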

The National Center for Missing and Exploited Children (NCMEC) shares these hash values with Apple. The system compares them against the hashes of photos being uploaded to iCloud Photos to find exact matches. If the hashes match, Apple conducts a human review before making a report to NCMEC. Photos that do not match the NCMEC hashes never result in a report.
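That matching flow can be sketched roughly as follows. The hash values and the helper function are made up for illustration; in the real system the comparison happens through a cryptographic protocol rather than a plain set lookup.

```python
# Rough sketch of the matching step: photo hashes are compared against a set of
# known CSAM hashes, and only the matches become candidates for human review.
# The hash values below are hypothetical placeholders.
known_csam_hashes = {"a3f1c9", "9bc47e"}  # stand-ins for hashes shared by NCMEC

def flag_for_review(uploaded_hashes: list[str]) -> list[str]:
    """Return only the uploaded hashes that match the known list."""
    return [h for h in uploaded_hashes if h in known_csam_hashes]

print(flag_for_review(["7d2e10", "a3f1c9"]))  # ['a3f1c9'] -- non-matches are never reported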

How accurate is CSAM detection?

Apple says the likelihood of it incorrectly flagging a given account is less than one in a trillion per year, an extremely low false-positive rate.
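Apple has not published the per-photo error rate behind that figure, but the rough arithmetic is easy to sketch: if each photo has only a tiny chance of a false match and an account is flagged only after a threshold number of matches, the combined probability collapses quickly. The numbers below are made up purely for illustration.

```python
# Toy calculation with made-up numbers (not Apple's published figures): the
# probability that an innocent account racks up at least `threshold` false
# matches, assuming each photo independently false-matches with probability p.
from math import comb

def account_false_positive(p: float, n_photos: int, threshold: int) -> float:
    """Binomial tail P(X >= threshold); a direct sum, fine for modest n_photos."""
    return sum(
        comb(n_photos, k) * p**k * (1.0 - p) ** (n_photos - k)
        for k in range(threshold, n_photos + 1)
    )

# Example: a hypothetical 1-in-a-million per-photo false-match rate, a
# 1,000-photo library, and a hypothetical 30-match threshold.
print(account_false_positive(1e-6, 1_000, 30))  # astronomically small
```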

Apple CSAM FAQ: Is CSAM scanning a threat?

A cryptography researcher at Bar-Ilan University in Israel told Daring Fireball:

My research in cryptography has spanned more than 25 years. I initiated the applied research on privacy-preserving computation, an area of cryptography that makes it possible for multiple participants to run computations while concealing their private inputs. In particular, I pioneered research on the private set intersection (PSI).

The Apple PSI system solves a very challenging problem of detecting photos with CSAM content while keeping the contents of all non-CSAM photos encrypted and private. Photos are only analyzed on users’ devices. Each photo is accompanied by a safety voucher that includes information about the photo, protected by two layers of encryption. This information includes a NeuralHash and a visual derivative of the photo.

If the Apple cloud identifies that a user is trying to upload a significant number of photos with CSAM content, the information associated with these specific photos can be opened by the cloud. If a user uploads less than a predefined threshold number of photos containing CSAM content, then the information associated with all of the photos of this user is kept encrypted, even if some of these photos contain CSAM content. It is important to note that no information about non-CSAM content can be revealed by the Apple PSI system.

The design is accompanied by security proofs that I have evaluated and confirmed.
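The threshold behavior the researcher describes can be pictured with a very rough sketch: the server accumulates one sealed voucher per matching photo and can only open them once the count crosses a preset threshold. This only simulates the policy; in the real design the gate is enforced cryptographically with threshold secret sharing, and the threshold value below is hypothetical.

```python
# Very rough illustration of the threshold gate described above: vouchers for
# matching photos stay sealed until an account crosses a preset number of
# matches. Apple's PSI system enforces this cryptographically with threshold
# secret sharing; this sketch simulates only the policy, not the cryptography.
from typing import Optional

MATCH_THRESHOLD = 30  # hypothetical value

class VoucherStore:
    def __init__(self) -> None:
        self.sealed_vouchers: list[bytes] = []

    def add_match_voucher(self, voucher: bytes) -> None:
        """Record a sealed voucher for a photo that matched a known CSAM hash."""
        self.sealed_vouchers.append(voucher)

    def open_vouchers(self) -> Optional[list[bytes]]:
        """Below the threshold, nothing about the matching photos can be read."""
        if len(self.sealed_vouchers) < MATCH_THRESHOLD:
            return None
        return self.sealed_vouchers

store = VoucherStore()
store.add_match_voucher(b"sealed-voucher-1")
print(store.open_vouchers())  # None -- a handful of matches reveals nothing
```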

Apple has tried to clear up a lot of misunderstandings with the CSAM FAQs in this document. Still, considering the tech industry’s track record on privacy, trusting the tech giants with our data has become much harder.

The Cupertino giant also said it would refuse any demands from governments seeking to exploit its CSAM scanning tool for other purposes.

That said, what do you think? Do you trust Apple on this, or do you still think Apple has something more sinister planned? Share your thoughts and opinions in the comments section below.
