Apple recently introduced a new photo scanning feature that checks iCloud photos for CSAM (child sexual abuse material) as part of its child-safety efforts. If you want to know how it works, here's our comprehensive article on the feature. The announcement backfired on Apple, as security enthusiasts and researchers worried that it might create a backdoor on iPhones.
Apple then released an FAQ explaining exactly what the feature does and how it works. In a recent interview with the Wall Street Journal, Apple SVP Craig Federighi discussed the reactions Apple received after the feature was announced last week.
Craig Federighi on CSAM: We Could’ve Communicated Better
Federighi says that while the Messages protections and CSAM scanning may look like very similar features, they work quite differently. This has caused a lot of confusion, and Apple could have done a better job communicating about the new feature.
When asked about CSAM detection, he said that the feature only works on photos uploaded to iCloud, as stated in the FAQ document. It does not scan photos stored only in your iPhone’s internal storage.
Federighi said, “We wish that this had come out a little more clearly, because we feel very positively and strongly about what we are doing, and we can see that it has been widely misunderstood.”
Apple, in the document, went out of its way to explain, “CSAM detection is unlike any technology out there. Photo scanning technologies scan all the photos of a user, and this creates privacy risks. With CSAM detection, the system scans for image hashes, compares them to the hashes of known images that it receives from NCMEC (National Center for Missing and Exploited Children), and flags a user’s account only if the hashes match.”
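The hash-comparison idea described above can be sketched in a few lines of Python. This is only an illustration of the general concept, not Apple’s actual implementation: Apple uses NeuralHash, a perceptual hash resistant to minor image edits, plus on-device cryptographic protocols such as private set intersection, whereas the sketch below substitutes a plain SHA-256 digest and an in-memory set of hypothetical known hashes.

```python
import hashlib

# Hypothetical stand-in for the database of known-image hashes that,
# in Apple's real system, would come from NCMEC. The bytes here are
# placeholders, not real image data.
KNOWN_HASHES = {hashlib.sha256(b"known-image-bytes").hexdigest()}

def matches_known_hash(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the known-hash set.

    Apple's real pipeline uses a perceptual hash (NeuralHash), so
    near-duplicate images still match; a cryptographic hash like
    SHA-256, used here for simplicity, only matches exact byte copies.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(matches_known_hash(b"known-image-bytes"))  # matches the known set
print(matches_known_hash(b"ordinary-photo"))     # does not match
```

Note that in Apple’s described design an account is only flagged after a threshold number of matches, which this single-image sketch omits.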
What do you think about Apple’s photo scanning feature? Let us know your thoughts and opinions in the comments section below.