Apple has confirmed that it is bringing the CSAM image scanning feature to its iPhone and iPad devices. But it seems that the company has been scanning iCloud Mail for these images since 2019. The revelation came after 9to5Mac noticed that Apple’s anti-fraud chief said that “we are the greatest platform for distributing child porn.”
This raised an obvious question: how would he know, unless Apple was already scanning for CSAM images? It turns out he was referring to image-matching technology that Apple uses to scan incoming and outgoing iCloud Mail attachments. An archived version of Apple’s child safety page also mentions this image scanning.
“Apple uses image-matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation.” (Apple’s archived child safety page)
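Apple has not published the details of this system, but “electronic signatures” generally means hash-based matching: compute a fingerprint of each attachment and compare it against a database of fingerprints of known material. A minimal sketch in Python, using an ordinary cryptographic hash and a placeholder signature set (the signature value below is illustrative only; production systems use robust perceptual hashes that survive resizing and re-encoding, not plain SHA-256):

```python
import hashlib

# Placeholder database of known signatures. In a real deployment this
# would come from a vetted hash list, not be hard-coded. The value below
# is simply the SHA-256 of b"test", used here for demonstration.
KNOWN_SIGNATURES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def signature(attachment: bytes) -> str:
    """Compute a fingerprint (here, SHA-256) of an attachment's bytes."""
    return hashlib.sha256(attachment).hexdigest()

def is_flagged(attachment: bytes) -> bool:
    """Return True if the attachment matches a known signature."""
    return signature(attachment) in KNOWN_SIGNATURES

print(is_flagged(b"test"))         # True: matches the placeholder entry
print(is_flagged(b"holiday.jpg"))  # False: no matching signature
```

Because a cryptographic hash changes completely if even one byte differs, this exact-match approach only catches byte-identical copies; that limitation is why real systems rely on perceptual hashing instead.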
Apple Scans Your Emails Using iCloud
9to5Mac confirmed with Apple that the company has been scanning outgoing and incoming iCloud Mail for CSAM attachments since 2019. Even more interesting, Apple also indicated that it was doing some limited scanning of other data, though only on a small scale and not including iCloud backups.
In January 2020, Apple’s chief privacy officer said that the company uses screening technology to look for illegal images. Apple says it disables accounts when it finds evidence of child exploitation material, although it does not specify how that material is discovered. The news comes after criticism from security experts and Apple’s own employees regarding the CSAM image scanning feature.
Apple has maintained that the feature is safe and will never be used for other purposes, but that has done little to quell people’s fears. Some organizations have even written an open letter to Tim Cook in opposition. Even Craig Federighi, Apple’s SVP of Software Engineering, conceded that the company could have communicated better about the new feature.