EFF Demands Apple Kill CSAM Scanning Once and for All


Apple recently announced that it will delay its controversial CSAM detection feature. The decision comes after heavy criticism from security researchers, politicians, advocacy organizations, Apple's own employees, and well-known whistleblower Edward Snowden.

One of the most vocal critics is the Electronic Frontier Foundation (EFF), which is now demanding that Apple abandon its CSAM feature entirely. The EFF and many other organizations had previously signed an open letter to Apple protesting the feature.

Apple’s controversial CSAM feature

Apple’s CSAM feature scans photos as they are uploaded to iCloud Photos for known child sexual abuse imagery and reports offending accounts to the authorities. However, many have raised concerns about the potential for misuse. The scanning happens by default whenever iCloud Photos is enabled, and many security experts and privacy watchdogs, including the EFF, fear it amounts to a backdoor in the iPhone and iPad ecosystem.

Following the delay, the EFF said it was “pleased that Apple is now listening to the concerns of customers, researchers, civil liberties organizations, human rights activists, LGBTQ people, youth representatives and other groups about the dangers posed by its phone scanning tools. But the company must go further than just listening, and drop its plans to put a backdoor into its encryption entirely.”

Electronic Frontier Foundation

Apple previously released an FAQ about its new security feature, but that did little to quell the criticism. Apple SVP Craig Federighi admitted the company could have done a better job of communicating about CSAM detection, and Apple has defended the feature by saying its goal is to protect children.

The company had planned to ship the feature in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey, but it is now unclear whether Apple’s CSAM feature will ever see the light of day. This is not the first time the company has delayed a feature after a severe backlash: it previously delayed App Tracking Transparency, which requires app developers to ask users for permission before tracking them.

How does Apple’s iCloud image scanning feature work?

When you upload an image to iCloud Photos, Apple scans it with a multi-part algorithm that checks the image against a database of known child sexual abuse material (CSAM). If an account accumulates matches beyond a threshold of 30 known CSAM images, Apple flags the account, manually reviews the matched images, and reports confirmed cases to the proper authorities.
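To illustrate the general idea of threshold-based matching described above, here is a minimal Swift sketch. It is an assumption-laden simplification, not Apple's actual implementation: the names `UploadScanner`, `knownHashes`, `matchThreshold`, and `perceptualHash` are hypothetical, and Apple's real system relies on an on-device NeuralHash and cryptographic safety vouchers that are not modeled here.

```swift
import Foundation

// Hypothetical sketch of threshold-based matching against known-CSAM hashes.
// This is NOT Apple's implementation; all names and types are illustrative.

struct ScanResult {
    let matchCount: Int
    let accountFlagged: Bool
}

final class UploadScanner {
    private let knownHashes: Set<String>   // hashes of known CSAM images
    private let matchThreshold: Int        // e.g. 30, per Apple's announcement
    private var matchCount = 0

    init(knownHashes: Set<String>, matchThreshold: Int = 30) {
        self.knownHashes = knownHashes
        self.matchThreshold = matchThreshold
    }

    /// Called for each image as it is uploaded. `perceptualHash` stands in for
    /// the hash a real system would derive from the image's visual content.
    func scan(perceptualHash: String) -> ScanResult {
        if knownHashes.contains(perceptualHash) {
            matchCount += 1
        }
        // Only once the per-account threshold is crossed is the account
        // flagged for human review; isolated matches trigger nothing.
        return ScanResult(matchCount: matchCount,
                          accountFlagged: matchCount >= matchThreshold)
    }
}
```

The point of the threshold in this sketch is that a single false-positive hash match cannot flag an account; review only begins once many independent matches accumulate, which is how Apple has described the role of its 30-image threshold.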
