Is Apple’s New Child Safety Feature A Privacy Threat?

Privacy and security experts are skeptical.

Apple looks all set to launch iOS 15 and iPadOS 15. One of the many new features is a system that scans iCloud Photos uploads in the U.S. for known child abuse imagery.

According to Apple, this new safety feature will look for known Child Sexual Abuse Material (CSAM) and help protect children from sexual predators. The company will use a neural-network-based perceptual hashing system called “NeuralHash” to identify these images. However, many experts have raised doubts about Apple’s new child safety feature.

The plans were first confirmed in a tweet thread by Matthew Green, a cryptography professor at Johns Hopkins University, who warned that the same technology could later be used to monitor encrypted messages. Mr. Green and many other security experts now worry that it could serve as a backdoor for U.S. government agencies. Adding this kind of scanning to smartphones has long been a major ask from law enforcement worldwide. The technology may be effective at identifying predators, but the Edward Snowden revelations showed how easily such surveillance capabilities can be abused.

How Does Apple’s Child Safety Feature Work?


Companies like Microsoft, Twitter, Facebook, and Google have used image hashing to look for CSAM for years. Apple’s implementation of NeuralHash works directly on the user’s device. It converts each photo into a short, unique string of letters and numbers, called a “hash.” NeuralHash then cross-references these hashes against known hashes of child abuse imagery.
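To get a rough sense of how this kind of on-device hash matching works, here is a minimal sketch in Python. It uses a simple “average hash” in place of Apple’s NeuralHash (whose internals are not public), and the known-hash set, function names, and distance threshold below are illustrative assumptions, not Apple’s implementation.

```python
# A minimal sketch of on-device perceptual hash matching, loosely analogous to
# the process described above. Uses a simple "average hash" (aHash) rather than
# Apple's NeuralHash, and a made-up database of known hashes.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Convert an image into a compact 64-bit perceptual hash."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel >= avg else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes (lower = more similar)."""
    return bin(a ^ b).count("1")

# Hypothetical known-CSAM hash database; the real list comes from child-safety
# organizations and is never visible to the user.
KNOWN_HASHES = {0x0F0F0F0F0F0F0F0F, 0x00FF00FF00FF00FF}

def matches_known_hash(path: str, max_distance: int = 5) -> bool:
    """Flag an image whose hash is close to any known hash."""
    h = average_hash(path)
    return any(hamming_distance(h, known) <= max_distance for known in KNOWN_HASHES)
```

Because the comparison uses hash similarity rather than exact file bytes, slightly edited copies of a known image can still match.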

The matching doesn’t require sending photos to the cloud: the comparison happens on the device, and visually similar images produce similar hashes, so known illegal images can be identified even if they have been resized or slightly altered. Apple only decrypts the flagged images once an account passes a certain threshold of matches in its iCloud Photos. The flagged results then go through manual verification. After verification, Apple disables the user account and reports the imagery to law enforcement.
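The threshold step can be sketched in the same spirit. In Apple’s actual design, matches are recorded as encrypted “safety vouchers” that only become readable once the threshold is crossed; the plain counter below is a simplification for illustration, and the threshold value is assumed.

```python
# A rough sketch of the threshold idea described above. Each image is checked
# against the known-hash list (as in the earlier sketch), and the account is
# only escalated for manual review once enough images match.
MATCH_THRESHOLD = 30  # hypothetical value, for illustration only

def should_escalate(match_results: list[bool]) -> bool:
    """Escalate an account for manual review once enough images match."""
    return sum(match_results) >= MATCH_THRESHOLD

# Example usage, reusing matches_known_hash() from the sketch above:
# escalate = should_escalate([matches_known_hash(p) for p in photo_paths])
```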

Another way Apple promotes child safety is by discouraging children from sending or viewing sexually explicit photos in Messages. The feature can also notify parents if a child chooses to view or send such an image anyway. Before that happens, the child is shown several warnings, such as:

  • This could be sensitive to view. Are you sure?
  • Sensitive photos and videos show the private body parts that you cover with bathing suits.
  • It’s your choice but your parents want to know you are safe.
  • If you decide to view this, your parents will be notified to make sure you are OK.

These features may be effective at finding child sexual abuse material, but the implications of the technology in the wrong hands could be disastrous. With the iOS 15 updates, Apple can learn more about the contents of users’ photo libraries via these hashes, even with end-to-end encryption in place. Facebook is reportedly moving in a similar direction by exploring homomorphic encryption to analyze encrypted messages.

Nalin Rawat

Just a big nerd for everything pop culture and geeky. Pretty much in love with movies, comics, games, and awesome new gadgets.
