After Apple unveiled plans for a controversial security update, many critics voiced opposition. Now even Apple employees appear to be pushing back: a Reuters report states that many employees have expressed worries that the technology could be misused. All of this raises questions about Apple's commitment to privacy.
The Apple security feature in question scans users' photos for CSAM (child sexual abuse material). Apple's stated motive is to help law enforcement catch predators. However, many of Apple's own employees, along with organizations like the Electronic Frontier Foundation (EFF) and the Center for Democracy & Technology (CDT), have raised concerns. These organizations have even released an open letter of protest demanding that Apple suspend the plan.
Critics fear that repressive governments could exploit the feature to hunt for other material as grounds for censorship or arrests. An Apple employee noted that past security changes have also prompted concern, but this time "the volume and duration of the new debate is surprising." Notably, most of the criticism comes from employees outside of lead security and privacy roles, and the open internal debate itself marks a shift from the company's usual secrecy around new products.
Potential risks to Apple security
Apple's system uses a neural-network-based perceptual hashing algorithm called "NeuralHash" to fingerprint images directly on the user's phone and match them against a database of known CSAM hashes. Critics argue this introduces a backdoor that threatens the fundamental privacy protections for all users of Apple products, and that it could even provide a blueprint for breaking secure end-to-end encryption.
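To illustrate the general idea of perceptual hashing, here is a minimal toy sketch. It uses a simple "average hash" over a small grayscale pixel grid; this is not Apple's actual NeuralHash algorithm (which derives the fingerprint with a neural network), and the images and "database" entry here are invented for illustration. The key property shown is that a slightly altered copy of an image still produces a nearly identical fingerprint, so matching can be done by comparing hashes rather than raw pixels.

```python
# Toy "average hash" illustration of perceptual hashing, the general
# technique behind systems like NeuralHash. NOT Apple's algorithm.

def average_hash(pixels):
    """Return a bit tuple: 1 where a pixel is above the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming_distance(h1, h2):
    """Count of differing bits between two equal-length hashes."""
    return sum(a != b for a, b in zip(h1, h2))

# A hypothetical fingerprint stored in a match database ...
known = [[10, 200], [220, 30]]
# ... and a slightly altered copy (e.g. re-encoded or brightness-shifted).
candidate = [[12, 198], [215, 35]]

h_known = average_hash(known)
h_candidate = average_hash(candidate)

# A small Hamming distance suggests the same underlying image.
print(hamming_distance(h_known, h_candidate))  # → 0
```

Because matching happens against hashes of known images, critics' worry is less about the math than about the database: whoever controls the list of target hashes controls what the system flags.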
Back in 2016, Apple refused the FBI's request to build a backdoor into Apple products to monitor potential terrorists, and the company has stated that it will continue to refuse such demands. But once Pandora's box is open, there is no turning back. With this move, Apple undermines both its own security record and its well-marketed history of privacy-first decisions.
Apple has already dropped its plan to encrypt iCloud backups and has agreed to China's demand to store user data within the country. What happens when markets such as China demand a backdoor into users' systems? Legally speaking, the U.S. government cannot scan household equipment for contraband, nor compel others to do so. Apple, however, is doing it voluntarily.