CUPERTINO, Calif. — Apple has announced that it will “pause” testing a new tool, announced last month, that would scan images on users’ devices in search of supposed CSAM (Child Sexual Abuse Material) and flag suspected matches for human review, with verified material then reported to law enforcement.
Apple said the implementation delay would be “in order to gather more feedback and make improvements,” CNN Business reported.
As XBIZ reported in early August, Apple’s invasive surveillance initiative was first revealed to the Financial Times by academics who had been briefed about it.
The Financial Times also reported that Apple’s move was “raising alarm among security researchers who warn that it could open the door to surveillance of millions of people’s personal devices.”
Apple’s proposed system, neuralMatch, would “proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified,” the FT reported.
The Financial Times confirmed that neuralMatch was planned to “initially roll out only in the U.S.”
Security researchers told the publication that while they are supportive of efforts to combat CSAM, they are nevertheless “concerned that Apple risks enabling governments around the world to seek access to their citizens’ personal data, potentially far beyond its original intent.”
Today, Apple confirmed that the project is being put on hold.
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of [CSAM],” the company said in a statement. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Last week, the Electronic Frontier Foundation (EFF) began organizing an action to convince Apple to reconsider incorporating the new surveillance technology.
EFF explained that Apple had “abandoned its once-famous commitment to security and privacy. The next version of iOS will contain software that scans users’ photos and messages. Under pressure from U.S. law enforcement, Apple has put a backdoor into their encryption system.”
The Woodhull Freedom Foundation — a national organization dedicated to affirming and protecting sexual freedom as a fundamental human right — also issued a call to action via Twitter, encouraging supporters to join EFF’s campaign.