Apple 'Pauses' Implementation of New Image Surveillance Tech
CUPERTINO, Calif. — Apple has announced that it will “pause” testing a new tool, unveiled last month, that would scan images on users’ devices in search of supposed CSAM (Child Sexual Abuse Material) and send reports directly to law enforcement.

Apple said the implementation delay would be “in order to gather more feedback and make improvements,” CNN Business reported.

As XBIZ reported in early August, Apple’s invasive surveillance initiative was first revealed to the Financial Times by academics who had been briefed about it.

The Financial Times also reported that Apple’s move was “raising alarm among security researchers who warn that it could open the door to surveillance of millions of people’s personal devices.”

Apple’s proposed system, neuralMatch, would “proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified,” the FT reported.

The Financial Times confirmed that neuralMatch was planned to “initially roll out only in the U.S.”

Security researchers told the publication that while they are supportive of efforts to combat CSAM, they are nevertheless “concerned that Apple risks enabling governments around the world to seek access to their citizens’ personal data, potentially far beyond its original intent.”

Today Apple confirmed the project was being put on hold.

"Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of [CSAM]," the company said through a statement. "Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

Last week, the Electronic Frontier Foundation (EFF) began organizing an action to convince Apple to reconsider incorporating the new surveillance technology.

EFF explained that Apple had “abandoned its once-famous commitment to security and privacy. The next version of iOS will contain software that scans users’ photos and messages. Under pressure from U.S. law enforcement, Apple has put a backdoor into their encryption system.”

The Woodhull Freedom Foundation — a national organization dedicated to affirming and protecting sexual freedom as a fundamental human right — also issued a call to action via Twitter, encouraging supporters to join EFF’s campaign.

Copyright © 2025 Adnet Media. All Rights Reserved. XBIZ is a trademark of Adnet Media.
Reproduction in whole or in part in any form or medium without express written permission is prohibited.
