Apple to Expand Automated Surveillance of iPhone Message Images

CUPERTINO, Calif. — Apple’s iOS 16 update for iPhones, expected this fall, will expand worldwide a controversial “communication safety” tool that uses on-device AI to detect nudity in text messages.

The worldwide expansion of this “message analysis” feature, currently available only in the U.S. and New Zealand, will begin in September when iOS 16 is rolled out to the general public.

iPhone models older than the iPhone 8, which cannot run iOS 16, will not be affected; on Mac devices, the macOS Ventura update will offer the option.

The nudity detection feature has been touted by Apple as part of its Expanded Protections for Children initiative, but privacy advocates have raised questions about the company’s overall approach to private content surveillance.

Apple describes the feature as a tool to “warn children when receiving or sending photos that contain nudity.”

The feature, Apple notes, is not enabled by default: “If parents opt in, these warnings will be turned on for the child accounts in their Family Sharing plan.”

When content identified as nudity is received, “the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. Similar protections are available if a child attempts to send photos that contain nudity. In both cases, children are given the option to message someone they trust for help if they choose.”

The AI feature bundled with the default Messages app, the company explained, “analyzes image attachments and determines if a photo contains nudity, while maintaining the end-to-end encryption of the messages. The feature is designed so that no indication of the detection of nudity ever leaves the device.”

According to Apple, the company “does not get access to the messages, and no notifications are sent to the parent or anyone else.”

In the U.S. and New Zealand, this feature is included starting with iOS 15.2, iPadOS 15.2 and macOS 12.1.

As French news outlet RTL noted today when reporting the expansion of the feature, “a similar initiative, consisting of analyzing the images hosted in the photo libraries of users’ iCloud accounts in search of possible child pornography images, had been strongly criticized before being shelved last year.”

As XBIZ reported, in September 2021 Apple announced that it would “pause” that initiative. The feature would have scanned images on users’ devices for known CSAM and reported matches to the National Center for Missing & Exploited Children, which works with law enforcement.

Copyright © 2025 Adnet Media. All Rights Reserved. XBIZ is a trademark of Adnet Media.
Reproduction in whole or in part in any form or medium without express written permission is prohibited.
