Apple Abandons Controversial Plans to Detect Known CSAM in iCloud Photos

In addition to making end-to-end encryption available for iCloud Photos, Apple today announced that it has abandoned its controversial plans to detect known Child Sexual Abuse Material (CSAM) stored in iCloud Photos, according to a statement shared with WIRED.

Apple's full statement:

After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021. We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.

In August 2021, Apple announced plans for three new child safety features: a system to detect known CSAM images stored in iCloud Photos, a Communication Safety option that blurs sexually explicit photos in the Messages app, and child exploitation resources for Siri. Communication Safety launched in the U.S. with iOS 15.2 in December 2021 and has since expanded to the U.K., Canada, Australia, and New Zealand, and the Siri resources are also available. CSAM detection, however, never launched.

Apple initially said CSAM detection would be implemented in an update to iOS 15 and iPadOS 15 by the end of 2021, but the company ultimately postponed the feature based on "feedback from customers, advocacy groups, researchers, and others." Now, after a year of silence, Apple has abandoned the CSAM detection plans altogether.

Apple promised its CSAM detection system was "designed with user privacy in mind." The system would have performed "on-device matching using a database of known CSAM image hashes" from child safety organizations, which Apple would transform into an "unreadable set of hashes that is securely stored on users' devices."
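
For illustration only, here is a minimal Swift sketch of the general "on-device matching against a set of known hashes" idea described above. It is not Apple's actual design: the real system used a neural perceptual hash (NeuralHash) and a blinded, private set intersection protocol so the device could not read the database, whereas this sketch uses a plain SHA-256 digest and an ordinary set. The `imageHash`, `loadKnownHashDatabase`, and `matchesKnownHash` names are hypothetical.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the on-device database. In Apple's design this
// would be an "unreadable" (blinded) set of known CSAM image hashes supplied
// by child safety organizations; here it is just an empty placeholder.
func loadKnownHashDatabase() -> Set<String> {
    []
}

// Stand-in for a perceptual image hash. Apple's system used a neural-network
// hash (NeuralHash), not a cryptographic hash of raw bytes; SHA-256 is used
// here only to keep the sketch self-contained.
func imageHash(of imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// On-device check: does this photo's hash appear in the known-hash set?
func matchesKnownHash(_ imageData: Data, against knownHashes: Set<String>) -> Bool {
    knownHashes.contains(imageHash(of: imageData))
}
```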

Apple planned to report iCloud accounts with known CSAM image hashes to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies. Apple said there would be a "threshold" that would ensure "less than a one in one trillion chance per year" of an account being incorrectly flagged by the system, plus a manual review of flagged accounts by a human.
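
To make the threshold idea concrete, here is a hypothetical Swift sketch of how an account might only be escalated for human review once its match count crosses a threshold. The `AccountMatchState` type and the threshold value are illustrative assumptions, not Apple's shipped implementation; Apple publicly discussed an initial threshold of around 30 matches, but no final system was ever released.

```swift
// Hypothetical sketch of threshold-based flagging. The counter, the threshold
// value, and the escalation flow are illustrative assumptions, not Apple's
// actual implementation.
struct AccountMatchState {
    private(set) var matchedPhotoCount = 0

    // Illustrative threshold; Apple said the threshold was what kept the
    // chance of incorrectly flagging an account below one in one trillion
    // per year.
    let reviewThreshold = 30

    // Record one more photo whose hash matched the known-CSAM database.
    mutating func recordMatch() {
        matchedPhotoCount += 1
    }

    // Only accounts past the threshold would ever reach a human reviewer,
    // and only reviewer-confirmed accounts would be reported to NCMEC.
    var shouldEscalateForHumanReview: Bool {
        matchedPhotoCount >= reviewThreshold
    }
}
```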

Apple's plans were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers, and even some Apple employees.

Some critics argued that the feature would have created a "backdoor" into devices, which governments or law enforcement agencies could use to surveil users. Another concern was false positives, including the possibility of someone intentionally adding CSAM imagery to another person's iCloud account to get their account flagged.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

Top Rated Comments

Populus
36 months ago
This is the right decision for Apple to make, in my opinion. I’m glad they recognized that there are better ways to prevent the spread of this type of content.

I’m sincerely surprised Apple backtracked on something as big as this (and under such heavy pressure from governments).
Score: 64 Votes
TheYayAreaLiving
36 months ago

Yeah, but now we can't catch the pedophiles
That's the job of law enforcement and the government, not Apple's.
Score: 63 Votes
TheYayAreaLiving
36 months ago
Thank you, Apple. CSAM detection was a joke. “If privacy matters in your life, it should matter to the phone your life is on.” Long Live!

Score: 44 Votes
Realityck
36 months ago

Apple today announced that it has abandoned its plans to detect known CSAM stored in iCloud Photos, according to a statement shared with WIRED (https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/).
Everybody should be happy CSAM is DOA
Score: 30 Votes
aPple nErd
36 months ago
Great news but they lost all my trust with this. I will continue to use local backups on my Mac and keep my photos out of iCloud for good.
Score: 27 Votes
aPple nErd
36 months ago

[S]After extensive consultation with experts[/S] After extensive public pressure...

Never should have succumbed to the public feedback! Perhaps a botched introduction, but no one else would have done it 'right' like Apple, given the scrutiny they are under.
Truly an unhinged take
Score: 26 Votes