Apple drops controversial plan to scan iCloud Photos for CSAM

Images in iCloud Photos will not be scanned for child sexual abuse material.
Image: Apple

Apple has completely abandoned its previously announced plan to scan iCloud Photos libraries for child sexual abuse material (CSAM). The company will not go through users’ pictures on its cloud-storage servers looking for CSAM images.

Instead, Apple is going in the opposite direction by enabling users to encrypt pictures stored in iCloud Photos.

Apple’s controversial CSAM plan is dead

Apple’s original plan, announced in August 2021, was to use a system called NeuralHash to detect suspected child abuse images in photo libraries uploaded to iCloud. It also planned to employ human reviewers to verify that flagged material was illegal. Any confirmed CSAM images would have been reported to the relevant authorities.

The company’s intentions were good, but the plan faced a barrage of criticism from privacy advocates, rights groups and organizations like the Electronic Frontier Foundation. Even some of Apple’s own employees quietly joined the backlash.

Apple put the plan on hold in December 2021. Now the company has dropped it completely.

The Mac maker gave a statement to Wired that says, in part:

“We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data.”

Building on this decision, Apple launched Advanced Data Protection for iCloud on Wednesday. This brings end-to-end encryption to iCloud Photos so no one but the user can access images stored there. Even Apple cannot.

The feature, which arrives with iOS 16.2 and iPadOS 16.2, will also give users the option to encrypt device backups and Notes stored in iCloud, as well as many other types of data.
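To picture what end-to-end encryption means here, the sketch below shows the general idea in Swift using Apple’s CryptoKit framework. It is purely illustrative and makes no claim about Apple’s actual iCloud implementation: a photo is encrypted on the device with a key that never leaves it, so whatever lands on the server is unreadable without that key.

```swift
import CryptoKit
import Foundation

// Purely illustrative sketch of client-side encryption. This is NOT Apple's
// actual Advanced Data Protection code; it only shows the general idea:
// the key stays on the device, so the server stores only unreadable ciphertext.
func encryptForUpload(_ photo: Data, using deviceKey: SymmetricKey) throws -> Data {
    // AES-GCM seals the photo; the combined form bundles nonce, ciphertext and tag.
    let sealedBox = try AES.GCM.seal(photo, using: deviceKey)
    return sealedBox.combined!
}

func decryptAfterDownload(_ payload: Data, using deviceKey: SymmetricKey) throws -> Data {
    // Only a device that holds `deviceKey` can recover the original photo.
    return try AES.GCM.open(AES.GCM.SealedBox(combined: payload), using: deviceKey)
}

let deviceKey = SymmetricKey(size: .bits256)                  // generated and kept on-device
let photo = Data("pretend this is a JPEG".utf8)               // stand-in for a real photo
let stored = try! encryptForUpload(photo, using: deviceKey)   // what the server would hold
assert(try! decryptAfterDownload(stored, using: deviceKey) == photo)
```

Apple’s real system is far more elaborate, but the property it aims for is the same: without keys held on the user’s devices, the stored data cannot be read.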

Apple still protects children from sexting

The change does not mean Apple has given up on fighting child exploitation. Its statement to Wired also says:

“After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021.”

Using a system built into iOS, an iPhone can detect when a child receives or sends sexually explicit photos through the Messages app and then warn the user. This analysis happens entirely on the handset, not on a remote server, and the messages remain encrypted.
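To illustrate that on-device flow, here is a short, hypothetical Swift sketch. The classifier function is a stand-in, not Apple’s private Communication Safety model, and the code does not reflect Apple’s real Messages implementation; it only shows that the decision to blur a photo and warn the child is made locally.

```swift
import UIKit

// Hypothetical stand-in for Apple's private on-device classifier.
// Illustrative only; not Apple's Communication Safety code.
func looksSexuallyExplicit(_ image: UIImage) -> Bool {
    // In the real feature, a machine-learning model runs here, locally on the handset.
    return false
}

enum IncomingPhotoAction {
    case showNormally
    case blurAndWarn    // Messages obscures the photo and warns the child
}

// The decision is made entirely on-device: neither the photo nor the verdict
// leaves the iPhone, and the message itself stays encrypted.
func handleIncomingPhoto(_ image: UIImage) -> IncomingPhotoAction {
    return looksSexuallyExplicit(image) ? .blurAndWarn : .showNormally
}
```

The point is the design choice the article describes: because the screening happens on the handset, no server ever sees the photo or the result.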

Killian Bell contributed to this article. 
