News
Apple Kills Its Plan to Scan Your Photos for CSAM. Here’s What’s Next

Published 2 months ago by New Yorker
In August 2021, Apple announced a plan to scan photos users stored in iCloud for child sexual abuse material (CSAM). The scheme was meant to be privacy-preserving and allow the company to flag potentially problematic and abusive content without revealing anything else. But the initiative was controversial and soon drew widespread criticism from privacy and security researchers and digital rights groups for the potential that the surveillance capability could itself be abused to undermine the privacy and security of iCloud users around the world. At the beginning of September 2021, Apple said it would pause the rollout of the feature to “collect input and make improvements before releasing these critically important child safety features.” In other words, a launch was still coming. Now, though, the company says that in response to the feedback and guidance it received, the CSAM detection tool for iCloud photos is dead.
Instead, Apple told WIRED this week, it is focusing its anti-CSAM efforts and investments on its “Communication Safety” features that the company also initially announced in August 2021 and launched last December. Parents and caregivers can opt into the protections through family iCloud accounts. They work in Siri, Apple’s Spotlight search, and Safari Search to warn if someone is looking at or searching for child sexual abuse materials and provide resources on the spot to report the content and seek help. Additionally, the core of the protection is Communication Safety for Messages, which caregivers can set up to provide a warning and resources to children if they receive or attempt to send photos that contain nudity. The goal is to stop child exploitation before it happens or becomes entrenched and reduce the creation of new CSAM.
“After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021,” the company told WIRED in a statement. “We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.”
Apple’s CSAM update comes alongside its announcement today that the company is vastly expanding its end-to-end encryption offerings for iCloud, including adding the protection for backups and photos stored on the cloud service. Child-safety experts and technologists working to combat CSAM have often opposed broader deployment of end-to-end encryption because it makes user data inaccessible to tech companies, making it more difficult for them to scan and flag CSAM. Law enforcement agencies around the world have similarly cited the dire problem of child sexual abuse in opposing the use and expansion of end-to-end encryption, though many of these agencies have historically been hostile toward end-to-end encryption in general because it can make some investigations more challenging. Research has consistently shown, though, that end-to-end encryption is a vital safety tool for protecting human rights and that the downsides of its implementation do not outweigh its benefits.
Communication Safety for Messages is opt-in and analyzes image attachments users send and receive on the user’s device to determine if a photo contains nudity. The feature is designed so Apple never gets access to the messages, the end-to-end encryption Messages offers is never broken, and Apple doesn’t even learn that a device has detected nudity.
The company told WIRED that while it is not ready to announce a specific timeline for expanding its Communication Safety features, it is working on adding the ability to detect nudity in videos sent through Messages when the protection is enabled. It also plans to expand the offering beyond Messages to its other communication applications. Ultimately, the goal is to let third-party developers incorporate the Communication Safety tools into their own applications. The more widely the features proliferate, Apple says, the more likely it is that children will get the information and support they need before they are exploited.
“Potential child exploitation can be interrupted before it happens by providing opt-in tools for parents to help protect their children from unsafe communications,” the company said in its statement. “Apple is dedicated to developing innovative privacy-preserving solutions to combat Child Sexual Abuse Material and protect children, while addressing the unique privacy needs of personal communications and data storage.”
Like other companies that have grappled publicly with how to address CSAM, including Meta, Apple told WIRED that it also plans to continue working with child safety experts to make it as easy as possible for its users to report exploitative content and situations to advocacy organizations and law enforcement.
Countering CSAM is a complicated and nuanced endeavor with extremely high stakes for kids around the world, and it’s still unknown at this point how much traction Apple’s bet on proactive intervention will get. Tech giants are walking a fine line, though, as they work to find a balance between CSAM detection and user privacy.
Source: Wired
