Apple explains decision to cease iCloud CSAM scanning

05/09/2023 | WIRED

Apple has responded to the child safety group Heat Initiative, outlining its reasons for abandoning the development of its iCloud photo-scanning tool for detecting child sexual abuse material (CSAM) on the platform. Instead, Apple is focusing on a set of on-device tools and resources for users known collectively as Communication Safety features. The company's response to Heat Initiative offers a rare look at its broader views on building mechanisms that circumvent user privacy protections, such as encryption, in order to monitor data. Apple's director of user privacy and child safety, Erik Neuenschwander, wrote in the company's response, seen by WIRED, that scanning every user's privately stored iCloud data would create new threat vectors for data thieves to find and exploit, and could open the door to bulk surveillance. Heat Initiative is organising a campaign demanding that Apple "detect, report, and remove" child sexual abuse material from iCloud and offer more tools for users to report CSAM to the company.


What is this page?

You are reading a summary article on the Privacy Newsfeed, a free resource for DPOs and other professionals with privacy or data protection responsibilities, helping them stay informed of industry news in one place. The information here is a brief snippet relating to a single piece of original content or to several articles on a common topic or thread. The main contributor is listed in the top left-hand corner, just beneath the article title.

The Privacy Newsfeed monitors over 300 global publications, from which more than 4,350 summary articles have been posted to the online archive dating back to the beginning of 2020. A weekly roundup is available by email every Friday.
