Apple delays plan to scan iPhone photos for child sexual abuse

06/09/2021 | The Guardian

Apple has announced that it will delay its plans to begin scanning user images for child sexual abuse material (CSAM) following significant backlash from privacy and information security professionals. In a statement, Apple said: “Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material.” The news comes after Apple initially said it would tweak how the scanning system works, but it now appears those concessions were not enough. The change of heart took only a few weeks, as Apple first announced the CSAM scheme in early August.

UPDATE: 08/09/2021 - In a related article, WIRED asks what Apple can do next. It is unlikely the company can please everyone, and nobody knows how big Apple’s child abuse problem is, but the photo-scanning debacle provides new opportunities to address it.



What is this page?

You are reading a summary article on the Privacy Newsfeed, a free resource for DPOs and other professionals with privacy or data protection responsibilities, helping them stay informed of industry news in one place. Each entry is a brief snippet relating to a single piece of original content or to several articles on a common topic or thread. The main contributor is listed in the top left-hand corner, just beneath the article title.

The Privacy Newsfeed monitors over 300 global publications, from which more than 4,350 summary articles have been posted to the online archive dating back to the beginning of 2020. A weekly roundup is available by email every Friday.

Freevacy has been shortlisted in the Best Educator category of the PICCASO Privacy Awards, which recognise the people making an outstanding contribution to this dynamic and fast-growing sector.