06/09/2021

The Guardian

Apple delays plan to scan iPhone photos for child sexual abuse

Apple has announced it will delay its plans to begin scanning user images for child sexual abuse material (CSAM) following significant backlash from privacy and information security professionals. In a statement, the company said: “Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material.” The news comes after Apple initially said it would tweak how the scanning system worked, but it now seems those concessions were not enough. The change of heart took only a few weeks, as Apple first announced the CSAM scheme in early August.

UPDATE: 08/09/2021 - In a related article, WIRED asks what Apple can do next. It is unlikely the company can please everyone, and nobody knows how big Apple’s child abuse problem is, but its photo-scanning debacle provides new opportunities to fix it.


Read Full Story
