16/08/2021

Reuters

Apple alters course on CSAM scanning over privacy concerns

Following considerable objections and concerns about its plans to search iPhones for images of child sexual abuse, Apple has confirmed that it will now scan only for CSAM images flagged by clearinghouses in multiple countries. Researchers can check that the image identifiers are universal, proving that the system cannot be adapted to target specific individuals. The company also said it will take 30 matched CSAM images before the system prompts Apple for a human review.
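The two safeguards described above can be sketched as follows. This is an illustrative approximation only: Apple's real system uses perceptual hashing (NeuralHash) with cryptographic private set intersection on-device, and the function names and data shapes here are assumptions, not Apple's API.

```python
# Illustrative sketch, NOT Apple's implementation: real matching uses
# NeuralHash and threshold secret sharing. Names here are hypothetical.

REVIEW_THRESHOLD = 30  # matches required before human review, per Apple


def build_scan_set(*clearinghouse_lists):
    """Keep only hashes flagged by clearinghouses in multiple countries:
    a hash submitted by just one jurisdiction is never scanned for."""
    sets = [set(lst) for lst in clearinghouse_lists]
    return set.intersection(*sets)


def needs_human_review(device_hashes, scan_set, threshold=REVIEW_THRESHOLD):
    """True only once the number of matched images reaches the threshold,
    so a handful of false positives cannot trigger a review on their own."""
    matches = sum(1 for h in device_hashes if h in scan_set)
    return matches >= threshold
```

The intersection step is what makes the identifiers "universal": no single government's clearinghouse can unilaterally insert a hash to surveil a target.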

