08/08/2021

The Guardian

UPDATE: Apple to scan iPhones to detect child sexual abuse images

Apple has announced that it plans to scan iPhones for images of child sexual abuse. The new feature, called neuralMatch, is being trialled in the US; it scans images before they are uploaded to iCloud and compares them against a database of known abuse images. If it finds a match, a human reviewer is alerted to assess the image, and the authorities are notified where abuse is confirmed. Apple also plans to scan encrypted messages for sexually explicit content as a child safety measure. The new measures have alarmed privacy and security professionals but have drawn praise from child protection groups.
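In broad terms, the check described above compares a fingerprint of each photo against a database of fingerprints of known abuse images before the photo is uploaded. The sketch below is purely illustrative and is not Apple's neuralMatch code: the function names and placeholder hash values are invented, and an exact SHA-256 hash stands in for the perceptual hash a real system would use.

    import hashlib

    # Hypothetical database of fingerprints of known abuse images (placeholder values).
    KNOWN_IMAGE_HASHES = {
        "placeholder_hash_of_known_image_1",
        "placeholder_hash_of_known_image_2",
    }

    def image_fingerprint(image_bytes: bytes) -> str:
        # A real system would use a perceptual hash that survives resizing and
        # re-encoding; SHA-256 is used here only to keep the sketch simple.
        return hashlib.sha256(image_bytes).hexdigest()

    def flag_for_review(image_bytes: bytes) -> bool:
        # True if the fingerprint matches the known database, i.e. the image
        # would be queued for human review before any report is made.
        return image_fingerprint(image_bytes) in KNOWN_IMAGE_HASHES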

Additional commentary:

  • Malwarebytes provides a simple explanation of exactly how the new technology will work.
  • IAPP posts an article collecting reactions from the security and privacy communities to Apple's initiative to combat abuse images.
  • Privacy International claims Apple's well-intentioned plan opens the door to mass surveillance.

UPDATE: 070821

UPDATE: 090821

  • The Independent writes that WhatsApp and privacy campaigners fear the system could be used to scan other types of content.
  • The Register reports that Apple's FAQs explain how it will block governments from subverting the CSAM system.
  • Financial Times reports that Apple’s move on child protection raises privacy questions, but may not be a step backwards (£).
  • Apple releases full FAQs on its Expanded Protections for Children. 
