Apple has announced that it plans to scan iPhones for images of child sexual abuse. The new feature, called neuralMatch, is being trialled in the US and scans images before they are uploaded to iCloud. If it finds a match, a human reviewer is alerted, and authorities are notified where abuse is confirmed. Apple also plans to scan encrypted messages for sexually explicit content as a child safety measure. The new measures have alarmed privacy and security professionals but have drawn praise from child protection groups.
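At its core, the scanning scheme compares a perceptual hash of each image against a database of known-abuse hashes before upload, flagging near matches for human review. The sketch below illustrates that general idea only; it is not Apple's actual NeuralHash algorithm (which uses a neural network), and the names `average_hash`, `KNOWN_HASHES`, and `REVIEW_THRESHOLD` are hypothetical.

```python
# Illustrative sketch of hash-based image matching. This is NOT Apple's
# NeuralHash; it is a toy "average hash" over a small grayscale grid,
# shown only to convey the match-against-a-database idea.

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical stand-in for a database of known-image hashes.
KNOWN_HASHES = {0b1010_1100_0011_0101}

# Hypothetical tolerance: flag images whose hash differs by at most
# this many bits from a known hash (a real system tunes this carefully).
REVIEW_THRESHOLD = 2

def should_flag_for_review(pixels):
    """Return True if the image's hash is close to any known hash."""
    h = average_hash(pixels)
    return any(hamming_distance(h, k) <= REVIEW_THRESHOLD
               for k in KNOWN_HASHES)
```

In a scheme like this, a flagged image is not automatically reported: it only enters a human-review queue, which is the safeguard Apple's announcement emphasises.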
- Malwarebytes provides a simple explanation of how exactly the new technology will work.
- IAPP posts an article containing reactions from the security and privacy communities to Apple’s initiative to combat abuse images.
- Privacy International claims Apple's well-intentioned plan opens the door to mass surveillance.
- The Guardian simply asks, what could go wrong?
- The Independent writes that WhatsApp and privacy campaigners fear the system could be used to scan other types of content.
- The Register reports that Apple’s FAQs explain how it will block governments from subverting the CSAM system.
- Financial Times reports that Apple’s move on child protection raises privacy questions, but may not be a step backwards (£).
- Apple releases full FAQs on its Expanded Protections for Children.