ORG calls for stricter regulation of age assurance industry
23/07/2025 | Open Rights Group
The Open Rights Group (ORG) has warned that UK citizens face significant privacy and security risks as online platforms begin implementing age verification checks, mandated by the Online Safety Act 2023 (OSA) from 25 July 2025. The checks are intended to prevent platforms from exposing under-18s to adult and other potentially harmful content.
ORG highlights several concerns: the absence of a public register of age assurance providers, the lack of specific privacy and security standards for these providers beyond existing data protection obligations, and no requirement for platforms to select trusted or certified providers. James Baker, ORG Programme Manager, said that the public is being forced to hand over sensitive personal data to unregulated providers, increasing the risk of phishing and hacking attacks.
In a related article, ORG advocates for interoperable age assurance methods, allowing users to choose from trusted providers.
The article also highlights that current data protection enforcement is insufficient to deter companies from prioritising cost over privacy. As such, ORG is calling for a mandatory scheme to ensure age verification systems meet high data protection and security standards.
While Ofcom does not have the power to set data protection standards, ORG points out that Ofcom and the Information Commissioner's Office (ICO) can work together to develop a voluntary standard that would ensure providers comply with their privacy obligations under Section 22(3) of the OSA. ORG also claims the UK lags behind other European jurisdictions in addressing these concerns.
According to the Financial Times (£), Ofcom has said that it will monitor compliance, particularly from platforms like Facebook, Instagram, and Google-owned YouTube, which claim to ban adult content. The regulator has recommended "highly effective" age verification tools, such as bank or credit card checks, photo ID scans, and facial feature analysis. However, many social media giants, including Meta and YouTube, are opting for artificial intelligence (AI)-based systems that "infer" user age from behaviour. While these AI tools are not yet endorsed by Ofcom, the regulator will assess their effectiveness soon. The legislation requires platforms to remove all adult content, create child-safe versions, or implement strict age verification methods.
Oliver Griffiths, Ofcom's group director for online safety, noted a rush to implement changes and stated that enforcement action would be swift if companies fail to meet requirements. Elon Musk's X, TikTok, and Reddit have also introduced new age assurance systems. Reddit is using Persona, a facial recognition software provider, to verify UK users who wish to view mature content, while TikTok is enhancing its age assurance technologies. Critics of the law suggest that teenagers might use VPNs to bypass the rules, but Griffiths emphasised that the focus is on preventing younger children from accidentally encountering harmful content, acknowledging the system isn't "foolproof" against determined teenagers.
£ - This Financial Times article requires a subscription.
What is this page?
You are reading a summary article on the Privacy Newsfeed, a free resource for DPOs and other professionals with privacy or data protection responsibilities, helping them stay informed of industry news all in one place. The information here is a brief snippet relating to a single piece of original content or several articles about a common topic or thread. The main contributor is listed in the top left-hand corner, just beneath the article title.
The Privacy Newsfeed monitors over 300 global publications, of which more than 6,250 summary articles have been posted to the online archive dating back to the beginning of 2020. A weekly roundup is available by email every Friday.