ChatGPT to add age verification to identify under-18 users
17/09/2025 | The Guardian
OpenAI has announced that it will restrict how ChatGPT responds to users it suspects are under 18. According to CEO Sam Altman, the company is building an age-prediction system that will default to an under-18 experience if a user’s age is in doubt, and may ask some users for ID to verify their age.
Altman said the company is prioritising “safety ahead of privacy and freedom for teens” and believes the compromise on privacy is a “worthy tradeoff.” For users identified as under 18, ChatGPT will be trained to block inappropriate content and to avoid discussions of suicide or self-harm. Where a user expresses suicidal ideation, the company says it will attempt to contact their parents or the authorities.