The Online Safety Act 2023 (OSA), which received Royal Assent last week, is likely to bring more than 100,000 online services within scope of the new law. According to analysis from Pinsent Masons, Ofcom, the regulator responsible for enforcing the new regime, has stated that different safety measures will be appropriate for different types of service, with its recommendations depending on each service's size and degree of risk.
In a new paper, the regulator confirmed that it will publish a series of draft codes of practice and guidance on the 'illegal harms' duties on 9 November, as part of the first phase of its work to implement the new rules. It will then consult on the draft codes and guidance.
The second phase, concerning child safety, pornography and the protection of women and girls, will include the publication of draft guidance on age assurance in December, with further consultations in the new year.
The third phase, relating to transparency, user empowerment and other duties on categorised services, is likely to affect only a "small proportion of regulated services". Ofcom says it will issue a call for evidence next year on its approach, followed by a further consultation on draft transparency guidance in mid-2024.
What is this page?
You are reading a summary article on the Privacy Newsfeed, a free resource for DPOs and other professionals with privacy or data protection responsibilities, helping them stay informed of industry news in one place. The information here is a brief summary of a single piece of original content, or of several articles on a common topic or thread. The main contributor is listed in the top left-hand corner, just beneath the article title.
The Privacy Newsfeed monitors over 300 global publications, from which more than 4,350 summary articles have been posted to the online archive dating back to the beginning of 2020. A weekly roundup is available by email every Friday.