Comprehensive UK AI bill delayed again over copyright rules
06/03/2026 | Financial Times
The UK government has decided to delay contentious changes to copyright rules that would have allowed artificial intelligence (AI) companies to mine media content. Following a two-month consultation and significant backlash from the creative industries, ministers have abandoned proposed regulatory models and will instead gather further evidence. As a consequence, the proposed comprehensive AI bill, originally planned for 2025 and expected in the upcoming King's Speech in May, will now be pushed to 2027.
The dispute centres on whether copyrighted material should be automatically available for AI training. Tech firms, including Alphabet, have advocated for an opt-out system. However, publishers and filmmakers, such as Working Title Films co-chair Eric Fellner, have described this model as an existential threat. Creative industry protests, including the release of a silent album to highlight intellectual property concerns, previously forced a ministerial rethink.
Media executives have recently warned against granting copyright exemptions to AI companies, similar to those granted to academic researchers. Many creators prefer to negotiate individual content-licensing deals rather than be bound by government compromises.
A new House of Lords Communications and Digital Committee report published on Friday, 6 March 2026, supports this stance. Committee Chair Baroness Keeley said: "Our creative industries face a clear and present danger from uncredited and unremunerated use of copyrighted material to train AI models. Photographers, musicians, authors and publishers are seeing their work fed into AI models which then produce imitations that take employment and earning opportunities from the original creators.
"AI may contribute to our future economic growth, but the UK creative industries create jobs and economic value now... The Government should now make clear it will not pursue a new text and data mining exception with an opt-out mechanism for training commercial AI models. Instead, it should focus on strengthening UK protections for creators, including against unauthorised digital replicas and 'in the style of' uses of creators' work and identity. The Government's task should be to create the conditions that will allow a licensing-first approach to AI training to flourish, backed by effective transparency requirements and technical standards for data provenance and labelling, so that rightsholders and developers can participate confidently in this emerging market."
(£) The main article requires a subscription.
Training Announcement: The BCS Foundation Certificate in AI examines the challenges and risks associated with AI projects, such as those relating to privacy, transparency and potential biases in algorithms that could lead to unintended consequences. It explores the role of data, effective risk management strategies, compliance requirements and ongoing governance across the AI lifecycle, leading to certification as an AI Governance professional.
What is this page?
You are reading a summary article on the Privacy Newsfeed, a free resource for DPOs and other professionals with privacy or data protection responsibilities, helping them stay informed of industry news in one place. The information here is a brief snippet relating to a single piece of original content or to several articles on a common topic or thread. The main contributor is listed in the top left-hand corner, just beneath the article title.
The Privacy Newsfeed monitors over 300 global publications, of which more than 6,250 summary articles have been posted to the online archive dating back to the beginning of 2020. A weekly roundup is available by email every Friday.