The House of Lords Communications and Digital Committee issued a report on March 6 calling on the UK government to protect creative industries from big tech.
Stressing that the government must choose between becoming a “world-leading home for responsible, licensing-based artificial intelligence development” or “drifting” towards acceptance of large-scale use of unlicensed creative content by U.S.-based models, the committee argued that only the first path is in the UK’s interests.
The House of Lords believes, in fact, that it would be a “poor bet” for the UK government to allow changes to copyright that could undermine the UK’s creative industries – which contributed £124 billion (€143.58 billion) to the UK economy in 2023, compared to just £12 billion (€13.90 billion) from AI in 2024.
At the same time, the report suggests that weakening the UK’s copyright law would harm rightsholders and stall the licensing market. Instead, the Committee recommends the government develop a licensing-first regime that can support creators’ livelihoods while encouraging sustainable AI growth.
A hard line toward scraping intellectual property
The controversy over unauthorised scraping has remained at the heart of the AI race, with tools like ChatGPT, DALL-E, and Midjourney drawing criticism for being trained on the work of third-party rightsholders without compensation or consent. Such practices have led to legal action, including The New York Times’ lawsuit against OpenAI and Disney’s lawsuit against Midjourney.
From this perspective, the House of Lords’ report demonstrates an interest in protecting creative industries against scraping by frontier AI companies. The committee was also skeptical that introducing a commercial text and data mining exception for AI training would expand the AI sector in the country.
“Our creative industries face a clear and present danger from uncredited and unremunerated use of copyright material to train AI models. Photographers, musicians, authors and publishers are seeing their work fed into AI models, which then produce imitations that take employment and earning opportunities from the original creators,” committee chair Baroness Keeley said in the official press release.
“The government should now make clear it will not pursue a new text and data mining exception with an opt-out mechanism for training commercial AI models. Instead, it should focus on strengthening UK protections for creators, including against unauthorised digital replicates and ‘in the style of’ uses of creators’ work and identity,” Keeley continued.
Can the UK government rein in AI scraping?
The report puts some pressure on the UK to rein in AI scraping, but it remains up to the government whether to act on its recommendations.
Given that Liz Kendall, Secretary of State for Science, Innovation, and Technology, recently announced a strategic partnership with Google DeepMind, it appears unlikely that the government would make a move that could risk alienating AI vendors operating in the UK.
However, the report could increase pressure on the government to revise its policy. In conversation with 150sec, Mumtaz Kynaston-Pearson, principal legal counsel at global cybersecurity firm Mimecast, argued that the report will intensify pressure on ministers.
“But I’d temper expectations of rapid legislative change. The government has so far favoured a pro-innovation, sector-led approach, prioritising voluntary principles over hard regulation,” she added.
That said, she noted that the Data Use and Access Act 2025 requires both an economic impact assessment and an AI copyright report by March 2026, plus a new collective licensing framework expected later this year, through which stakeholders are likely to see incremental policy shifts.
“The Lords’ intervention has moved the Overton window: outright rejection of creator protections will now be politically harder. I’d expect some substantive action within 12-24 months though primary legislation remains some way off,” Kynaston-Pearson told 150sec.
Initial reactions to the report
The initial response to the House of Lords report has been mixed, split between supporters of AI regulation and its critics:
“They destroy the argument that big tech should be given the country’s creative output for free, and they lay out the case for maintaining and even strengthening existing copyright law to protect creatives from exploitation,” Ed Newton-Rex, CEO of Fairly Trained, a non-profit that certifies training data for use in AI, posted on X.
On the other hand, Kay Jebelli, Senior Director for Europe at the Chamber of Progress, argued that, if adopted, the report “would significantly damage the UK’s AI ambitions” and “amplifies the false narrative that technology and creativity are at odds, and that existing rights holders must be compensated by AI companies for changing industry dynamics.”
There is a natural rift between those who want to see more protections for creatives and those who want to prioritize innovation in the AI industry. In any case, the report puts some pressure on the government to adopt a licensing-first approach to AI training, something that has so far been absent from the market.
Featured image: Via Official Charts.