Out-Law News

‘Workable’ AI copyright solutions lacking amidst UK policy ‘reset’


Lisa Nandy, left, and Liz Kendall are responsible for deciding the UK’s AI copyright policy. Dan Kitwood/Getty Images


‘Workable’ solutions that enable transparency over the content and data used to train AI models, and that allow rightsholders to opt their works out from being used for that purpose, have yet to be found, UK ministers have admitted, despite the government previously indicating a preference for those measures to be built into a package of AI-related reforms to UK copyright law.

During an evidence session before the House of Lords’ Communications and Digital Committee last week, secretaries of state Liz Kendall and Lisa Nandy said the government had been wrong to express a preference for a particular option for reform. Nandy, the UK culture secretary, described it as a “mistake”, while Kendall, the technology secretary, said the government is “having a genuine reset moment”.

Kendall said: “We are genuinely trying to find a way forward that seeks to back and champion our incredible world-leading creative industry – the people who work in it – because it is so important, not just for our economy but for our country, but that also manages to seize the potential opportunities that AI brings, not just for jobs and growth – though that is vital – but for all the huge benefits, the positive ways that AI can be used.”

“Not everyone will get everything, that’s the nature of this, but we have to do a package, because picking out one bit without another, without a comprehensive look, that is what we have committed to… We have to have something that goes across the board here… a way forward that delivers for both sides,” she added.

The AI copyright consultation

In December 2024, the government opened a consultation on AI and copyright with a view to helping rightsholders control their content and be remunerated for its use while enabling AI developers to obtain “lawful access to high-quality data”. At the time it said it also wanted to promote greater trust and transparency between the AI industry and the creative sector, with many rightsholders concerned that their works are being used to train AI models without their knowledge and consent or fair remuneration. AI developers have rejected suggestions that their activities infringe UK copyright law.

The government sought feedback on a range of options but expressed an initial preference for reform built around the idea of a rightsholder ‘opt out’. Under that option, the existing text and data mining exception in UK copyright law would be extended to enable the mining of content for AI training purposes but this would be coupled with mechanisms to enable rightsholders to opt their content out from being used in that way. Underpinning it all would be measures requiring AI developers to be transparent about the works they train their models on, so rightsholders – either individually or collectively – could “easily reserve their rights”.

The proposals met with significant opposition from businesses – particularly those in the creative industries – and led to attempts to force the government to introduce new AI-related copyright protections into UK law last year. Those efforts failed, but by way of compromise the government committed to a statutory timetable around AI copyright reform and has since set up technical working groups involving representatives from industry on both sides to explore what is practical.

The government has until 18 March 2026 to provide a substantive update in relation to next steps towards reform. A progress report it published last month shed little light on what proposals it might bring forward, but the comments Kendall, Nandy and other government officials made to the Committee offer some clues as to the government's latest thinking.

Opt-out option challenges

Regarding the proposal for an opt out, Kendall acknowledged that the government’s initial proposals were “not a terribly popular preferred option”. She said the government had “heard loud and clear” why it “has been rejected”.

Difficulties in implementing any opt out mechanism have been surfaced through the working groups, according to Oliver Ilott, interim director general for AI at the Department for Science, Innovation and Technology. He said policymakers in the EU, India and Australia are also currently considering how to overcome challenges associated with operating an opt out approach – and implied that the UK government would only make provision for an opt out mechanism in UK law if it also decided to go ahead with expanding the text and data mining exception.

“The UK doesn’t currently have an opt out approach because we don’t have a broad exemption,” Ilott said. “Opt outs are relevant where countries have gone down the route of creating an exemption and the opt out plays into that.”

“When it comes to the implementation of those, we’ve heard lots of views – one view we’ve heard expressed is that the opt out system places a regulatory burden on the rightsholder, because they have to exercise that opt out and if you are a smaller operator … that puts some burden on you to go through and make sure that [opt out] is expressed properly and this might be technical; you would need some sort of interface there to help you do that,” he said.

“The other thing we’ve heard is that if the opt out process becomes extremely straightforward then everyone just withdraws their material from the market. If you went down the route of creating an exemption it would be because you wanted people to access the material for training and if everyone withdraws you’ve undermined the primary policy goal you’ve set out there,” he said.

“The other thing we have to bear in mind is … there are technical challenges in figuring out ‘does an opt out apply to this?’. For example, if I wrote and published a blog that was a book review and in the course of that I was quoting from a novel, there is nothing in my blog which creates metadata that says ‘this is my paragraph, this is the author’s paragraph, this is my paragraph coming back through again’, so if that author has exercised an opt out somewhere, how do you know that it attaches to the blog that I might then have published?” Ilott said.

Nandy added that the challenges around an opt out “hadn’t [been] anticipated or fully understood” by the government before it indicated its preference for such a mechanism. She said the government doesn’t “currently know how to surmount” those challenges.

“It doesn’t mean that they are insurmountable, but we don’t currently have the answers to those, so at the moment we don’t have a workable opt-out proposal on the table,” Nandy said, adding that the government views it as the role of industry “to find technical solutions to some of these challenges”.

Issues around enabling transparency too

Problems in finding practical solutions to enable transparency over the use of copyrighted material in AI training were also highlighted to the Committee.

Nandy said: “There are insufficient transparency tools at the moment for some of the transparency requirements that we as a government have committed to and one of the things we are doing is working with industry urging them to try and find the solutions to that.”

According to Kendall, the working groups are due to meet again in February.

Interventions over licensing?

While the government is pushing some responsibility for finding solutions onto industry, the ministers acknowledged that the government also has important roles to play in relation to reform. Nandy said a degree of consensus is emerging on where there should be a “demarcation” of roles, confirming that the government will legislate on the issue of transparency but generally avoid “intervening overly in licensing deals which industry is already coming together to reach.”

However, Nandy said the government could still intervene on licensing issues if it feels “smaller players” are being disadvantaged.

“A lot of the deals that have been done work well for the bigger players; they don’t necessarily work well for the smaller players,” she said. “We’re as concerned about them as we are about the bigger players, not least because the creative industries are an ecosystem – your Ed Sheerans start somewhere – and so we have got to make sure we are protecting those people as well. I think that is a role for government in stepping in where those deals are not necessarily serving the creative industries as a whole and making sure that we make this work for tech companies and for creatives.”

Calls for urgency acknowledged

While the government is committed to legislating around transparency, it is less clear what other interventions it will pursue. Kendall said all the issues, including around transparency, licensing, and the expansion and practicality of new copyright exceptions, must “work together”.

According to Nandy, the responses to the AI copyright consultation have taught the government that “there are very different implications for different parts of the creative industries from the different models” of reform that could be pursued. That, she said, requires the government to “take a far more nuanced approach” and “work with different parts of the creative industries to address their very serious and in some cases existential challenge that is posed by the current system let alone any changes that we might make”.

“I think we have also learned from looking at the experience of other countries … is that there is no perfect solution to this… There is no model out there in the world to follow that has got this completely right yet,” she added.

Nandy acknowledged the call from industry on both sides for urgent clarification of the law, but said the government wants to avoid ill-thought-through legislation: “We’re keen to be able to provide clarity around this as soon as possible… But … if we rush into this and we get it wrong, we could make a mess, so we are not going to rush into it – we are going to take the time to work through with the working groups, but we appreciate there is an urgency around this and we want to move as quickly as possible.”

“We want to work together to find a solution that we know can hold and legislate or regulate in order to underpin that,” she added.
