Out-Law News

UK AI copyright report reveals little progress


It is likely to be March before the UK’s future AI copyright policy becomes clearer. Dan Kitwood/Getty Images.


There are no new indications as to what steps the UK government will take to update copyright law to account for the AI age, experts in intellectual property law have said following publication of a progress report.

Cerys Wyn Davies and Gill Dennis of Pinsent Masons said content creators and AI developers are likely to have to wait until mid-March for meaningful signs of the direction government policy on AI and copyright will take. They highlighted, however, that the progress report does provide an insight into the polarised views on the issue and the challenges the government faces in delivering reform that strikes a fair balance between the respective interests.

The government has until 18 March 2026 to prepare and publish both an economic impact assessment in respect of potential AI-related copyright reform and a report regarding the options for reform and its proposals on various AI and copyright issues – including matters such as measures for controlling AI developers’ access to and use of copyrighted works, transparency regarding such use, licensing, and enforcement.

Those statutory requirements arise under the Data (Use and Access) Act (DUAA), which came into force earlier this year. The timetable for action – which included publishing the progress report within six months of Royal Assent – reflects an awkward compromise: the government committed to a statutory timetable for driving forward AI-related copyright reform in return for law makers dropping their insistence that the DUAA itself provide new AI-related copyright protections, amidst their concerns that the status quo is damaging to content creators.

Many content creators, such as publishers, authors and musicians, are concerned that AI developers are using their copyright works to train their AI models and inform the output those models produce. They want greater transparency over when AI developers seek to use their content as well as more control over whether to enable access to it – and, if so, to be remunerated in return. They argue that existing copyright protections in UK law are being ignored by AI developers and want the government to intervene.

AI developers, however, reject assertions that their activities are copyright infringing. They want the government to loosen, not tighten, restrictions on access to data and point to the potential of AI to deliver improved economic, social, health and environmental outcomes as the prize on offer for supporting AI development.

The government opened a consultation on potential AI copyright reform late last year in which it expressed a preference for an option that would, if implemented, recalibrate the existing text and data mining exception in UK copyright law to facilitate the training of AI models using copyrighted material, but only if rights holders do not opt their content out from being used in that way. Greater transparency over the use of copyrighted material to train AI is envisaged by the government to help underpin how this new regime would work.

According to the government’s new progress report, however, this option was favoured by just 3% of the more than 11,500 respondents to its AI copyright consultation. Instead, 88% of respondents said they would prefer the government to pursue a different option under which UK copyright law would require AI developers to obtain a licence in all cases to use copyrighted content for the purpose of training their AI models, an option the government previously said would likely harm the UK’s global competitiveness in AI development.

The government, which has promised to “provide a detailed summary of consultation responses on each of the options” it consulted on in its March 2026 report, said the pattern of preferences expressed in response to its consultation “partly reflects” the “large response” there had been from individual creators and the creative industries.

In line with the statutory timetable it is obliged to follow, the government has established working groups to provide a forum through which representatives from the technology industry and the creative industries can work together to identify “practical solutions” to policy questions pertaining to AI and copyright.

The government’s progress report revealed that three initial roundtable meetings were held in July and September and that a further four meetings – one for each of the four expert technical working groups – were held in late November and early December.

One of the working groups is looking at tools and standards to support content creators in exercising control over when their works are used. The government said this group is also exploring the effectiveness of such measures and “whether legislation or guidance is needed to drive adoption and compliance”.

Another of the working groups is looking into “what legal transparency duties could include”, while the separate licensing group is “considering the strengths and weaknesses of the current licensing framework, especially for smaller creators and is looking at potential ways to facilitate licensing”. The fourth working group is looking into “additional protections for the creative industries”.

A separate parliamentary working group has also been formed and has since held two meetings, one in the House of Commons and one in the House of Lords. The government has said the views of the parliamentary working group and the work of the technical groups will feed into its March 2026 report.

Cerys Wyn Davies of Pinsent Masons said: “The government is trying to tread a very fine line with its attempts to balance the respective interests of the creative industries and AI developers. It has put a great deal of onus on the two different sides not only reaching a form of consensus on issues that have to-date been highly polarising, but on them finding new solutions to technical problems that might constrain how any new policy on AI and copyright in the UK is implemented in practice.”

“I understand there has been some progress made on technical solutions that would enable content creators to exercise an effective opt-out of their works from AI training, but there are remaining questions over the cost and administrative burdens associated. On licensing too, the UK’s Copyright Licensing Agency claims to have developed a ready-made solution – if the government were to insist on its use. In relation to transparency, it remains to be seen how granular any new UK duties are made. In the EU, policymakers have suggested that AI developers do not need to disclose ‘the details for the specific data and works used to train the model’ to meet copyright-related disclosure obligations arising under the EU AI Act,” she said.

Gill Dennis, also of Pinsent Masons, added: “In the meantime, the market is not staying still.”

“On the one hand, there are a growing number of licensing agreements between AI developers and the big content creators – Disney’s deal with OpenAI is a recent example of this. However, these isolated agreements do not address the wider policy questions or the interests of individual creatives. On the other hand, the courts are being asked to intervene. The failure of Getty Images’ copyright case against Stability AI before the High Court in London is not the end of the matter; there is other litigation ongoing between content creators and AI developers globally and, among other things, we expect the Court of Justice of the EU (CJEU) in due course to provide clarification on how EU copyright protections enjoyed by press publishers apply in the context of gen-AI,” she said.
