Out-Law News 7 min. read
04 Nov 2025, 2:33 pm
Content creators and publishers can only succeed with claims of secondary copyright infringement against AI developers in the UK if AI systems trained using their content store or reproduce their works, the High Court in London has ruled.
Clarification on the point was provided in a landmark ruling on Tuesday (205-page / 2.6MB PDF) in which the court dismissed copyright claims raised by media company Getty Images against AI developer Stability AI but upheld some claims of trade mark infringement it asserted.
The trade mark claims arose from sample testing Getty carried out, which found its watermark displayed in some of the outputs produced by Stability AI’s ‘Stable Diffusion’ system.
In a statement, Getty hailed the ruling as a “significant win for intellectual property owners” but called on the UK government to do more to support content creators in protecting their rights. Stability AI said it was pleased with the ruling.
Tuesday’s ruling comes nearly three years after Getty launched legal proceedings against Stability AI before the High Court. Getty’s case was centred on Stable Diffusion, which produces AI-generated images based on user prompts. It accused Stability AI of infringing its intellectual property rights – claims Stability AI denied.
While Stability AI admitted that at least some of the images Stable Diffusion was trained on came from Getty’s website, Getty dropped its claims of primary copyright infringement before the High Court after it was unable to provide evidence that Stability AI had carried out any acts of unauthorised copying in the UK. By the end of the trial, which took place in June this year, its case was predominantly focused on its claims of secondary copyright infringement.
While primary copyright infringement refers to acts such as unauthorised copying or communication to the public, the concept of secondary copyright infringement addresses secondary acts, such as importing, marketing, selling and distributing. Getty’s secondary copyright infringement claims were brought under sections 22, 23(a) and 23(b) of the Copyright, Designs and Patents Act 1988 (CDPA).
Under section 22, copyright infringement is said to occur where a person, without a requisite licence, imports into the UK, otherwise than for their private and domestic use, an article which is, and which they know or have reason to believe is, an infringing copy of a copyright work.
Sections 23(a) and 23(b) provide that copyright infringement occurs where a person, without a requisite licence, either possesses in the course of a business, or sells or lets for hire, or offers or exposes for sale or hire, an article which is, and which they know or have reason to believe is, an infringing copy of the work.
To succeed with its secondary copyright infringement claims, Getty needed to convince the High Court of three things: that Stable Diffusion was either imported into the UK or possessed, sold, let, hired, or offered or exposed for sale or hire, by Stability AI; that Stable Diffusion is both an ‘article’ and an ‘infringing copy’; and that Stability AI knew or had reason to believe that Stable Diffusion was an infringing copy. In her ruling, judge Mrs Justice Joanna Smith DBE only part-accepted arguments that Getty advanced in this regard.
Regarding how the law on secondary copyright infringement under the CDPA should be interpreted, the judge did side with Getty on what the concept of ‘article’ should be understood to cover, rejecting Stability AI’s arguments that the concept only covers tangible objects. She said that “an article, which must be an infringing copy, is capable of being an electronic copy stored in intangible form”.
In theory, this finding means it is possible for AI systems or other software to be targeted by claims of secondary copyright infringement in the UK. However, the judge said such claims will only succeed if those ‘articles’ themselves constitute ‘an infringing copy’. She determined that this was not the case with Stable Diffusion.
Intellectual property law expert Gill Dennis of Pinsent Masons said: “The judge ruled that although an intangible electronic copy can be an ‘article’ for the purposes of secondary copyright infringement by importation into the UK, which would encompass an AI tool, it will not also be an ‘infringing copy’ unless the AI model itself has at some point contained a copy – permanent or transient – of the copyright works used to train it. The fact that its development involved the reproduction of copyright works does not alter that conclusion.”
AI, data and copyright expert Cerys Wyn Davies of Pinsent Masons added: “In this case, as there was no evidence of copying in the UK for the purposes of the training of the AI model, the copying for the purposes of training did not amount to copyright infringement in the UK. Therefore, following this decision, if an AI developer uses a training process which does not involve the tool itself storing or reproducing the data on which it was trained, then the developer will be able to circumvent copyright protection in the UK unless the copying for the purposes of the training takes place in the UK. This does not preclude copyright infringement actions in other jurisdictions where copying has taken place for the purposes of training, pursuant to international copyright conventions.”
Getty said it will be taking forward findings of fact from the UK ruling in the separate US case it has raised against Stability AI. However, it called on the UK government to take action to help content creators protect their works.
“Beyond the specifics of the decision, we remain deeply concerned that even well-resourced companies such as Getty Images face significant challenges in protecting their creative works given the lack of transparency requirements,” Getty Images said. “We invested millions of pounds to reach this point with only one provider that we need to continue to pursue in another venue. We urge governments, including the UK, to establish stronger transparency rules which are essential to prevent costly legal battles and to allow creators to protect their rights.”
The UK government consulted on possible changes to UK copyright law for the AI age last winter amidst polarising views over the use of copyright works in the training of AI models.
Representatives from across the creative industries have expressed concern over what they see as a lack of transparency around the use of their works in the training of AI models, and over a lack of fair remuneration in that respect. For their part, AI developers reject assertions that their activities are infringing and want the government to loosen, not tighten, restrictions on access to data, citing the potential of AI to deliver improved economic, social, health and environmental outcomes as the prize on offer for supporting AI development.
The government’s preferred option for reform, stated in its consultation paper, would make it easier for AI developers to use copyrighted works in AI training. However, it was met with significant pushback. Some peers in the House of Lords, concerned that the government was not acting quickly enough to address the harm they perceive to be arising from the market status quo, sought to add new AI-related copyright protections to the Data (Use and Access) Bill, which was passing through parliament earlier this year. The government resisted those pressures but was forced into making some concessions around a timeline for action on progressing the development of its AI copyright policy.
The government has since established working groups to help identify practical solutions on matters of transparency and control of rights, but a substantive progress report from the government is not anticipated until nearer Christmas. Previous government attempts to achieve a voluntary industry consensus on AI copyright issues failed.
Dennis said: “Following this ruling, the ball is now firmly in the government’s court to develop clear policy as soon as possible.”
A spokesperson for the UK government told Pinsent Masons: "We are aware of the judgment in this case. We recognise how important decisions on artificial intelligence and copyright are to individuals and businesses across the creative industries and AI sector, and are committed to developing an approach that allows both to thrive."
The UK government is not formulating its policy on AI and copyright in a vacuum. While it faces calls to protect the value the creative industries bring, not just to the economy but to the UK’s social and cultural fabric, it is under considerable pressure to adopt an AI-innovation-friendly approach to UK law and regulation – not only because it sees this as a route to economic growth, but because of political pressure to retain close ties with the Trump administration in the US.
Other policymakers globally have also been grappling with the question of how best to balance the interests of content creators and AI developers – including in the EU.
Frankfurt-based AI and copyright expert Dr Nils Rauer of Pinsent Masons said: “The High Court’s decision is also of interest from a European perspective. EU copyright law and UK copyright law differ to some extent – for example, the definition of an ‘article’ is of paramount importance under UK law and the considerations of the judge therefore formed a substantial part of this decision, but this would not be the case from a European point of view. However, what is highly instructive across borders is the impact that the High Court’s reasoning has on the duty of care and the respective liability of operators of generative AI models.”
“In the light of today’s judgment, there is no primary liability of the provider as regards the output but what remains open is the question whether existing consideration regarding the liability of host providers could be applied to establish any secondary liability. To avoid such liability, providers of AI tools should apply proportionate and appropriately designed mechanisms preventing unlawful content being used throughout the training phase,” he said.
In a statement, Stability AI’s general counsel, Christian Dowell, said: “Getty’s decision to voluntarily dismiss most of its copyright claims at the conclusion of trial testimony left only a subset of claims before the court, and this final ruling ultimately resolves the copyright concerns that were the core issue. We are grateful for the time and effort the court has put forth to resolve the important questions in this case.”
Editor’s Note, 04/11/2025: This article has been updated to include the statement from the UK government.