The landmark legal battle between Getty Images and Stability AI in the English High Court marks one of the first major copyright challenges in the generative AI space. The claim raises fundamental issues concerning copyright, database rights and trademark infringement in the context of emerging technologies. This article focuses on the copyright aspects of the claim and the potential implications for two key UK industries.
The Claim
Stability AI is a developer of open-source generative AI models. It is primarily known for a large text-to-image diffusion model named Stable Diffusion, which produces photorealistic images based on textual user prompts.
In January 2023, Getty issued a claim against Stability AI in the English High Court. The proceedings included:
- A claim for copyright infringement arising from the allegation that Stable Diffusion’s training datasets included copyrighted works that were downloaded onto servers located in the United Kingdom.
- A claim for secondary infringement of the said copyrighted works, arising from the alleged importation into the UK of the pre-trained Stable Diffusion model.
- A claim that the outputs of Stable Diffusion, in the form of synthetic images (which are accessible to users in the UK), also constitute an infringement by reproducing a substantial part of Getty’s copyrighted works.
This is not the first time proceedings have been brought against Stability AI. In January 2023, artists Sarah Andersen, Kelly McKernan and Karla Ortiz filed a copyright infringement lawsuit in the US District Court, alleging that Stability AI and others had infringed the rights of millions of artists by training AI tools on over five billion images scraped from the web without their consent. While parts of the claim were dismissed, certain aspects concerning complex copyright issues were allowed to proceed and remain ongoing.
Stability AI’s Defence
Stability AI have sought to resist the claim on several grounds:
- Stability AI deny that the Stable Diffusion model was trained in the United Kingdom. They instead state that all training took place on servers in the United States, such that UK copyright law (which is territorially limited) does not apply.
- Stability AI also allege that various “claim example” images adduced by Getty in support of its infringement claim are “weak and contrived”. They assert that no specific image from the training dataset would ever be used to produce a response to text prompts; instead, the model “produces variable image outputs even for the same or similar text prompts”. It therefore cannot be the case that “any particular output can be generated from any particular prompt”.
- Stability AI assert that any resemblance to a copyrighted image, or any watermark replicated in AI-generated content, is a result of user prompting – not the model itself. Stable Diffusion allows users to upload and manipulate images, and the degree to which images are altered by the model is entirely within the control of the user. Should a user upload (and then seek to manipulate) a copyrighted image, any infringement arising from this would be the responsibility of the user – not Stability AI.
Trial on liability issues began on 9 June 2025. This is set to focus on Getty-owned copyright issues only[1]. Over the course of the trial, the Court is due to consider a number of significant factual and legal issues, including the legal implications of the data-scraping exercise allegedly conducted by Stability AI and Stability AI’s jurisdictional defences.
The Implications
AI remains a controversial topic. Globally, legal action against the developers of AI-powered tools continues to mount. Landmark claims in the US, Germany, the Netherlands, France and Hungary are yet to be resolved. In the UK, policymakers face pressure from creative industries to legislate for an ‘opt-in’ regime governing the use of works for AI model training. Lawmakers are therefore required to navigate a careful balance between protecting the rights of creatives and fostering innovation in a key growth industry.
For further information on incoming UK legislative changes, please see my colleague James Fox’s article on the Data (Use and Access) Bill and my article on the Artificial Intelligence (Regulation) Bill.
The Getty v Stability AI trial marks a defining juncture in copyright law for generative AI. With the first phase probing territoriality, whether an AI model itself can amount to an infringing copy, and the sufficiency of Getty’s evidence, its outcome could recalibrate the balance between AI innovation and creator protection. While this will undoubtedly have significant implications in the UK, there is every chance that it will also have global influence. Businesses leveraging AI must therefore stay alert: infrastructure changes and licensing frameworks may be forthcoming, and non-compliance with such regimes may carry significant legal and reputational risk.
A judgment favourable to Getty may include guidance to developers on the production of compliant training sets, which may lead to greater transparency as to what data has been used to train a generative AI model and how such training was undertaken. The consequences may also extend to corporate users, who may be required to audit their AI usage and take steps to mitigate infringement risks. While this will inevitably provide greater protection and negotiating power to creative communities, such regulation would naturally make the development of generative AI models more expensive – and may lead to developers pursuing opportunities in less regulated jurisdictions.
The dismissal of the claim would also have significant ramifications. At its most extreme, this may be perceived as an endorsement of the advancement of AI-powered technologies at any cost. It is more likely, however, that the Court will take a measured approach which balances the rights and needs of two important industries. The UK is a global leader in the development of emerging AI technologies, but such innovation cannot come at any cost.
Regardless of the outcome, the judgment is likely to have a significant impact on how UK copyright law adapts to evolving AI technologies – and will be closely watched by stakeholders across both the creative and technology sectors.
[1] Additional claims brought by the Sixth Claimant are due to be considered at a later date.
To discuss how Eugenia and our Commercial experts can help future-proof your business by protecting your intellectual property, get in touch here.