Getty Images vs. Stability AI: implications for UK copyright and licensing law

June 27, 2024

Stability AI argues that the output images are pastiche because they are artworks "rendered in a style that may mimic the style of another work, artist, or period, or composed of a set of materials that mimic many elements from a large number of diverse sources of training material."

It also argues that "the act of creating such an image has purposes that include pastiche" and that the creation of such images by users "is, or at least is likely to be, fair dealing."

Stability AI claims fair dealing because "the scope of any taking of the copyrighted work is no greater than is necessary to create the pastiche at issue and likely much less than the entire work; the nature of the dealing is such that it is highly unlikely, rare, and the result of stochastic processes; the pastiche is not a substitute for the original copyrighted work; and the pastiche does not interfere with the market for the original copyrighted work."

There is little guidance on what falls within the pastiche exception, not only in the UK but also in the EU, where the exception became mandatory for Member States to apply to user-generated content on online content-sharing services as part of the 2019 EU copyright reforms. The High Court's consideration of Stability AI's arguments in favor of the pastiche exception may therefore provide useful guidance on the scope of the exception in general and its application, if any, to the outputs of generative artificial intelligence systems in particular.

An analogy can be drawn between the arguments that Stability AI makes in relation to the pastiche exception to copyright in the UK, and the extent to which the "fair use" limitation on copyright infringement in the US can be invoked if the outputs of generative AI systems "mimic" copyrighted content input into those systems.

The issue may be addressed by US courts in a case brought by the New York Times (NYT) against OpenAI and Microsoft over their generative AI systems. The NYT accused OpenAI and Microsoft of seeking to "freeload" on its "huge investments in journalism" by using published content to "create substitute products without authorization or payment." It said the outputs of OpenAI and Microsoft's artificial intelligence systems "compete with and largely mimic the raw data used to train them," which it claimed included copies of the NYT's work, and that this did not constitute a "fair use" of those works under US copyright law.

The importance of the place where the training and development activities took place

The potentially broader implications of this case relate to the part of Stability AI's defense dealing with the claims about the training and development of Stable Diffusion.

Stability AI's primary defense is that it did not engage in some of the acts complained of in connection with the early development of the image-generation models that were the precursors to Stable Diffusion, and that where it provided processing and hosting services to support the research and development of such models, those services were provided using hardware and computing resources located outside the UK. In addition, the company asserts that its UK-based employees were not involved in the development.

The crux of Stability AI's arguments is that the acts complained of took place outside the scope of UK copyright law.

"None of the individuals involved in the development and training of Stable Diffusion ... did not reside or work in the UK at any material time during its development and training," Stability AI states. "The development work involved in designing and coding the software for Stable Diffusion and creating the codebase for its training was conducted ... outside the UK. The training of each iteration of Stable Diffusion was conducted ... outside of the UK. No visual assets or associated signatures were downloaded or stored (on servers or local devices) in the UK during this process."

The "location issue" was raised by High Court Judge Joanna Smith during Stability AI's unsuccessful application for exclusion last year.

At the time, the judge said that, on the basis of the limited evidence she had assessed on the issue, "it is reasonable to conclude that, on the balance of probabilities, there was no development or training of Stable Diffusion in the United Kingdom." However, she said she was not sufficiently convinced of this to decide the point without giving Getty the opportunity to rebut the evidence at trial. In particular, she noted that some evidence pointed away from what Stability AI had asserted, and that there were reasonable grounds to believe that disclosure in the case could add to or alter the evidence of where Stable Diffusion's training and development had taken place.

"The question of location is certainly not one on which I can say that [Getty's] suit is doomed to fail," said Mrs. Justice Joanna Smith.

If the High Court agrees with Stability AI's position, Getty's claims relating to training and development will fail because the acts complained of fall outside the territorial reach of the Copyright, Designs and Patents Act 1988 - even if Getty succeeds in convincing the court that Stability AI copied or reproduced its works without authorization during the training and development phase.

It is possible that a court would find such an outcome unpalatable.

There are examples of British courts holding that they must enforce existing "bad" law and that it falls to Parliament to reform it. Yet the House of Lords - in its former capacity as Britain's highest court before the creation of the Supreme Court - famously intervened to limit the scope of long-standing copyright law as it applied to design drawings in a 1986 case involving car manufacturer British Leyland Motor Corporation.

In that case, the House of Lords overturned the decisions of the lower courts, which had effectively given British Leyland the ability to control the market for aftermarket repairs to the exhaust pipes of its cars on the basis of the copyright in the drawings for that component. It held that copyright law, as applied by the lower courts, went too far. The ruling ultimately shaped how the 1988 Act treats copyright infringement in the context of artistic copyright in design drawings.

Potential escalation of tensions between AI developers and content creators

Regardless of what the court itself says, if the decision upholds Stability AI's arguments on the location issue, content creators are likely to see it as exposing a loophole in the law: one that allows AI developers to train AI models on their copyrighted works without permission, yet avoid liability under UK copyright law simply by using technological infrastructure located in other jurisdictions to host and process - that is, to copy and reproduce - those works.

In that event, content creators are likely to lobby hard for the law to be updated.

The UK government is already at the center of a major lobbying battle between content creators and AI developers. This reflects its wider aims to help the creative industries grow, as set out in its 2030 vision for the creative industries, while at the same time promoting AI innovation across the economy, including through the implementation of a national AI strategy. To the extent that these initiatives touch on copyright issues, there are natural tensions that need to be addressed. This has been evident in the copyright-related initiatives that the government has been pursuing recently.
