Infringement by training the AI system
The first infringement issue presented by AI relates to the significant volume of data which must be 'scraped' from the internet to train an AI system.
Copyright infringement requires the use of the whole or a substantial part of a copyright work without the authorisation of the copyright owner.
In the context of AI, the recent case of Getty Images (US) v Stability AI Ltd [2023] EWHC 3090 (Ch) involved an allegation that the data used to train Stability AI's system included Getty Images' copyright works. The judgment itself addressed Stability AI's application for reverse summary judgment and strike out of Getty's claim, which the Court refused, allowing the claim to proceed to trial.
Although the case will not be heard at trial until later this year, it already demonstrates the potential copyright infringement liability facing AI developers, since users with sufficiently precise prompts could produce an image which included the Claimant's logo.
The Government has underscored the importance of ensuring AI systems adhere to ethical principles, including transparency in training data. The AI Opportunities Action Plan highlights that a trustworthy AI ecosystem depends on documenting and ethically sourcing data used in model development.
Infringement by the user
The current position on copyright infringement by individuals and companies prompting AI systems ('users') is equally unclear.
Given that copyright infringement by users remains untested in the courts, significant caution should be exercised by users who are aware of a particular artist's work and enter prompts into an AI system with the aim of creating similar work in that artist's style.
As with the issue of authorship, and given the legal uncertainty at this stage, documenting all prompts entered by a user could prove vital in copyright infringement proceedings.
This aligns with the UK Government's ongoing plans to support responsible AI use.
As reported by the BBC, fostering public trust in AI will require ethical practices to be promoted and enforced through transparent regulation and practical guidance for individuals and organisations.