The creative industries stand at the heart of the UK’s economic future, contributing an impressive £125 billion to the economy and providing employment for over 2.3 million people. Equally transformative is artificial intelligence - both as an enabler of other industries, including the creative industries, and as a sector in its own right.

The rapid development and adoption of generative AI tools has significant economic implications for the creative industries, with the potential for such tools to affect revenue streams, job opportunities and the overall market dynamics within the creative sector.

“Both the creative industries and AI sectors are at the heart of our industrial strategy, and they are also increasingly interlinked. AI is already being used across the creative industries, from music and film production to publishing, architecture and design; it has transformed post-production, for instance. As of September 2024, more than 38% of creative industries businesses said that they have used AI technologies, with nearly 50% using AI to improve their business operations.” – Chris Bryant, The Minister for Creative Industries, Arts and Tourism 

How does copyright law affect AI training?

A particularly contentious issue is whether the developers of generative AI tools should be free to use vast datasets of copyright-protected material when training their models. Copyright owners find themselves at the forefront of a legal battleground that may reshape the intellectual property law landscape. In December 2024, the UK government launched a consultation to examine the tension between copyright law and generative AI. The consultation explores how existing copyright frameworks can adapt to the rapid advancements in AI technology, ensuring that the creative industries remain protected while fostering innovation.

What does the consultation say about AI training?

A particular focus of the consultation is the disputed application of UK copyright law to the training of AI models. The Government has signalled that it intends to change the law to permit AI developers to use online copyright-protected content to train their models, except where a copyright owner has ‘opted out’ of having their material used through an agreed mechanism. The kind of computational analysis of content used by AI developers to train their models is called ‘text and data mining’ (TDM). Under existing UK copyright law, TDM on third-party copyright-protected content is permitted only where licensed or when undertaken for a non-commercial research purpose. The proposed new approach, permitting commercial TDM, would more closely align UK law with the position in the EU, where AI developers are permitted to engage in commercial TDM on content that has been made publicly available online, unless rightsholders have ‘opted out’ through express machine-readable rights reservations.

The ‘opt out’ approach adopted in the EU is controversial, and there remains significant uncertainty about which kinds of ‘opt out’ rights reservations by copyright owners will be effective. There is broad consensus that AI developers should respect the ‘robots.txt’ instructions on websites; however, copyright owners also want AI developers to go further and comply with other types of opt outs, such as file-level metadata and website terms and conditions. Verifying that AI developers have in fact complied with rightsholder opt outs also presents significant challenges.
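By way of illustration, a ‘robots.txt’ opt out works by listing crawler user agents and the paths they may not fetch. The sketch below is a minimal example, not a statement of what any particular developer does: it uses ‘GPTBot’ (the user agent OpenAI has published for its crawler) as the AI crawler being opted out, an invented ‘ExampleSearchBot’ as an ordinary crawler, and Python’s standard-library robots.txt parser to check what each may access.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that opts the whole site out of crawling
# by an AI training crawler while leaving it open to other crawlers.
robots_txt = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The AI training crawler is blocked from the whole site...
print(parser.can_fetch("GPTBot", "https://example.com/articles/1"))           # False

# ...while other crawlers remain free to index it.
print(parser.can_fetch("ExampleSearchBot", "https://example.com/articles/1"))  # True
```

As the passage above notes, this mechanism only expresses the rightsholder’s reservation; it cannot by itself verify that a crawler honoured it, which is why transparency obligations are being debated.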

There is ongoing debate in the EU over the level of transparency required from model developers when providing a summary of the content used for training. In its consultation, the UK Government has proposed introducing specific transparency obligations aimed at increasing trust and accountability in AI training practices. These requirements focus on ensuring that rightsholders can verify whether their ‘opt outs’ have been respected and whether their works have been used for TDM.

What are the competing views on the proposal?

Creative industry representatives have been very critical of the government’s proposals. The British Phonographic Industry, the trade association for the recorded music industry, has warned that the proposal would “severely undermine the UK music industry”, with musicians such as Sir Paul McCartney cautioning that “I think AI is great, and it can do lots of great things… But it shouldn't rip creative people off”. The recently launched Creative Rights in AI Coalition, a broad group of organisations from across the creative industries, has also emphasised that AI companies should pay for the high-quality copyright-protected works which are essential to train and ground accurate generative AI models.

“Retaining the UK’s gold standard copyright protections - and ensuring the law is enforceable and respected in the face of the challenges posed by generative AI - will create incentives for generative AI developers to enter into licence agreements with rights holders, ensuring a steady flow of quality, human-authored works for GAI training. Without proper control and remuneration for creators, investment in high-quality content will fall. GAI innovation will inevitably stall, and value will drain from both the tech and creative industries which contribute so much to the UK economy and quality of life” – Creative Rights in AI Coalition

YouGov polling suggests that 72% of the public believe that AI companies should be required to pay royalties to the creators of text, audio, or video that they use to train AI models. 

In contrast, technology companies are in favour of the proposals. techUK, a trade association for UK technology companies, argues that “current uncertainty over AI and copyright risks holding back both the development and use of AI technology”, and Google has argued that “To ensure the UK can be a competitive place to develop and train AI models in the future, they should enable TDM for both commercial and research purposes”.

What other issues are being considered in the consultation?

The UK government consultation also sought views on whether the outputs of AI tools should be protected under copyright law. Existing UK copyright legislation purports to protect works generated by a computer where there is no human author; however, there is uncertainty as to whether such protection would apply in practice, or whether copyright protection requires human originality. Most countries, including the US and most EU member states, do not provide copyright protection to computer-generated works without a human author. The government has indicated that, if the consultation does not reveal sufficient evidence of the need to protect AI outputs, it intends to remove the provision.

Other issues addressed by the consultation include infringement and liability relating to AI-generated content, whether AI outputs should be required to be labelled as AI-generated, and whether there should be additional protection for actors and singers against the creation of ‘digital replicas’ or deepfakes of their voice or appearance. The Government also raises questions about the increasing use of synthetic data to train AI systems and whether the application of copyright law to these use cases is sufficiently clear.

What’s next?

The broad scope of the consultation reflects the Government's ambition to modernise copyright law and ensure it keeps pace with evolving AI technologies and practices. The consultation closed on 25 February 2025, and the Government is now considering the responses. Drawing on input from a diverse range of stakeholders, the government aims to develop a balanced and forward-looking copyright framework that supports both innovation and the protection of creators' rights.
