The BFI has outlined nine key recommendations for the UK screen sector in the face of rapidly evolving AI technology, in a report published today (June 9).

The aim of the report, titled ‘AI in the screen sector: perspectives and paths forward’, is to outline how the UK creative sectors can thrive in the age of AI and ensure the UK is a global leader in creative technology.

Recommendations include developing a robust licensing framework to address copyright concerns surrounding generative AI; future-proofing the creative workforce with more formal AI training; providing transparent disclosures to audiences when AI has been used in screen content; offering targeted financial support for the UK’s creative technology sector; and investing in accessible tools, training, and funding for independent creators, through the development of ethical AI products.

Examples of current industry adoption include the Charismatic consortium, backed by Channel 4 and Aardman Animations, which aims to create an AI prototype and publish research into how AI could support under-represented content creators and established producers in enhancing storytelling in film and television.

The BBC is piloting structured AI initiatives, and the BFI National Archive and the BBFC are experimenting with AI for subtitling, metadata generation, and content classification.

“AI has long been an established part of the screen sector’s creative toolkit, most recently seen in the post-production of the Oscar-winning The Brutalist,” said Rishi Coupland, the BFI’s director of research and innovation and co-author of the report, “and its rapid advancement is attracting multi-million investments in technology innovator applications. However, our report comes at a critical time and shows how generative AI presents an inflection point for the sector and, as a sector, we need to act quickly on a number of key strategic fronts.

“Whilst it offers significant opportunities for the screen sector such as speeding up production workflows, democratising content creation and empowering new voices, it could also erode traditional business models, displace skilled workers, and undermine public trust in screen content.”

The report is published by the BFI as part of its role within the CoSTAR Foresight Lab. CoSTAR is the UK’s first national lab for creative industries’ research and development, funded by the government-backed UK Research and Innovation’s Infrastructure Fund.

It is authored by Angus Finney, Brian Tarran and Coupland and draws on published reports and research, responses to public consultations, surveys of screen sector organisations and creative technologists, and interviews with key stakeholders.

The recommendations from the report are below.

AI in the Screen Sector - recommendations

1. Position the UK as a world-leading IP licensing market

“There is an urgent need to address copyright concerns surrounding generative AI. The current training paradigm – where AI models are developed using copyrighted material without permission – poses a direct threat to the economic foundations of the UK screen sector. A viable path forward is through licensing frameworks: 79 licensing deals for AI training were signed globally between March 2023 and February 2025; the UK’s Copyright Licensing Agency is developing a generative AI training licence to facilitate market-based solutions; and companies such as Human Native are enabling deals between rightsholders and AI developers.

“The UK is well-positioned to lead in this space, thanks to its ‘gold standard’ copyright regime, a vibrant creative technology ecosystem, and a coalition of creative organisations advocating for fair licensing practices. For this market to be effective, new standards and technologies are required, as outlined in a May 2025 CoSTAR National Lab report. By formalising IP licensing for AI training and fostering partnerships between rightsholders and AI developers, the UK can protect creative value, incentivise innovation, and establish itself as a hub for ethical and commercially viable AI-supported content production.”

2. Embed data-driven guidelines to minimise carbon impact of AI

“Generative AI models, particularly large-scale ones, demand significant computational resources, resulting in high energy consumption and associated carbon emissions. Yet the environmental footprint of AI is often obscured from end users in the creative industries. Transparency is a critical first step to addressing AI’s environmental impact. UK-based organisations such as Blue Zoo are already choosing to run AI models on infrastructure where energy sources and consumption are fully visible.

“These practices, combined with calls for regulatory frameworks akin to appliance energy labels, demonstrate a need for sustainability-focused AI guidelines. With the screen sector in the vanguard of generative AI uses globally, it is ideally positioned to push the demand for carbon minimisation, and the UK screen sector should lead by example.”

3. Responsible AI: Support cross-discipline collaboration to deliver market-preferred, ethical AI products

“Generative AI tools must align with both industry needs and public values. Many models, tools and platforms have been developed without sufficient input from the screen sector (or, indeed, screen audiences), leading to functionality and outputs that are poorly suited to production workflows or that risk cultural homogenisation and ethical oversights. (Use of large language models trained predominantly on US data may marginalise local narratives, for example.) Academics have called for ‘inclusive’ approaches to AI development, arguing that generative AI’s full potential can only be reached if creative professionals participate in its development.

“The feasibility of cross-disciplinary collaboration is demonstrated by Genario – a screenwriting tool created in France by a scriptwriter and an AI engineer. Embedding collaborative, inclusive design processes can enhance the relevance of AI tools to creative tasks, as demonstrated by Microsoft’s Muse experiment. These processes also ensure that AI models reflect ethical standards and cultural diversity. The UK should look to combine its strengths in AI and humanities research, and its reputation for merging technology and culture, to deliver responsible, ethical AI.”

4. Enable UK creative industry strategies through world-class intelligence

“The UK has over 13,000 creative technology companies and a strong foundation in both AI research and creative production. However, across the UK screen sector, organisations, teams and individuals – especially SMEs [small and medium-sized enterprises] and freelancers – lack access to structured intelligence on AI trends, risks, and opportunities. This absence of shared infrastructure for horizon scanning, knowledge exchange, and alignment limits the sector’s ability to respond cohesively to disruption.

“The BFI has proposed creating an ‘AI observatory’ and ‘tech demonstrator hub’ to address this urgent challenge, and the proposal has been endorsed by the House of Commons Culture, Media and Sport Committee as a way to centralise insights from academia, industry, and government, and provide hands-on experience of emerging tools and capabilities.”

5. Develop the sector to build skills complementary to AI

“AI automation may, in time, lower demand for certain digital content creation skills. It may also create new opportunities for roles that require human oversight, creative direction, and technical fluency in AI systems. Our research identifies a critical shortfall in AI training provision: AI education in the UK screen sector is currently more ‘informal’ than ‘formal’, and many workers – particularly freelancers – lack access to resources that would support them to develop skills complementary to AI.

“However, the UK is well-positioned to lead in AI upskilling due to its strong base of AI research institutions, a globally respected creative workforce, and a blending of technology and storytelling expertise. By helping workers transition into AI-augmented roles, the UK can future-proof its creative workforce and maintain its competitive edge in the global screen economy.”

6. Public transparency: Drive increased public understanding of AI use in screen content

“Transparency will drive audience trust in the age of generative AI. Surveys reveal that 86% of British respondents support clear disclosures when AI is used in media production, and this demand for transparency is echoed by screen sector stakeholders, who call for standards on content provenance and authenticity to counter the rise of AI-generated misinformation and ‘slop’.

“National institutions such as the BBC are already experimenting with fine-tuning AI models to reflect their editorial standards, and the BFI is deploying AI in archival work with a focus on ethical and transparent practices. These efforts demonstrate the UK’s capacity to lead in setting audience-facing standards and educating the public about generative AI’s new and developing role in content creation.”

7. Sector adaptation: Boost the UK’s strong digital content production sector to adapt and grow

“The UK boasts a unique convergence of creative excellence and technological innovation, with a track record of integrating emerging technologies into film, TV, and video game production. London is the world’s second largest hub (after Mumbai) for VFX professionals. Generative AI is already being used across the UK screen sector to drive efficiencies, stimulate creativity, and open new storytelling possibilities – from AI-assisted animation (Where the Robots Grow) and visual dubbing (Flawless) to reactive stories and dialogue (Dead Meat).

“However, surveys identify a lack of AI training and funding opportunities, while Parliamentary committees point to fragmented infrastructure and an absence of industry-wide standards that could hinder the continued growth and development of AI-supported creative innovation. Our own roundtable discussions with the sector highlighted the need for resources to better showcase the sector’s R&D work, to support collaboration and reach new investors.”

8. Investment: Unlock investment to propel the UK’s high-potential creative technology sector

“There is a compelling opportunity and a pressing need for targeted financial support for the UK’s creative technology sector. The UK is home to global creative technology leaders including Framestore and Disguise, as well as AI startups such as Synthesia and Stability. However, the House of Lords has identified a ‘technology scaleup problem’ in the UK, with limited access to growth capital, poor infrastructure, and a culture of risk aversion acting as barriers to expansion.

“A Coronation Challenge report on CreaTech points to ‘significant’ funding gaps at secondary rounds of investment (Series B+ stages) which are ‘often filled by international investors … creating risks of IP and talent migration out of the UK’. The report also found that physical infrastructure is needed, stating that: ‘Those involved in CreaTech innovation can struggle to find space to demonstrate, and sell, their work.’ Commenting on a February 2025 House of Lords Communications and Digital Committee report into the scaleup challenge, inquiry chair Baroness Stowell called for action to ‘unravel the complex spaghetti of support schemes available for scaleups’ and ‘simplify the help available and ensure it is set up to support our most innovative scaleups to grow’.”

9. Independent creation: Empower UK creatives to develop AI-supported independent creativity

“Generative AI is lowering traditional barriers to entry in the UK screen sector – enabling individuals and small teams to realise ambitious creative visions without the need for large budgets or studio backing. UK-based director Tom Paton describes how AI breaks down barriers that have ‘kept so many creators on the sidelines’, while the Charismatic consortium, backed by Channel 4 and Aardman Animations, sees the potential of AI ‘to support creators disadvantaged through lack of access to funds or the industry to compete with better funded organisations’.

“The emergence of AI-first studios such as Wonder, which secured £2.2m in pre-seed funding, further demonstrates the viability of independent, AI-supported content creation. By investing in accessible tools, training, and funding for independent creators, and developing market-preferred, ethical AI products, the UK can foster a more inclusive and dynamic creative economy where AI enhances, rather than replaces, human imagination.”