Liam Budd, Mathilde Pavis

Source: Parliament.tv

Liam Budd, Mathilde Pavis

The UK must learn from the successes and failings of the US and the European Union (EU) in their approaches to AI if it is to position itself as a global leader in the technology while protecting its creative industries.

That was the verdict of a panel of experts speaking to the UK parliament’s culture, media and sport (CMS) committee inquiry into British film and high-end TV, which is back for a second iteration after a hiatus induced by the UK general election.

Benjamin Field, founder and executive producer at Deep Fusion Films, a production company with a focus on using AI technology, said he believes there is huge export potential for UK film and TV shows that have used AI ethically – with software trained on legal and licensed data.

In the wake of last year’s US actors’ and writers’ strikes, he felt the studios had been left too nervous to embrace AI technologies directly.

“The reaction to the SAG-AFTRA strikes and the writers’ strikes means that any time a studio suggests using AI there is immediate backlash, whereas if it’s made in the UK, the guidelines we have set up through Pact and [performers’ union] Equity are more aligned with how to make everything work,” he said.

“We need to legislate to ensure that the practices we can employ are legal and ethical,” Field later continued. “That is our point of differentiation [with the US and Europe]. [If] we have laws that set out exactly what is legal, therefore we can export knowing it is legal and safe and responsible, and enhances our workforce. That is what the industry is crying out for.” 

Patchwork of legislation

Currently, the UK has what lawyer Mathilde Pavis termed a “patchwork of legislation”: laws that do not speak to each other well and are not robust enough. Scraping a person’s vocal likeness, for instance, is not legislated against, nor is there protection against unauthorised digital imitations of people.

The law on digital imitation is outdated, harking back to 1988 in the UK. “We thought the most a performer could face is a tribute act, or soundalike or lookalike, not a big economic or moral threat,” said Pavis. “Now, technology has changed that. When you can be imitated on scale, those are different economic and moral threats.

“We would expect the UK to have a system where your digital self and your physical self are equally protected, especially now that our digital lives are such a big part of our personal and professional work. Performers are the canary in the coalmine on that particular point.”

Liam Budd, recorded media industrial officer at UK performers’ union Equity, noted the test case of Guernsey, a self-governing British crown dependency, where it is possible for an individual to register their own image as a property right, capable of protection under legislation.

Pavis believes it is paramount to end ‘buy-out’ clauses in contracts, through which performers and creatives can relinquish rights to their digital selves or intellectual property without even realising it.

The experts also panned the European Union’s reliance on ‘opt-out’ clauses, under which copyright owners must proactively opt out of their work being used to train AI models. “They quite literally do not work… Opt-out is an illusion for rights holders,” said Ed Newton-Rex, CEO of Fairly Trained, a non-profit that certifies generative AI companies that train on ethically sourced data.

In the absence of a rigorous legal framework, Field said he is in “very early discussions” with Bafta about implementing a certification scheme, similar to the Bafta albert sustainable production certification, which is now mandatory for all new commissions and recommissions of broadcast TV content.

Field noted that the albert calculator was, for a few years, a voluntary measure for productions: “The worry with AI is the damage that could be done by setting a precedent, if we roll it out in the same scale. It means we need to legislate fast.” 

Without legal protections, industrial strikes akin to those in the US in 2023 cannot be ruled out. “Creators are organising. There is a large and growing backlash to the widescale intellectual property theft that’s happening in the generative AI industry,” said Newton-Rex.

“The vast majority of performers are very pessimistic about AI at the moment, given all the intellectual property theft that is going on,” added Budd.

Audience awareness 

Benjamin Field, Nick Lynes

Source: Parliament.tv

Whether productions should be legally required to disclose their use of AI to audiences is not clear-cut.

“I lean towards yes, but I think the industry will work that out for itself,” said Nick Lynes, co-founder and CEO of Flawless, a London and LA-based AI visual effects and post-production company. “People who are doing things ethically, I think those brands will start to represent trust marks, and people will want to disclose it.”

However, he added that AI has been used in production “for decades” and it is important audiences understand this “isn’t entirely new”.

Field anticipates AI disclosures will follow a similar path to the ‘dramatic reconstruction’ label once added to scenes re-enacting crimes on factual and news shows such as the BBC’s Crimewatch. Those labels are now seen as outdated, as audiences have developed an implicit awareness that they are not watching the actual footage.

Tax incentives

Lynes feels the UK’s film and high-end TV tax incentives will be crucial to giving the country an advantage with AI.

“We’re going to be looking at a serious augmentation of the filmmaking process over the next few years, it’s going to happen very quickly,” noted Lynes. “We’re going to start seeing parallelisation of process. It’s been a linear process traditionally. We’re going to see a lot of tools coming in. The confusion between what is production and what is post-production – it’s already a completely blurred line.

“The opportunity is for us to start to understand what that new filmmaking process is, and be able to start to capture more elements of this new way of making film within that [tax] rebate. I think that’s a great opportunity, that will draw more filmmakers to the UK.”

He says the visual effects rebate confirmed in the autumn budget (UK VFX costs on film and high-end TV productions will receive a 5% increase in tax relief, for an overall net rate of 29.25%, and the relief covers use of generative AI) is “built around linear filmmaking”. He would like to see additional incentives rolled in, such as for visual dubbing, in which an actor’s lip movements are regenerated to synchronise with audio in another language, offering a potentially cheaper way to dub films and TV shows into multiple languages.

“If you can get that wrapped into your UK rebate for example, it means you’re going to be drawing a load more film productions over here. You get the whole production, just because you’ve bolted on a couple of extra things.”

Lynes doesn’t feel that the correct use of AI will shrink the industry. “[It’s] bringing costs down, which will increase the amount of production overall.”