This week, the Toronto International Film Festival (TIFF) is celebrating 50 years of films and programming.

Through its evolution, TIFF has become an industry staple for both artists and fans, and remains important as a “major launching pad for Hollywood.” It showcases a range of international and independent films, helping propel them to wider market success, and connects filmmakers with distributors.

TIFF also plays an important international role by programming, launching and generating wider conversations — both at the festival and beyond via fan, industry, media and social media commentary — in response to emerging trends and technologies in the film industry.

By bringing together stakeholders and curating conversations, film festivals are also powerful cultural hubs that set the tone for the norms and practices of the industry. A major theme at the moment involves questions around AI.

Future of labour in film

In our work as researchers within the Creative Labour and Critical Futures (CLCF) project at the University of Toronto, we are tracking and examining media responses to AI use across the film industry.

We’re also mapping emerging trends in policy, governance, worker organizing and creativity, as well as discourses around venture-backed technology startups and acts of refusal in response to generative AI, across the creative industries in Canada and beyond.

Film festivals are trying different ways to address AI. One festival, founded and directed by American actor Justine Bateman, promises no AI at all.

TIFF allows the use of AI-generated material in submissions but requires filmmakers to disclose how AI was used. The festival is providing a forum for AI-related conversations through a variety of panel discussions and events.

For example, on Sept. 8 at a “Visionaries” industry conference event, Andrea Scrosati of Fremantle, a major production and distribution company, spoke about Fremantle’s AI-focused Imaginae Studios.

He discussed unlocking new possibilities, noting that AI “tools will permit a new generation of talent to emerge, because they are taking away the barriers to entry.”

Yet the role of AI is a hot-button issue that all stakeholders — filmmakers, tech companies, distributors, creatives’ guilds and unions, policymakers and viewers — are struggling to negotiate.

This negotiation involves narrating and interpreting the meaning of AI in the film industry, which in turn establishes new norms and practices.

The ‘ethical’ AI narrative

A key aspect of what’s being negotiated across the culture industries is how the public, fans, media commentators and creative professionals understand responsible AI creation. This understanding intersects with legal issues around ownership, fairness issues around compensation and philosophical questions about creativity and authenticity.

A notable participant at a TIFF industry event was the company Moonvalley, a Toronto-based AI research firm.

With the company Asteria Film Co., co-founded by American actor and writer Natasha Lyonne and entrepreneur Bryn Mooser, Moonvalley built a generative AI model called Marey, trained only, as the company notes, on “licensed, high-resolution footage.”

Asteria Film identifies itself as “an artist-led generative AI film and animation studio” which, alongside Moonvalley, “has built the first of its kind clean foundational AI model.” Some media reporting amplifies this discourse about it being “clean” and “ethical.”

Read more: AI is bad for the environment, and the problem is bigger than energy consumption

Yet there are questions about private companies, including Moonvalley, and about public transparency and accountability in how AI models have been trained. A Vulture story covering a visit to Asteria’s Los Angeles studio and an interview with Mooser reports that the company ultimately declined to provide details about where and how exactly it paid for and acquired its training data, citing confidentiality.

Labour concerns

Amid conversations about the potential of AI, debates have been amplified this year by labour disputes, union strikes and changes to major award regulations.

In July 2024, 2,500 voice-acting members of the SAG-AFTRA union began what would become a year-long strike against 10 video game companies, including Electronic Arts and Activision. The strike outlined 25 disputes, but the primary concern was the industry’s use of AI to “replicate” or “replace” human performers.

This debate began alongside the announcement of an AI Darth Vader non-playable character in Fortnite. The character was trained using the voice of James Earl Jones, with approval from his estate, and allowed players to interact with Darth Vader during gameplay.

This integration has become controversial partly because the AI Darth Vader has been recorded swearing or using homophobic slurs in conversation with players.

SAG-AFTRA members reached an agreement on July 9, 2025, noting the addition of “consent and disclosure requirements for AI digital replica use” in union contracts.

Read more: When does an actor stop, and AI begin? What The Brutalist and Emilia Pérez tell us about AI in Hollywood

Following debates about AI use in Oscar-nominated films, the Academy has similarly amended its qualification requirements to account for AI use and disclosure. The Academy announced that “the use of generative AI will neither help, nor hinder, a film’s chances of nomination,” though it has stressed that voting members should consider the role of the human at the heart of the creative process.

As these controversies show, the role of AI in the film industry is far from decided. Instead, it is being continually negotiated by many stakeholders.

Festivals like TIFF not only provide a window into these debates, but also play an active role in shaping their direction.

This article is republished from The Conversation, a nonprofit, independent news organization bringing you facts and trustworthy analysis to help you make sense of our complex world. It was written by: Lauren Knight, University of Toronto; Daphne Rena Idiz, University of Toronto, and Rafael Grohmann, University of Toronto

Lauren Knight receives funding from the Creative Labour and Critical Futures (CLCF) project.

Daphne Rena Idiz receives funding from the Creative Labour and Critical Futures (CLCF) project.

Rafael Grohmann receives funding from the Creative Labour and Critical Futures (CLCF) project and a SSHRC Connection Grant (Workers Governing Digital Technologies).