On Thursday, May 16th, over 850 filmmakers, AI startups, technologists, and media executives gathered at the Los Angeles Center Studios lot for ‘AI on the Lot,’ a one-day conference about the increasing applications of generative AI in filmmaking. Hosted by AI education non-profit Artificial Intelligence Los Angeles (AI LA), the conference featured a keynote from former Warner Bros. executive Renard Jenkins, several panels, and hands-on demos, and concluded with an hour-long screening of AI films created as part of a competition in the days leading up to the event.
Returning to LA for its second year, AI on the Lot brings together thought leaders from established AI companies like NVIDIA and Dell, pioneering founders like Edward Saatchi of The Simulation, and lauded filmmakers and showrunners, including David Slade (Black Mirror), Matt Nix (Burn Notice), and Mark Goffman (Bull). The overall focus was on the ways AI can augment filmmakers’ workflows, reducing the costs of VFX and animation and enabling smaller teams to achieve more with less.
“As both creators and studio executives seek opportunities to better understand GenAI and how it will impact the entertainment industry, AI on the Lot is a timely opportunity to meet the artists and technologists leading this latest revolution in filmmaking,” said Todd Terrazas, Executive Director and President of AI LA.
The conference kicked off with former Warner Bros. technology executive Renard Jenkins giving a keynote address on Generative AI applications in the film industry. Jenkins is now President and CEO of I2A2 Technologies and President of the Society of Motion Picture and Television Engineers. Jenkins is a gifted orator, something he says he inherited listening to his grandfather, who was a preacher.
Jenkins addressed the fears people have, particularly in Hollywood, that humans will be removed from the process and that they will lose their jobs. “Hollywood is a dream factory,” he said. “We can enjoy the output of so many people. I see AI as a way to enhance that, but some people see it as a warehouse of nightmares.” Jenkins sees AI as a tool, like Adobe Illustrator, meant to “enhance, not replace.” AI won’t save money, he said, but it will save time.
Jenkins’ perspective on the human at the center of the AI creation process was contradicted later by panelist Edward Saatchi of The Simulation, which released Showrunner a year ago. As a proof of concept, Showrunner created South Park episodes from a single prompt. Saatchi believes legitimate, original entertainment can be crowdsourced, with episodes written by AI from minimal text prompts. He envisions entire channels of AI-created content that prompts itself, delivered as a YouTube channel or a streaming service. Indeed, The Gold Key, which won best immersive experience at SXSW in March, consisted of fairy tales written and illustrated with Stable Video, and audiences were enchanted. No one else at the conference was talking in those terms.
The day closed with the screening of three AI short films from the Cinema Synthetica competition. Among them was “Love at First Bite,” a “zom-com” (zombie romantic comedy) created by AI filmmakers Nem Perez, Adriana Vecchioli, and Jagger Waters. The film used real actors (Vecchioli and Waters) and a variety of AI/tech tools, including Midjourney, Runway, Adobe Premiere Pro, LensGo, Magnific, Udio, Topaz Labs, and ChatGPT, as well as a script by Emmy Award winner Bernie Su.