All traditional production workflows will be disrupted by generative AI, asserted Marco Tempest, a Directors Fellow at the MIT Media Lab, arguing that this transformative tech could be used well beyond engineering circles. “I think it’s the first time ever that we’ve been confronted with a technological wave that doesn’t require technologists to take advantage of.”
He made these remarks during a virtual keynote at the annual International Broadcasting Convention, which concluded on Monday in Amsterdam. To no one’s surprise, AI was the biggest topic at the event, and messaging about its potential was often optimistic but at times dicey.
Familiar Hollywood tech developers such as Avid, Adobe and Blackmagic Design came to the conference and exhibition to highlight development of AI-driven tools for creatives, but also to allay worries that the technology will take away jobs. In the words of Avid president and CEO Jeff Rosica, the goal is “fostering creativity through the power of AI.”
“AI can learn behaviors. If it consumes something, it can ultimately be taught to help,” he said, addressing what it could mean, for instance, for Avid’s core base of editors. “It will help automate processes and get rid of mundane tasks. So the editor can spend more time on creative decisions.”
Also at IBC, there were whispers about AI in the context of the ongoing WGA and SAG-AFTRA strikes. Hollywood studio representatives generally steered clear of commenting on the topic during panels, including one hosted by MovieLabs, a non-profit research & development joint venture founded by the major Hollywood studios. As the execs discussed the MovieLabs “2030 Vision” for Hollywood, they instead emphasized a call for interoperability among the different technologies used by filmmakers. MovieLabs CEO Rich Berger related that if all of the tools talk to one another, filmmakers can be “more efficient and give more time back to the creative process. … Ultimately creative teams should have a choice so they are not locked in [to a given tool or workflow].”
During the event, plenty of eyes were on Avid, not only because it is the maker of widely used tools such as the Media Composer editing system but because last month it announced a deal to be acquired by an affiliate of private equity firm STG for a whopping $1.4 billion.
At IBC, Avid introduced Ada, the company’s overall brand for its AI-driven technology. Avid already offers ScriptSync AI and Media Composer PhraseFind AI, both of which employ Avid’s AI speech-to-text tech.
Avid also developed an AI-driven search recommendation engine for its Media Central platform. “The engine will look at the written word and start recommendations for content (including video, audio or words),” Rosica explained, suggesting that this might be useful for editors, script supervisors or others, primarily for areas such as documentary, reality programming or news. The company continues to work on AI-driven tools for production and post, some of which are patented and some patent pending.
“We see AI as a co-pilot, to help a creative person do the work they are doing,” Rosica told The Hollywood Reporter of the big picture strategy, adding that in time “it could learn the way an editor visually structures a story.”
The Avid exec suggested that at that point “it could do a rough cut [for sports highlights]; it could suggest edits — where it’s formulaic. Learning a pattern and applying it to something new.” He clarified that he believes this would not be used to cut, for instance, a narrative feature. “It’s not going to edit it for you; it will save time.”
Meanwhile, developer Blackmagic already includes a neural engine in its Resolve color grading and postproduction software, enabling AI-driven features for editors, assistants and others, such as audio-to-text transcription and face analysis. New in version 18.6, which was released at IBC, is a VFX tool in its Fusion software for extruding 3D shapes from 2D images.
Echoing Avid, Blackmagic’s director of North American sales operations Bob Caniglia said of his company’s approach to this tech, “We have been using AI in Resolve for a few years now and intend to add additional tools in the future. We’re using it to help with tasks, but not subvert the artistic abilities of the artists.”
Adobe also continues to advance its AI-driven capabilities and at IBC it highlighted its Firefly generative AI tool that’s now available in Photoshop and Illustrator. Speaking on a panel, Adobe’s pro editorial marketing manager Kylee Pena urged the industry to “make ethics part of the conversation.”
Adobe is among the companies in the Coalition for Content Provenance and Authenticity, which aims to develop standards for tech that would identify what was altered by AI, as well as create a way to mark potential source material with “do not train” credentials.
At IBC, these discussions also came from young companies such as startup Strada, which recently raised $1.9 million in pre-seed funding to build an AI-enabled cloud platform with tools for production and postproduction. Talking with THR, CEO Michael Cioni suggested that knowing the source of AI content could allow for a paid licensing model and other protections. “We are going to be asking the companies that make these [AI] models to disclose where they trained,” he told THR, meaning “you can search by ethically sourced or non … so that will filter out tools or models that don’t have the right copyright protection.”
Avid’s Rosica believes guard rails are needed in various areas. “As a creative industry we have to protect the role of creative people,” Rosica said. “AI is not a replacement for creativity. AI must be brought to bear to help creativity.”
Cioni said, “We need guard rails at some level, but I don’t think we need guard rails at all levels,” adding that “disposable” TikTok content may not need the same rules as Hollywood in terms of, for instance, how AI content is sourced.
But Cioni is ultimately a tech optimist, seeing potential upsides for actors who have worries about AI. “If an actor chooses to license themself to a movie, they can actually make their schedules more flexible,” he suggested. “They can do scenes [where] they have to be in person and they can do scenes where they don’t.”
Concluded Cioni: “I think it’s totally achievable that AI can live completely in harmony with content creators, and visionaries and storytellers, because I don’t think AI is here to actually tell the stories for us … telling a story like Eternal Sunshine of the Spotless Mind, AI isn’t going to do that.”