NAB 2018: Machine-Learning Tools to Become Vital for Editing

USC School of Cinematic Arts professor and editor Norman Hollyn spoke at a conference session on machine learning about the ML tools available to editors today, and those imminent, for editing film/TV content. Underscoring the growing importance of ML-powered tools for editors, Hollyn pointed out that editors who resisted the advent of digital nonlinear editing in the 1990s exited the industry. “AI is bringing things into the post-production world and if we don’t start to look at and embrace them, we’ll be ex-editors,” he said.

Acknowledging that artificial intelligence has been defined in numerous ways, Hollyn offered his own version. “In plain English, it’s programming that learns to be better,” he said. Several tools already on the market rely on machine learning. “We see this today in transcription software, that’s become much cheaper than ever,” he said, pointing to Digital Anarchy and Digital Heaven, as well as Avid ScriptSync.


“Another thing AI has proven to be good at is image recognition,” he continued. “It recognizes color, objects, groups, and can begin to intuit images, based on what I throw at it. It pays attention to my needs.” Even so, he said, AI is “better on nouns than verbs,” noting the need for better AI tools for motion.

He described a joint project between Adobe and Stanford University around deep learning that “analyzes the audio, compares it to known texts and finds and links those lines in every single video sample given to it.”
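The core idea of linking known script lines to transcribed takes can be sketched with simple fuzzy matching. This is an illustrative toy using Python's standard library, not the Adobe/Stanford project's actual deep-learning method; the script lines and threshold are invented for the example.

```python
import difflib

# Invented sample script lines for illustration.
SCRIPT = [
    "I never meant to hurt you.",
    "Then why did you leave?",
    "Because staying would have been worse.",
]

def link_line(transcribed, script=SCRIPT, cutoff=0.6):
    """Return the script line that best matches a (noisy) transcript line,
    or None if nothing clears the similarity cutoff."""
    lowered = {line.lower(): line for line in script}
    matches = difflib.get_close_matches(
        transcribed.lower(), list(lowered), n=1, cutoff=cutoff
    )
    return lowered[matches[0]] if matches else None

# A noisy transcription still links back to the right script line:
print(link_line("then why'd you leave"))  # → "Then why did you leave?"
```

A production system would do this at scale across every take, which is what would let an editor pull up all deliveries of a given line.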

“This is so much of the metadata that I need,” said Hollyn. The missing piece, he added, is emotion, dubbed sentiment analysis in AI circles; he pointed to a technology based on Microsoft Azure. “Sentiment analysis can make editing easier,” he said. “Imagine if rather than having a timeline that showed objects, I got to look at the emotional arc of a scene. That would be amazing to me.”
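The "emotional arc" timeline Hollyn imagines could be prototyped by scoring each clip's dialogue and laying the scores out in timeline order. The sketch below uses a tiny hand-made sentiment lexicon purely for illustration; a real tool would call a trained model or a cloud sentiment API, and the clip names and dialogue here are invented.

```python
# Toy sentiment lexicon (invented): +1 for positive words, -1 for negative.
LEXICON = {"love": 1, "happy": 1, "great": 1, "hate": -1, "afraid": -1, "lost": -1}

def clip_sentiment(dialogue):
    """Sum lexicon scores over the words in a clip's dialogue."""
    return sum(LEXICON.get(w.strip(".,!?'\"").lower(), 0) for w in dialogue.split())

# A scene's clips in timeline order (invented examples).
timeline = [
    ("clip_01", "I love this place, I'm so happy here."),
    ("clip_02", "I'm afraid we're lost."),
    ("clip_03", "It'll be great, I promise."),
]

# The "emotional arc": one sentiment score per clip, in order.
arc = [(name, clip_sentiment(text)) for name, text in timeline]
print(arc)  # → [('clip_01', 2), ('clip_02', -2), ('clip_03', 1)]
```

Rendered as a curve over the timeline instead of a list, this is the scene-level emotional view Hollyn describes wanting alongside the usual object-based metadata.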

He also mentioned Affectiva, software that measures how audiences feel while they are watching something.

“What I’m talking about is how we can use emotion detection and sentiment analysis to bring two things together,” he said. “As a filmmaker, I know what I want the audience to feel. The other side, for the audience, is, are you getting it? For me, this is just one of the great advantages of artificial intelligence.”

Hollyn brought Adobe principal solutions engineer Todd Burke to the stage to show off that company’s Sensei tool. “Sensei is a combination of computational creativity, experience intelligence and content understanding,” said Burke, who noted that some of its ML-powered capabilities are in other Adobe products, including Face-Aware Liquify and Lightroom auto-tagging in Creative Cloud, and Morph Cut and Face Tracking in Premiere Pro.

Related:
Artificial Intelligence “Is Not Going to Kill Your Job,” Says ‘Heathers’ Editor, The Hollywood Reporter, 4/10/18