October 29, 2018
At the Unite 2018 developers conference last week, Unity Technologies’ head of cinematics, Adam Myhill, unveiled CineCast, a synthetic co-director for filming video games with implications for narrative storytelling and sports broadcasts of all kinds. Myhill, with the help of four players and a stand-in director, professional gamer Stephanie Harvey, demoed the CineCast mode for “GTFO,” a first-person shooter and the first property to use CineCast. Under Harvey’s watchful eye, CineCast automatically chose, in real time, the highest-quality shot to move the action forward, with Harvey making only a few on-the-fly adjustments.
CineCast works with Unity’s real-time engine alongside Cinemachine, which functions like a synthetic camera operator generating shots, and story manager, a contextual analysis tool co-developed with the French research institute Inria.
Taken together, these technologies solve a host of challenges: Which character should I follow? What is the most interesting vantage point? And what’s the ideal shot given the situation? Manual override is always possible, and operators can select alternative subjects, different lenses and more.
As Myhill described it, “CineCast allowed her (Harvey) to direct this story all by herself. It’s like having a synthetic production crew inside the game working to get the best shots… The magical part is that we delay CineCast by three seconds, but don’t delay Cinemachine, so the cameras can see into the future and we can cut and show things just before they happen.”
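Myhill’s description suggests a simple mechanism: the camera logic runs on live gameplay while the aired feed lags a few seconds behind, so a cut triggered by an event can be applied retroactively to frames still sitting in the delay buffer and appear on screen just before the event does. Here is a minimal sketch of that idea; the `LookAheadDirector` class, the frame rate, and the one-second lead time are illustrative assumptions, not Unity’s implementation:

```python
from collections import deque

FPS = 30              # assumed frame rate
DELAY = 3 * FPS       # the 3-second broadcast delay Myhill describes
LEAD = 1 * FPS        # assumed lead time: cut this long before the event airs

class LookAheadDirector:
    """Illustrative sketch, not Unity's implementation: camera logic
    sees live frames, but the aired feed lags by DELAY frames, so a
    cut can be scheduled on frames that have not yet been broadcast."""

    def __init__(self):
        self.pending = deque()  # frames waiting to air

    def push(self, frame, event_camera=None):
        """Ingest one live frame; if an event was detected on it,
        retroactively schedule a cut to event_camera LEAD frames
        before the event reaches the aired feed."""
        self.pending.append({"frame": frame, "cut_to": None})
        if event_camera is not None:
            # The newest buffered frame holds the event; walk back
            # LEAD frames in the un-aired buffer and mark the cut there.
            idx = max(0, len(self.pending) - 1 - LEAD)
            self.pending[idx]["cut_to"] = event_camera
        if len(self.pending) > DELAY:
            return self.pending.popleft()  # this frame airs now
        return None  # still filling the delay window
```

Because the output is delayed, a cut decided the instant something interesting happens in the live game can still land a beat before that moment reaches viewers, which is what makes the cameras appear to “see into the future.”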
In a private interview the day after his presentation, Myhill explained that CineCast and story manager, with their ability to dynamically deliver the right shot and respond to context, are serving as an entrée into exploring the application of machine intelligence to storytelling.
“This is a very rich machine learning space,” said Myhill. “One of the things we are interested in is the idea of correlating human emotional response to content … and then being able to use the data to direct the narrative or to change content … to derive a desired emotional outcome in the viewer.”
For Myhill, the goal is to procedurally generate the most emotionally resonant narrative possible for the user based on analytics from similar demographics. The ability to produce content that evokes a specific feeling in people is not something Myhill takes lightly. “I don’t know if this is super-cool, the end of humanity, or the greatest holodeck. Probably all of the above,” he said ruefully.
According to Myhill, the technology behind CineCast and story manager will also have implications beyond video games, notably for sports broadcasting. Myhill sees a future where stadium photography is automated and instant replays are computer generated using volumetrically captured data and virtual camera views. Based on a confidential technology he has recently been privy to, Myhill predicts that we will see multi-view, CG replays “very soon.”
For more information about CineCast and Unite 2018, check out the Unity Blog.