In “The Human Race,” a short produced by visual effects house The Mill for Chevrolet, the automaker’s new 2017 Camaro ZL1 races a futuristic Chevy concept car; one is driven by a racecar driver, the other by artificial intelligence. The short premiered at the Game Developers Conference to showcase how it was created with real-time rendering, using the Unreal game engine. Kim Libreri, CTO of Unreal maker Epic Games, demonstrated how aspects of the movie could be changed in real time, while it was playing.
Variety reports that Chevrolet marketing director Sam Russell switched the car’s color in real time and then swapped in an entirely different model, again on the fly.
“That was not a regular movie,” said Libreri. “That was a real-time generated movie.”
The Unreal game engine made it work, combined with Cyclops, a virtual production toolkit The Mill developed for use in commercial shoots. For shoots where the automobile is so top secret that it can’t be seen on the street, The Mill has used the Blackbird, a stand-in car it created that carries cameras and sensors to record all the driving information and “an electric engine that can be programmed to emulate the torque of any of the cars it is supposed to represent.” The Mill uses the data gathered by the Blackbird to digitally replace it with the car being advertised.
“The Human Race” was the first time the Blackbird was used for real-time rendering, says Variety, and The Mill’s global director of emerging technology Boo Wong says the company plans to use it again “as soon as possible on as many projects as possible.”
Epic will release a version of Unreal that supports real-time rendering later this year, but, says Wong, filmmakers will have to adapt to the fact that real-time “allows for much less cleanup after the fact.” “Our artists have to be incredibly disciplined,” she said. “There is a bit of a mind shift.”
At first, studios may use real-time effects for on-set visualizations, “while still relying on traditional rendering after the fact for the finished project.” In this way, “any images generated during a shoot could be reused for other projects, including AR or VR assets,” said Epic general manager Marc Petit. “We don’t like to throw any pixels away.”
The potential to use this technology for truly interactive digital media could be “one of the biggest upsides of real-time effects.”