Nvidia Quadro RTX Chips Offer AI and Real-Time Ray Tracing

Nvidia unveiled its new Turing architecture during a keynote at SIGGRAPH 2018, along with three new Quadro RTX workstation graphics cards aimed at professionals. Nvidia calls Turing its “greatest leap since the invention of the CUDA GPU in 2006.” The RTX chips are the company’s first to support real-time ray tracing, a rendering method that produces more realistic imagery. Also at SIGGRAPH, Porsche showed off car designs accomplished with Epic Games’ Unreal Engine and Nvidia’s RTX chips. Continue reading Nvidia Quadro RTX Chips Offer AI and Real-Time Ray Tracing
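
At its core, ray tracing fires rays from the camera into the scene and tests each against the geometry, which is the work Turing’s dedicated RT cores accelerate. A minimal sketch of the basic intersection test, in Python rather than anything resembling Nvidia’s implementation:

```python
import numpy as np

def ray_sphere_intersect(origin, direction, center, radius):
    """Return distance t along the ray to the nearest sphere hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    Tests like this are the inner loop of every ray tracer; RT hardware
    exists to run billions of them per second.
    """
    oc = origin - center
    b = 2.0 * np.dot(direction, oc)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c           # direction assumed unit-length, so a = 1
    if disc < 0:
        return None                  # ray misses the sphere entirely
    t = (-b - np.sqrt(disc)) / 2.0
    return t if t > 0 else None      # hit must lie in front of the origin

# A camera ray pointing down +z hits a sphere centered 5 units away.
t = ray_sphere_intersect(np.zeros(3), np.array([0.0, 0.0, 1.0]),
                         np.array([0.0, 0.0, 5.0]), 1.0)
print(t)  # 4.0 -- the near surface of the sphere
```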

NAB 2018: Machine Intelligence Toolsets in Video Workflows

Although using AI and machine learning tools in production may remain a lofty goal for some, such tools are already in use in some video workflows, from dailies through mastering. Kylee Peña, coordinator of production technologies at Netflix, moderated a panel discussion that described the tools available and how they’re being used in real-world applications. Google senior cloud solutions architect Adrian Graham described his company’s now open-sourced TensorFlow technology and how it’s being used by the M&E industry. Continue reading NAB 2018: Machine Intelligence Toolsets in Video Workflows
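
One use such panels point to is automatically tagging footage. As a hedged sketch (the specific pipelines discussed were not detailed, and the model choice here is ours), a pretrained TensorFlow classifier can label frames pulled from dailies:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.applications import mobilenet_v2

# Pretrained ImageNet classifier; stands in for a production-trained model.
model = mobilenet_v2.MobileNetV2(weights="imagenet")

def tag_frame(frame_rgb):
    """Return the top-3 (label, score) guesses for one video frame.

    `frame_rgb` is an HxWx3 uint8 array, e.g. one frame sampled from a clip.
    """
    img = tf.image.resize(frame_rgb, (224, 224))       # model's input size
    batch = mobilenet_v2.preprocess_input(img[tf.newaxis, ...])
    preds = model.predict(batch, verbose=0)
    return [(label, float(score))
            for (_, label, score) in
            mobilenet_v2.decode_predictions(preds, top=3)[0]]

# Toy usage on a random frame; real dailies would be decoded video frames.
frame = np.random.randint(0, 256, (720, 1280, 3), dtype=np.uint8)
print(tag_frame(frame))
```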

NAB 2018: Artificial Intelligence Tools for Animation and VFX

Tools powered by artificial intelligence and machine learning can also be used in animation and visual effects. Nvidia senior solutions architect Rick Grandy noted that the benefit of such tools is that artists don’t have to replicate their own work. Examples include deep learning used to create realistic character motion in real time via game engines and AI, as well as a phase-functioned neural network for character control, which can be trained on motion capture or animation data. Continue reading NAB 2018: Artificial Intelligence Tools for Animation and VFX
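
In a phase-functioned neural network (Holden et al., SIGGRAPH 2017), the network’s weights are themselves a function of a phase value that tracks where the character is in its gait cycle. A heavily simplified numpy sketch, blending two control weight sets linearly where the published network blends four with a cyclic Catmull-Rom spline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two control weight sets for one layer; the real PFNN stores four per
# layer and blends them with a cyclic Catmull-Rom spline over the phase.
W0, W1 = rng.standard_normal((2, 32, 64))
b0, b1 = rng.standard_normal((2, 32))

def phase_layer(x, phase):
    """One hidden layer whose weights depend on the gait phase.

    `phase` in [0, 1) wraps around the locomotion cycle; the weights are
    interpolated between control points before the usual affine map.
    """
    t = (phase * 2.0) % 2.0             # position among the 2 control points
    i = int(t)                          # lower control point index (0 or 1)
    a = t - i                           # blend factor between the two
    W = (1 - a) * (W0, W1)[i] + a * (W0, W1)[(i + 1) % 2]
    b = (1 - a) * (b0, b1)[i] + a * (b0, b1)[(i + 1) % 2]
    return np.maximum(W @ x + b, 0.0)   # ReLU activation, as in the paper

h = phase_layer(rng.standard_normal(64), phase=0.37)
print(h.shape)  # (32,)
```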

Startup Using AI to Help Create Effects for Movies, TV, Games

Palo Alto-based startup Arraiy is developing methods for automating part of the often-tedious process of producing visual effects for movies, TV shows and video games. “Filmmakers can do this stuff, but they have to do it by hand,” said CTO Gary Bradski, who has worked with tech companies such as Intel and Magic Leap. The Arraiy team, led by Bradski and CEO Ethan Rublee, “are building computer algorithms that can learn design tasks by analyzing years of work by movie effects houses,” reports The New York Times. “That includes systems that learn to ‘rotoscope’ raw camera footage, carefully separating people and objects from their backgrounds so that they can be dropped onto new backgrounds.” Continue reading Startup Using AI to Help Create Effects for Movies, TV, Games
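
The rotoscoping step the Times describes ends in a standard alpha-over composite; what systems like Arraiy’s learn is the matte itself. A minimal sketch of that final compositing step, assuming a model has already predicted the alpha matte:

```python
import numpy as np

def composite(foreground, alpha, new_background):
    """Standard alpha-over composite: out = a*fg + (1 - a)*bg.

    `foreground` and `new_background` are HxWx3 float arrays in [0, 1];
    `alpha` is the HxW matte a segmentation model would predict, i.e. the
    part these systems learn instead of an artist drawing it by hand.
    """
    a = alpha[..., None]                      # broadcast over RGB channels
    return a * foreground + (1.0 - a) * new_background

# Toy usage: a centered square matte drops the "actor" onto a gray plate.
h, w = 64, 64
fg = np.random.rand(h, w, 3)
bg = np.full((h, w, 3), 0.5)
matte = np.zeros((h, w))
matte[16:48, 16:48] = 1.0
out = composite(fg, matte, bg)
print(out.shape)  # (64, 64, 3)
```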

Adobe Experiments With Easy Object Removal Tool for Video

Adobe’s research team is working on a visual effects tool, codenamed Cloak, for easy and economical removal of rigs, power lines and other unwanted parts of an image. The tool is similar to Photoshop’s content-aware fill feature, which lets the user select and delete unwanted elements in the image while “intelligent” software fills in the missing background behind them. Doing the same thing with video, however, is more challenging, which is why Cloak is still in an experimental stage, with no release date set. Continue reading Adobe Experiments With Easy Object Removal Tool for Video
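
Adobe has not detailed how Cloak works, but the single-frame version of the task can be sketched with OpenCV’s classical inpainting; run per frame it flickers, which is exactly the temporal-consistency problem that makes the video case hard:

```python
import cv2
import numpy as np

def remove_object(frame_bgr, mask):
    """Fill the masked region from surrounding pixels (one frame only).

    `mask` is a uint8 image, 255 where the rig or power line sits.
    cv2.inpaint synthesizes a plausible background there; doing this
    independently per frame produces flicker, which is why a video tool
    like Cloak is harder to build than a still-image one.
    """
    return cv2.inpaint(frame_bgr, mask, inpaintRadius=3,
                       flags=cv2.INPAINT_TELEA)

# Toy usage: scratch a stripe across a gradient and fill it back in.
frame = cv2.cvtColor(np.tile(np.arange(256, dtype=np.uint8), (256, 1)),
                     cv2.COLOR_GRAY2BGR)
mask = np.zeros((256, 256), np.uint8)
mask[120:136, :] = 255
cleaned = remove_object(frame, mask)
```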

Sony Pictures Masters Classic Films in High Dynamic Range

At AMIA’s The Reel Thing conference in Hollywood, Sony Pictures Entertainment senior vice president of technology for production and post production Bill Baggelaar presented a session on HDR video mastering for classic cinema. He first hoped to dispel myths about high dynamic range. “I’ve heard that you need sunglasses to watch HDR, that filmmakers will hate it and that it will be too hard to deliver,” he said. “People also worry that there are too many formats, with HDR10, Dolby Vision, HDR10+ and HLG.” Continue reading Sony Pictures Masters Classic Films in High Dynamic Range
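
Of the formats Baggelaar names, HDR10, HDR10+ and Dolby Vision all build on the SMPTE ST 2084 “PQ” transfer function, which maps absolute luminance up to 10,000 nits into a normalized signal, while HLG takes a relative, broadcast-oriented approach. The standard’s encoding curve (the inverse EOTF) is compact enough to write out:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance -> code value in [0, 1].
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Encode luminance in cd/m^2 (nits) as a normalized PQ signal."""
    y = max(nits, 0.0) / 10000.0          # normalize to the 10,000-nit peak
    y_m1 = y ** M1
    return ((C1 + C2 * y_m1) / (1 + C3 * y_m1)) ** M2

print(round(pq_encode(100), 3))    # SDR reference white lands near 0.508
print(round(pq_encode(10000), 3))  # peak luminance maps to 1.0
```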

NAB 2017: ETC Panel Tells the Producer’s Perspective on VR

The final panel at ETC’s conference on VR/AR convened producers who have worked on virtual reality projects. John Canning, vice president of the Producers Guild of America’s new media council, moderated the discussion with producers from ETC@USC, StoryTech Immersive, Digital-Reign and The Virtual Reality Company. StoryTech Immersive president and chief storyteller Brian Seth Hurst spoke about his experiences creating “My Brother’s Keeper,” a 360-degree spin-off of PBS’s “Mercy Street.” “We were able to get close and intimate with our actors,” he said. Continue reading NAB 2017: ETC Panel Tells the Producer’s Perspective on VR

Epic Games Demos Real-Time Effects for New Branded Short

In “The Human Race,” a short produced by visual effects house The Mill for Chevrolet, the automaker’s new 2017 Camaro ZL1 races a futuristic Chevy concept car, one driven by a racecar driver and the other by artificial intelligence. The short premiered at the Game Developers Conference to showcase how the car was created via real-time rendering, with the help of the Unreal game engine. Kim Libreri, CTO of Unreal maker Epic Games, demonstrated how aspects of the movie could be changed in real time while it was playing. Continue reading Epic Games Demos Real-Time Effects for New Branded Short

RadicalMedia and Uncorporeal Develop Hologram Experience

RadicalMedia has been working on a project to present “great people” as holograms in venues optimized for augmented reality. Although much of the project is under wraps, more became clear recently when RadicalMedia partnered with Uncorporeal, a volumetric capture startup developing technology to create human holograms that can be used in VR or AR content. Headed by Sebastian Marino, formerly visual effects supervisor on “Avatar,” Uncorporeal’s eight staffers are veterans of Lucasfilm, Weta Digital and Electronic Arts. Continue reading RadicalMedia and Uncorporeal Develop Hologram Experience

Researchers Develop Efficient Way to Render Shiny Surfaces

Computer scientists at UC San Diego have developed an efficient technique for rendering the sparkling, shiny and uneven surfaces of water, various metals and materials such as injection-molded plastic finishes. The team has created an algorithm that improves how CG software reproduces the glinting interaction between light and such surfaces. The team claims the technique is 100 times faster than current state-of-the-art methods, requires minimal computational resources, and works for animation as well as still images. Continue reading Researchers Develop Efficient Way to Render Shiny Surfaces
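
Standard renderers model a rough surface’s microscopic normals with a smooth statistical distribution such as GGX, which averages the sparkle away; the UC San Diego technique instead preserves the discrete normal-map detail (its actual algorithm is not reproduced here). For reference, the smooth GGX baseline is just a few lines:

```python
import math

def ggx_ndf(cos_nh, roughness):
    """GGX/Trowbridge-Reitz normal distribution D(h).

    Gives the density of microfacet normals around the half vector; it is
    smooth by construction, so it averages away the discrete sparkle that
    glint renderers work to preserve.
    """
    a2 = roughness ** 4                    # alpha = roughness^2 convention
    d = cos_nh * cos_nh * (a2 - 1.0) + 1.0
    return a2 / (math.pi * d * d)

# A rough surface spreads highlight energy; a smooth one concentrates it.
print(ggx_ndf(1.0, 0.5))   # moderately rough: broad, dim peak (~5.1)
print(ggx_ndf(1.0, 0.05))  # near-mirror: tall, narrow peak (~50,900)
```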

ILMxLAB Debuts ‘Tatooine’ VR, Develops Darth Vader Projects

Darth Vader is the star of an upcoming Lucasfilm virtual reality project centered on “Star Wars.” Although the project is largely undefined at this point, with no name, genre or release date, what we do know is that the story will both reveal new details about Darth Vader’s background and try out some innovative storytelling techniques. The project is being developed by Lucasfilm’s ILMxLAB, which draws on award-winning VFX facility Industrial Light & Magic, Skywalker Sound and the “Star Wars” story group. Continue reading ILMxLAB Debuts ‘Tatooine’ VR, Develops Darth Vader Projects

CryWorks: Disney, Pixar, ILM Vets Launch New VR Company

VFX and CGI veterans Euan Macdonald, Hans Uhlig and Kymber Lim have secured funding led by Michael Bay’s 451 Media Group, 500 Mobile Collective, and WI Harper Group to launch an immersive entertainment company called CryWorks, with plans to produce virtual and augmented reality experiences. “Although there are a few high-quality VR content pieces to date, most of them have little incentive for the viewer to keep tuning back in,” said Macdonald. “We see an opportunity to build the first VR broadcast network, partnering with other production companies and creating addictive, episodic experiences.” Continue reading CryWorks: Disney, Pixar, ILM Vets Launch New VR Company

NAB 2016: Sphericam and Liquid Cinema Look to Advance VR

Two companies at last week’s NAB Show, Sphericam and Liquid Cinema, are making interesting contributions to the advancement of VR cinema. Sphericam is preparing to launch a six-sensor, four-microphone spherical camera the size of a baseball into the prosumer market. The camera can stitch internally at 30 fps and, with an attached PC, output 60 fps live video. Liquid Cinema has developed a comprehensive yet simple-to-use software package for editing VR footage, adding effects and, most interestingly, re-establishing the director’s intent for where viewers should look at cut points within the video. Continue reading NAB 2016: Sphericam and Liquid Cinema Look to Advance VR
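
Spherical cameras and VR editors both traffic in equirectangular frames, where every view direction maps to a pixel in a 2:1 image; one simple way a tool can re-center the viewer at a cut is to add a yaw offset in that mapping. A hedged sketch (the function and parameters here are illustrative, not Liquid Cinema’s API):

```python
import math

def direction_to_equirect(x, y, z, width, height, yaw_offset=0.0):
    """Map a unit view direction to equirectangular pixel coordinates.

    `yaw_offset` (radians) rotates the whole panorama -- the kind of
    adjustment an editor could apply at a cut point to steer where the
    viewer is looking when the next shot begins.
    """
    lon = math.atan2(x, z) + yaw_offset        # longitude around the sphere
    lat = math.asin(max(-1.0, min(1.0, y)))    # latitude, -pi/2 to pi/2
    u = ((lon / (2 * math.pi)) + 0.5) % 1.0    # wrap horizontally
    v = 0.5 - lat / math.pi                    # top of frame = straight up
    return u * (width - 1), v * (height - 1)

# Looking straight ahead (+z) lands at the center of a 4096x2048 frame.
print(direction_to_equirect(0.0, 0.0, 1.0, 4096, 2048))
```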

Cloud Conference: Moving From Local to Cloud Infrastructure

ConductorIO VP of business development and operations Monique Bradshaw talked about the paradigm change from local, on-premises infrastructure to the cloud. “A paradigm shift means a fundamental change in approach or underlying assumptions,” she said during an ETC Cloud Innovation Conference keynote at NAB. “We’re seeing a big change in the ways that companies are looking at their rendering.” She noted that 90 percent of survey respondents expect to have at least some of their rendering in the cloud within five years, up from close to 60 percent today. Continue reading Cloud Conference: Moving From Local to Cloud Infrastructure

Cloud Conference: Challenges to Rendering VFX in the Cloud

Visual effects and rendering in the cloud was the topic of an ETC Cloud Innovation Conference panel at NAB 2016, moderated by Google Cloud Platform senior product manager Srikanth Belwadi. The scope of the issue was made clear by the fact that “The Good Dinosaur” required 110 million compute hours and 300 TB of active data space. Panelists from Thinkbox, Shotgun, Rodeo FX, Avere Systems and ConductorIO discussed the challenges of producing VFX in the cloud, as well as its inevitability. Continue reading Cloud Conference: Challenges to Rendering VFX in the Cloud
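
A little arithmetic makes that scale vivid; the farm sizes below are hypothetical, chosen only to show why bursting render capacity into the cloud is attractive:

```python
CORE_HOURS = 110_000_000           # reported for "The Good Dinosaur"

for cores in (1, 10_000, 50_000):  # hypothetical render farm sizes
    days = CORE_HOURS / cores / 24
    print(f"{cores:>6} cores -> {days:>12,.0f} days of wall-clock time")

# One core would need roughly 12,500 years; even 50,000 cores running
# flat-out still need about 92 days -- the elasticity argument for
# moving render jobs to the cloud.
```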