NAB Panel to Cover Digital Deliverable & Archive Standards

The move to digital has created major challenges for content creators and owners. As part of the Birds of a Feather program at the NAB Show in Las Vegas next week, Erik Weaver of HGST and an all-star panel representing key industry working groups will discuss “Redefining Deliverable and Preservation Standards for Digital Times.” The panel is scheduled for Monday, April 8, 12:00-1:00 pm in N243. Guests will include Jesse Korosi of SIM Digital, Mary Yurkovic of MESA, Seth Levenson of The Entertainment Technology Center@USC, John Hurst of CineCert, and Craig Seidel of MovieLabs.

HPA Tech Retreat: Ways That M&E Is Embracing the Cloud

Western Digital global head of M&E/telco strategy Erik Weaver led a discussion with three other experts about where the media and entertainment industry stands today in its slow-burn adoption of the cloud for production and post. Avid chief technology officer Tim Claman; Google Cloud global lead, entertainment industry solutions, Buzz Hays; and Microsoft global technology strategist Marco Rota described their perspectives and activities related to the various ways that media and entertainment companies have embraced the cloud.

Movie Studios Creating 3D Digital Scans to Preserve Actors

Next year, an Amy Winehouse hologram will go on tour to raise money for an eponymous charity. She’s the latest in a line of deceased performers, from Carrie Fisher in “Rogue One: A Star Wars Story” to Paul Walker in the “Fast & Furious” franchise, appearing as virtual 3D replicas. Now, some actors and studios are getting a jump on post-life value by creating 3D digital scans. Industrial Light & Magic recently scanned Ingvild Deila, who played Princess Leia in “Rogue One.” She calls it “a safe bet for the people with the money.”

The Reel Thing: Machine Learning Powers Restoration Engine

During last week’s The Reel Thing at the Academy’s Linwood Dunn Theater in Hollywood, Video Gorillas managing director/chief executive Jason Brahms, formerly a Sony Cloud Media Services executive, and chief technology officer Alex Zhukov described the Bigfoot “Frame Compare” solution, which leverages machine learning to speed up preservation, asset management, and mastering workflows. The engine, whose development dates back to 2007, relies on a proprietary, patented technology called the frequency domain descriptor (FDD).
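Video Gorillas has not published how FDD works, but the general idea of a frequency-domain frame signature can be sketched briefly. The descriptor below, built from low-frequency DCT coefficients in the spirit of a perceptual hash, and its bit-distance threshold are illustrative assumptions, not the patented technology:

```python
# Illustrative only: a generic frequency-domain frame signature, not the
# patented FDD inside Bigfoot. Two grayscale frames are compared by hashing
# the low-frequency terms of their 2D DCT (the idea behind perceptual hashes).
import numpy as np
from scipy.fftpack import dct

def frame_descriptor(frame: np.ndarray, size: int = 32, keep: int = 8) -> np.ndarray:
    """Build a 64-bit binary descriptor from a 2D grayscale frame."""
    h, w = frame.shape
    frame = frame[: h - h % size, : w - w % size]          # crop to a multiple of 'size'
    small = frame.reshape(size, h // size, size, w // size).mean(axis=(1, 3))
    coeffs = dct(dct(small, axis=0, norm="ortho"), axis=1, norm="ortho")
    low = coeffs[:keep, :keep].flatten()                    # keep only low frequencies
    return (low > np.median(low)).astype(np.uint8)

def frames_match(a: np.ndarray, b: np.ndarray, max_bits: int = 10) -> bool:
    """Frames 'match' if their descriptors differ in only a few bits (Hamming distance)."""
    return int(np.count_nonzero(frame_descriptor(a) != frame_descriptor(b))) <= max_bits
```

Comparing compact signatures like these, rather than raw pixels, is what makes it practical to line up differing versions of a title frame by frame at scale.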

Blackmagic Design Debuts DaVinci Resolve 15 with VFX Tools

Blackmagic Design’s release of DaVinci Resolve 15 adds Fusion, a visual effects tool often used on Hollywood films, to its professional-level color correction and audio editing application. The full Studio release of Resolve 15 costs $300, while a stripped-down version that still includes the most important features is free. That compares with Adobe Creative Cloud, which costs more than $50 per month. DaVinci Resolve 15 is platform-agnostic, running on macOS, Windows 10 and Linux, and offers four modules in one app.

Nvidia Ray-Tracing Technology a Quantum Leap in Rendering

At SIGGRAPH 2018, Nvidia debuted its new Turing architecture featuring ray tracing, a rendering technique, for professional and consumer graphics cards. Considered the Holy Grail by many industry pros, ray tracing works by modeling rays of light as they intersect with objects in a scene, which makes it ideal for creating photorealistic lighting and VFX. Until now, doing this in real time has been out of reach because it requires an immense amount of expensive computing power; even with Turing, Nvidia’s top professional card costs $10,000.
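As a rough illustration of what “modeling rays of light as they intersect with objects” means in practice, the toy function below tests a single ray against a single sphere; a real renderer (and Nvidia’s dedicated RT cores) performs billions of such intersection tests and then shades each hit point. The function name and values are hypothetical:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along a normalized ray to its first hit on a sphere, or None."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c               # quadratic 'a' is 1 for a unit direction
    if disc < 0:
        return None                      # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None          # nearest hit in front of the ray origin

# A ray fired down the z-axis hits a unit sphere centered 5 units away at t = 4.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))   # -> 4.0
```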

Nvidia Quadro RTX Chips Offer AI and Real-Time Ray Tracing

Nvidia unveiled its new Turing architecture during a keynote at SIGGRAPH 2018, along with three new Quadro RTX workstation graphics cards aimed at professionals. Nvidia calls the Turing architecture its “greatest leap since the invention of the CUDA GPU in 2006.” The RTX chips are the first to support the company’s real-time ray tracing rendering method, which results in more realistic imagery. Also at SIGGRAPH, Porsche showed off car designs created with Epic Games’ Unreal Engine and Nvidia’s RTX chips.

NAB 2018: Machine Intelligence Toolsets in Video Workflows

Although using AI and machine learning tools in production may remain a lofty goal for some, such tools are already in use in some video workflows, from dailies through mastering. In a panel discussion moderated by Netflix coordinator, production technologies Kylee Peña, speakers described the tools available and how they’re being used in real-world applications. Google senior cloud solutions architect Adrian Graham described his company’s now open-sourced TensorFlow technology and how it’s being used by the M&E industry.
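As one hedged example of the kind of task the panel discussed, rather than Google’s actual pipeline, the sketch below uses open-source TensorFlow and a pretrained image classifier to auto-tag a single video frame; the function name and frame sizes are illustrative assumptions:

```python
# A minimal sketch of ML-assisted content tagging: classify one video frame
# with a pretrained ImageNet model shipped with TensorFlow/Keras.
import numpy as np
import tensorflow as tf

model = tf.keras.applications.MobileNetV2(weights="imagenet")

def tag_frame(frame_rgb: np.ndarray, top: int = 3):
    """Return the top ImageNet labels for one RGB video frame (H x W x 3, uint8)."""
    img = tf.image.resize(frame_rgb, (224, 224))                   # model's input size
    img = tf.keras.applications.mobilenet_v2.preprocess_input(img)
    preds = model.predict(img[tf.newaxis, ...], verbose=0)
    return tf.keras.applications.mobilenet_v2.decode_predictions(preds, top=top)[0]

# Example call on a synthetic frame (in practice, frames decoded from dailies).
print(tag_frame(np.random.randint(0, 255, (1080, 1920, 3), dtype=np.uint8)))
```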

NAB 2018: Artificial Intelligence Tools for Animation and VFX

Tools powered by artificial intelligence and machine learning can also be used in animation and visual effects. Nvidia senior solutions architect Rick Grandy noted that the benefit of such tools is that artists don’t have to replicate their own work. Examples include deep learning used for realistic character motion created in real time via game engines and AI, as well as a phase-functioned neural network for character control, in which the network can be trained on motion capture or animation data.
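The published phase-functioned neural network blends its weights according to a gait phase; the toy model below is only a stand-in for that idea, mapping a phase encoding plus the current pose to the next pose and training on motion-capture frames. The pose dimensions, layer sizes and synthetic data are assumptions for illustration:

```python
# A toy stand-in for phase-aware character control, not the published PFNN:
# predict the next pose from (sin/cos of gait phase, current pose).
import numpy as np
import tensorflow as tf

POSE_DIMS = 93   # e.g., 31 joints x 3 rotation values; purely illustrative

inputs = tf.keras.Input(shape=(POSE_DIMS + 2,))        # pose + phase encoding
x = tf.keras.layers.Dense(256, activation="elu")(inputs)
x = tf.keras.layers.Dense(256, activation="elu")(x)
next_pose = tf.keras.layers.Dense(POSE_DIMS)(x)
model = tf.keras.Model(inputs, next_pose)
model.compile(optimizer="adam", loss="mse")

def make_training_pairs(mocap: np.ndarray, phase: np.ndarray):
    """Pair each mocap frame (plus its phase encoding) with the following frame."""
    phase_enc = np.stack([np.sin(phase[:-1]), np.cos(phase[:-1])], axis=1)
    x = np.concatenate([mocap[:-1], phase_enc], axis=1)
    return x, mocap[1:]

# Train on hypothetical mocap data: rows are frames, columns are pose values.
mocap = np.random.randn(1000, POSE_DIMS).astype("float32")
phase = np.linspace(0, 20 * np.pi, 1000).astype("float32")
x_train, y_train = make_training_pairs(mocap, phase)
model.fit(x_train, y_train, epochs=2, batch_size=32, verbose=0)
```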

Startup Using AI to Help Create Effects for Movies, TV, Games

Palo Alto-based startup Arraiy is developing methods for automating part of the often-tedious process of producing visual effects for movies, TV shows and video games. “Filmmakers can do this stuff, but they have to do it by hand,” said CTO Gary Bradski, who has worked with tech companies such as Intel and Magic Leap. The Arraiy team, led by Bradski and CEO Ethan Rublee, “are building computer algorithms that can learn design tasks by analyzing years of work by movie effects houses,” reports The New York Times. “That includes systems that learn to ‘rotoscope’ raw camera footage, carefully separating people and objects from their backgrounds so that they can be dropped onto new backgrounds.”
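Arraiy’s systems are proprietary, but the general approach the Times describes, learned segmentation that pulls a matte separating a person from the background, can be sketched with an off-the-shelf model; the pretrained DeepLabV3 network and the compositing step here are stand-ins, not Arraiy’s technology:

```python
# Illustrative only: extract a person matte from one frame with a pretrained
# segmentation network, the building block of ML-assisted rotoscoping.
import numpy as np
import torch
from torchvision import transforms
from torchvision.models.segmentation import deeplabv3_resnet50

model = deeplabv3_resnet50(pretrained=True).eval()
PERSON_CLASS = 15                      # 'person' in the Pascal VOC label set

prep = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def person_matte(frame_rgb: np.ndarray) -> np.ndarray:
    """Return a 0/1 matte marking person pixels in an H x W x 3 uint8 frame."""
    with torch.no_grad():
        logits = model(prep(frame_rgb).unsqueeze(0))["out"][0]   # (classes, H, W)
    return (logits.argmax(0) == PERSON_CLASS).numpy().astype(np.uint8)

# The matte can then be used to drop the subject onto a new background:
# composite = frame * matte[..., None] + new_bg * (1 - matte[..., None])
```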

Adobe Experiments With Easy Object Removal Tool for Video

Adobe’s research team is working on a visual effects tool, codenamed Cloak, for easy and economical removal of rigs, power lines and other unwanted parts of an image. The tool is similar to Photoshop’s content-aware fill feature that lets the user select and delete unwanted elements in the image, with “intelligent” software filling in the missing background behind them. Doing the same thing with video, however, is more challenging, which is why Cloak is still in an experimental stage, with no release date slated.
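Cloak itself has not been released, but a rough per-frame stand-in for content-aware removal can be sketched with classical inpainting in OpenCV; the mask handling and file paths below are assumptions, and this approach lacks the temporal consistency that makes the video version of the problem hard:

```python
# A rough stand-in for content-aware removal, not Adobe's Cloak: fill a masked
# region of each frame from surrounding pixels using OpenCV's Telea inpainting.
import cv2
import numpy as np

def remove_object(frame_bgr: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Fill the region marked by a nonzero 8-bit mask using nearby background."""
    return cv2.inpaint(frame_bgr, mask, 5, cv2.INPAINT_TELEA)

def remove_from_clip(in_path: str, out_path: str, mask: np.ndarray) -> None:
    """Apply the same static mask (e.g., a fixed rig or power line) to every frame."""
    cap = cv2.VideoCapture(in_path)
    fps = cap.get(cv2.CAP_PROP_FPS)
    w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    out = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))
    ok, frame = cap.read()
    while ok:
        out.write(remove_object(frame, mask))
        ok, frame = cap.read()
    cap.release()
    out.release()
```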

Sony Pictures Masters Classic Films in High Dynamic Range

At AMIA’s The Reel Thing conference in Hollywood, Sony Pictures Entertainment senior vice president of technology for production and post production Bill Baggelaar presented a session on HDR video mastering for classic cinema. He first hoped to dispel myths about high dynamic range. “I’ve heard that you need sunglasses to watch HDR, that filmmakers will hate it and that it will be too hard to deliver,” he said. “People also worry that there are too many formats, with HDR10, Dolby Vision, HDR10+ and HLG.”

NAB 2017: ETC Panel Tells the Producer’s Perspective on VR

The final panel at ETC’s conference on VR/AR convened producers who have worked on virtual reality projects. Producers Guild of America vice president, new media council John Canning moderated the discussion with producers from ETC@USC, StoryTech Immersive, Digital-Reign and The Virtual Reality Company. StoryTech Immersive president/chief storyteller Brian Seth Hurst spoke about his experiences creating “My Brother’s Keeper,” a 360-degree video spin-off of PBS’s “Mercy Street.” “We were able to get close and intimate with our actors,” he said.

Epic Games Demos Real-Time Effects for New Branded Short

In “The Human Race,” a short produced by visual effects house The Mill for Chevrolet, the automaker’s new 2017 Camaro ZL1 races a futuristic Chevy concept car, one driven by a racecar driver and the other by artificial intelligence. The short premiered at the Game Developers Conference to showcase how the car was created via real-time rendering with the help of the Unreal game engine. Unreal maker Epic Games CTO Kim Libreri demonstrated how aspects of the movie could be changed in real time while it was playing.

RadicalMedia and Uncorporeal Develop Hologram Experience

RadicalMedia has been working on a project to present “great people” as holograms in venues optimized for augmented reality. Although much of the project is under wraps, more became clear recently when RadicalMedia partnered with Uncorporeal, a volumetric capture startup developing technology to create human holograms that can be used in VR or AR content. Headed by Sebastian Marino, formerly visual effects supervisor on “Avatar,” Uncorporeal’s eight staffers are veterans of Lucasfilm, Weta Digital and Electronic Arts.