Must-See VR at Sundance: Volumetric Capture, Empathic Tales

At the Sundance Film Festival, the latest in virtual reality appears at New Frontier, the Sundance Institute’s showcase for VR, immersive cinematic works, and media lab innovations. 2016 is no exception, and the buzz is already out about some of the festival’s must-see virtual reality movies and experiences. Among the recommendations is “The Wasteland” from 8i; other notable VR experiences at Sundance include “American Bison,” “Kiya” and “Wave of Grace.”

CES 2016: Virtual Reality Headgear, Platforms, Ecosystems

Virtual reality began to dominate headlines at last year’s CES, and that drumbeat will only grow louder this year. With the show falling at the beginning of the proclaimed release window for two flag bearers of the VR renaissance, Oculus and Sony, and with a bumper crop of hardware and software suppliers on hand, VR is ready to emerge from the shadows of the early adopters and meet the general consumer. We expect to see innovative products in January from companies such as Google, GoPro, HTC, NextVR, Sixense, Yezz and others. Beyond gaming and 360-degree video, we’ll be watching for new approaches to live streaming sports and music events.

CES 2016: VR, Game Platforms, eSports Coming to Las Vegas

The increasing interest in virtual reality and the rise of app stores are expected to help make gaming a hot topic at January’s CES. Coming off its crowd-pleasing debut at last year’s show, and with its consumer launch only months away, Oculus is expected to monopolize the floor, and with it comes an array of manufacturers hoping to add a new dimension to the immersive gameplay experience with new peripherals. Meanwhile, the widespread adoption of game systems as entertainment hubs has created a new pipeline for indie developers to reach consumers. And let’s not forget the rise of eSports. Turner Broadcasting certainly hasn’t, and will be bringing its largest-ever showcase to prove it.

SIGGRAPH 2015: Virtual Production, Cousin of Virtual Reality

At SIGGRAPH 2015, Autodesk executives David Morin and Ben Guthrie described virtual production, its relationship with virtual reality, and some newly released tools from their company to aid in the process. Virtual production began with Peter Jackson’s “Lord of the Rings,” got a bump of recognition with “Avatar,” and has been used on many films since. According to Morin and Guthrie, the process, which lets filmmakers create virtual worlds in-camera and composite CG and live action on set, is gaining momentum.

Industrial Light & Magic Creates VR/AR Projects in ILMxLAB

Earlier this summer, we reported that Industrial Light & Magic was launching its new ILMxLAB division to develop virtual reality and augmented reality experiences for movie fans. The experimental division has now shown off three of its VR and AR proof-of-concept projects. In addition, ILMxLAB houses a team from Walt Disney Imagineering that is working on futuristic Disneyland attractions. None of it would be possible without ILM’s unique blend of creative staff, cutting-edge technology and years of expertise.

SIGGRAPH: Faceware Unveils Live Capture for Gaming Engine

At SIGGRAPH 2015 in Los Angeles, Faceware Technologies, which creates markerless 3D facial motion capture solutions, demonstrated its Faceware Live plugin for Epic Games’ Unreal Engine 4. With the plugin, UE4 developers will be able to capture facial movements from any video source and apply them immediately to digital characters, with the resulting animation displayed in real time in Unreal’s Persona animation system. The plugin was shown behind closed doors at the conference.
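
Without access to the plugin itself, the general pattern behind this kind of real-time setup is easy to picture: a tracker turns each video frame into a set of expression coefficients, and those coefficients are copied onto the character rig as blend-shape (morph target) weights every tick. The minimal Python sketch below illustrates that loop only; track_frame(), the Character class and the blend-shape names are hypothetical stand-ins, not Faceware’s or Epic’s actual APIs.

```python
# Minimal sketch of a real-time facial-capture loop of the kind described
# above. track_frame(), Character and the blend-shape names are hypothetical
# stand-ins, not Faceware's or Epic's actual APIs.
import time
from typing import Dict, Iterable

def track_frame(video_frame) -> Dict[str, float]:
    """Pretend markerless tracker: returns expression coefficients in [0, 1]."""
    # A real tracker would analyze the video frame here.
    return {"jaw_open": 0.4, "brow_raise_left": 0.1, "smile_right": 0.7}

class Character:
    """Stands in for an engine-side character with morph targets."""
    def __init__(self) -> None:
        self.blend_shapes: Dict[str, float] = {}

    def set_blend_shape(self, name: str, weight: float) -> None:
        # Clamp to the usual 0..1 morph-target range.
        self.blend_shapes[name] = max(0.0, min(1.0, weight))

def capture_loop(video_source: Iterable, character: Character, fps: float = 30.0) -> None:
    """Each frame: track the face, then copy the coefficients onto the rig."""
    frame_time = 1.0 / fps
    for frame in video_source:
        for shape, weight in track_frame(frame).items():
            character.set_blend_shape(shape, weight)
        time.sleep(frame_time)  # in an engine this would be the render tick
```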

Microsoft Details How to Shoot for its HoloLens AR Headset

Microsoft’s HoloLens augmented reality headset allows video, which can be streamed over the Internet, to be viewed from any angle, combining the real world with computer-generated imagery. Whereas a digital object can be rendered in 3D and easily shown from any angle, live action isn’t so accommodating. To that end, the company has just come out with a document giving specific directions on how to capture and handle live-action footage for use with its AR headset.

Digital Domain and Immersive Media Join Forces for VR Content

Digital Domain Holdings Ltd. and Immersive Media are launching a joint venture called IM360. According to the two companies, IM360 plans to produce immersive content and services, including virtual reality content, by combining Digital Domain’s CGI and motion capture expertise with Immersive’s 360-degree video hardware and software. The 360-degree video tech is already being used for live-streaming VR content, and the software can send video to devices including tablets, smartphones and VR headsets.

Startup Demos New Eye Tracking Virtual Reality HMD at CES

Japanese startup FOVE has developed a virtual reality head-mounted display with built-in eye tracking. The eye tracking enables the software to render the areas outside the viewer’s gaze at lower precision than the area the viewer is actually looking at, allowing for more efficient use of CPU/GPU resources. In the demo, the rendered image was very clear and the eye tracking worked well. When the headset goes to market, a FOVE rep expects the HMD with head tracking to sell for $400-$450.
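
The efficiency gain comes from what is commonly called foveated rendering: shade at full quality only near the tracked gaze point and progressively more cheaply toward the periphery. The rough Python sketch below shows the core idea; the tile grid, radii and shading rates are illustrative assumptions, not FOVE’s actual parameters.

```python
# Rough sketch of foveated rendering's core idea: pick a cheaper shading
# rate for screen tiles far from the tracked gaze point. The tile grid,
# radii and rates are illustrative assumptions, not FOVE's actual values.
import math

def shading_rate(tile_center, gaze, full_radius=0.15, mid_radius=0.35):
    """Samples per pixel for a tile, given normalized screen coordinates."""
    dist = math.dist(tile_center, gaze)
    if dist < full_radius:   # foveal region: full quality
        return 1.0
    if dist < mid_radius:    # near periphery: half the samples
        return 0.5
    return 0.25              # far periphery: quarter of the samples

def build_rate_map(gaze, tiles_x=16, tiles_y=9):
    """Per-tile shading rates for a screen divided into tiles_x * tiles_y tiles."""
    return [
        [shading_rate(((x + 0.5) / tiles_x, (y + 0.5) / tiles_y), gaze)
         for x in range(tiles_x)]
        for y in range(tiles_y)
    ]

# Example: viewer looking slightly left of screen center.
rates = build_rate_map(gaze=(0.4, 0.5))
```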

Virtual Reality Helmet Could Redefine In-Flight Entertainment

The United States Patent and Trademark Office granted leading aircraft manufacturer Airbus a patent in August for headrests that include helmets attached to a carrier. The headrests will relieve passenger stress by providing entertainment in addition to “sensorial isolation with regard to the external environment.” They will play immersive multimedia content on glasses with display screens that are capable of “holographic projection mode,” according to the patent.

Kickstarter-Funded Lynx 3D Camera Works as Advertised

Since a group of students from the University of Texas unveiled the Lynx A 3D camera project on Kickstarter earlier this year, it has raised far more than its $50,000 fundraising goal. And as of last week, Lynx Laboratories was getting ready to ship the first cameras. At the DEMO Mobile SF event, Engadget had an opportunity to try a prototype, watching firsthand as the camera scanned a person’s head in real time.

Gollum Actor Andy Serkis on Changes in Motion-Capture Technology

In a video interview with Wired, actor Andy Serkis (who plays the computer-generated character of Gollum in the popular “Lord of the Rings” trilogy and the upcoming “The Hobbit”) speaks about his recurring role and about advances in motion-capture technology.

Since the 2001 release of “Lord of the Rings,” motion capture has changed significantly, “bridging some of the ‘disconnect’ [Serkis] felt while filming on separate live-action and motion-capture stages for the original trilogy,” according to Wired.

Serkis recently co-founded a London-based digital-effects house called The Imaginarium, which specializes in motion capture. In the 2-minute interview, he talks about how developments have changed his acting experience and what it was like to return to the role of Gollum.

“It’s still in its infancy in terms of where it’s going to go and the ability that it gives an actor to transform, while retaining a real emotional sort of truth,” says Serkis of motion-capture technology. “No matter how big, or wacky, or abstract the design of a character, it still is always rooted in this emotional, truthful actor’s performance.”

SoftEther Develops 3D Motion-Capture Figure

  • Japan-based SoftEther has developed a 3D motion-capture figure dubbed Quma, designed to create 3D computer graphics and animations.
  • Quma is essentially a doll with sensors on each of its joints, letting a 3D artist articulate the motion of characters and capture the positioning in a more intuitive manner (hold up an arm, for example, and the 3D figure on your computer screen will mimic the action in real time); see the sketch after this list for the general data flow.
  • The figure simply plugs into a USB port (no drivers or external power required).
  • 3D CG applications for Quma may include video games, robot applications, training and education simulations.
  • A release date and pricing have not yet been announced. The TechCrunch post features a video demo of the figure in action.
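
The workflow described above is essentially a copy operation: each physical joint reports an angle, and that angle is written onto the matching joint of the on-screen skeleton every update. The toy Python sketch below illustrates that mapping; read_joint_angles(), the joint names and the Skeleton class are made-up placeholders for whatever interface the device actually exposes over USB.

```python
# Toy illustration of the Quma workflow described above: read each joint's
# angle from the figure and mirror it onto a digital skeleton every update.
# read_joint_angles(), the joint names and Skeleton are made-up placeholders
# for whatever interface the device actually exposes over USB.
from typing import Dict

JOINTS = ["neck", "l_shoulder", "l_elbow", "r_shoulder", "r_elbow",
          "l_hip", "l_knee", "r_hip", "r_knee"]

def read_joint_angles() -> Dict[str, float]:
    """Pretend USB read: returns each joint's rotation in degrees."""
    return {name: 0.0 for name in JOINTS}  # a real read would poll the device

class Skeleton:
    """Minimal stand-in for a rigged character in a 3D package."""
    def __init__(self) -> None:
        self.rotations: Dict[str, float] = {name: 0.0 for name in JOINTS}

    def apply_pose(self, angles: Dict[str, float]) -> None:
        for joint, angle in angles.items():
            self.rotations[joint] = angle  # drive the rig's joint rotation

def update_tick(skeleton: Skeleton) -> None:
    """One update: pose the on-screen rig to match the physical figure."""
    skeleton.apply_pose(read_joint_angles())
```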

Motion-Capture Technology Improved for Shooting Exteriors

  • Twentieth Century Fox’s “Rise of the Planet of the Apes,” premiering next month, made extensive use of performance capture techniques in its exterior shots.
  • Visual effects house Weta Digital, which developed motion capture techniques for “Avatar,” considers this the next step in the technique’s evolution.
  • “Avatar” was shot largely on enclosed sound stages, while “Rise” shot motion-capture performances on exterior sets.
  • Weta Digital created a portable performance capture rig teamed with special LED markers placed on the actors, enabling the production to take place in broad daylight.
  • “It was like the final step of mixing live-action and digital filmmaking,” explains senior visual effects supervisor Joe Letteri.
  • Actor Andy Serkis, a veteran of several performance-capture films, is developing a studio and academy to teach the technique, with the goal of making it more affordable and accessible even to lower-budget films.

Animation Technology Behind L.A. Noire Game Draws Attention

  • Depth Analysis is getting high praise for the 3D motion-capture technology it used in rendering facial expressions for the new L.A. Noire video game. The Australia-based company is working to perfect a full-body system that will let film directors “drop” actors into their movies.
  • L.A. Noire is the latest game from the Rockstar Games label (Take-Two Interactive Software). Team Bondi and Depth Analysis have been earning positive reviews for the game’s highly stylized, immersive and cinematic production design.
  • Development on the MotionScan 3D motion-capture system, which uses 32 HD cameras positioned at different levels to capture and create a 3D model, began in 2004. The infrastructure, pipeline and capture rig were all developed from scratch.
  • Depth Analysis has found success in realistically recreating the detail of actors’ facial performances for the video game, and plans to extend the system to capturing full-body performances.
  • The company has also been demonstrating the work-in-progress system to filmmakers. L.A. Noire writer-director Brendan McNamara explains that the appeal of the Depth Analysis system is its ability to drop actors into virtually any setting.
  • For those interested in a visual demonstration, CNET has posted a 6-minute GameSpot interview with Oliver Bao, head of R&D for Depth Analysis.
  • According to Bao (in an AWN interview last week): “We’ve managed to reproduce lifelike performances of actors. Getting the data compressed to fit game discs and render back at decent speed and quality have been reasons why this was not possible before. We’ve demonstrated that what you see is what you get; actors have their performances reproduced faithfully to the point that you can lip read what they’re saying in L.A. Noire. This is the first time we’ve allowed gamers to be able to enjoy believable acting on a console.”