NAB 2017: Pre-Conference Sessions Examine Virtual Reality

It was clear from the SMPTE Future of Cinema sessions and the Post Production World sessions on VR that the NAB community has moved beyond defining virtual reality to addressing specific challenges and questions. Industry leaders gathered in Las Vegas to discuss the latest in VR production and post production, covering areas such as audio, video, hardware and more. Discussions during the pre-NAB weekend sessions also addressed compelling issues related to augmented reality, artificial intelligence, deep learning — even ethics, PR and marketing.

Gary Adcock and Blair Paulsen, who each have five years of VR production experience, had prepared over an hour's worth of slides for their Understanding VR Hardware session. They were repeatedly interrupted and redirected by audience questions on such nuts-and-bolts topics as how to sync the cameras in a 360 rig, how to select and mix lenses, which microphone configurations are best for specific shoot environments and for audio manipulation in post, and how to decide where and at what height to place the VR camera rig.


In the AR/VR/AI and Deep Learning session, Lionel Oisel, Technicolor’s principal scientist, said that the production experience will have to become much more intuitive and streamlined before he expects production costs to come down and broad industry adoption of VR.

Mach1 CEO Jacqueline Bosnjak pointed out that everything about VR is evolving rapidly. Planning the project and the workflow is good, she said, but be prepared to constantly iterate the plan up to the point where you deliver the project.

Oisel also mentioned that aligning a press release with a product release can be problematic due to updates in the VR platforms for which you are completing the project. Your press release may state that your project will work on a number of named platforms, but an update to any one of them may prevent your project from running on that named platform — and trigger a PR problem.

Oisel, who is an expert in artificial intelligence, noted that eventually AI will allow the audience to dictate the narrative experience. He also pointed out that the AI could shape the experience to the individual user based on whatever it can determine about the user’s tastes and their current emotional state.


On Saturday afternoon, SMPTE's Patrick Griffis moderated a discussion titled "Do Consumers Really Care About Artistic Intent?" That discussion bounced between the cases of the controlled theatrical environment and the uncontrolled consumer device environment for produced, non-interactive linear content. With AR, VR and AI, the discussion circles back to what constitutes the work of art, who the artists are, and what the definition and boundaries of artistic intent are.

In response to an audience question, no one on the panel was aware of any laws, guidelines or codes of ethics that could help companies determine how best to manage consumer data in a way that satisfies consumer concerns. VR, AR and AI provide a more responsive and personalized experience when they can track and respond to every head motion, every consumer response to a sound, every interactive choice, and any other bit of sensory and position data captured by the equipment while the consumer is interacting with the piece.

The Entertainment Technology Center is aware of some efforts around the globe to construct guidelines and codes of ethics for managing user data in VR, AR and AI for commercial (as opposed to academic) situations. The ETC could play a role in shaping and publicizing those efforts.
