VR leaders gathered for day two of the Game Developers Conference in San Francisco last week. Many of the talks addressed techniques for working in a medium in which you present a story and a world to the ‘visitor,’ but have limited control over how the visitor experiences them. The “Job Simulator” team created microstories bounded by story pinch points within a macrostory. The HBO “Westworld” VR and Baobab Studios teams rewarded visitors for taking actions that advance the story, but embedded triggers that advance it anyway when a visitor misses the cues. The “Trials on Tatooine” team learned that understanding and accommodating visitors with varying physical abilities can not only improve user experience design, but also inform story development.
Amy Claussen opened the day by stating that we are transitioning our media experience from “they” to “I” through community and social interaction.
Eric Darnell, CCO of Baobab Studios, posited that “with VR storytelling we can turn audience empathy into action and compassion.” Baobab maintains control of the spine of the story. The visitor can exercise agency and be acknowledged or ‘thanked’ for furthering the story, or the visitor can act in other ways and the characters will advance the plot on their own.
Building in features to accommodate people with different abilities was the focus of the talk by ILMxLAB producer Hannah Gillis and engineer Ben Peck, who discussed creating the “Star Wars: Trials on Tatooine” VR experience. The height of barriers in the battle sequence had to adjust dynamically to the detected sight line of someone in a wheelchair. People with red/green color blindness needed additional cues to know which buttons to press. A ‘tourism mode’ was added so people who may not be able to navigate through the piece can still get a sense of it. (A comprehensive list of guidelines is available online.)
These accessibility considerations can also inform story development. “Zootopia” used multiscale design for the different sized animals and their communities. (The complete ILMxLAB presentation deck is available online.)
Thomas Bedenk, a consultant at Exozet in Berlin, discussed a 2017 University of Vienna study that found that people’s real-life prosocial behavior decreases after they are socially excluded by avatars. Another study found that 83 percent of introverts find it easier to form relationships in VR than in the real world.
Bedenk postulated that in the future we would be mechanical Turks in VR worlds, jumping from experience to experience. The act of rapidly switching and populating landscapes will be a new form of VR experience.
Colin Foran, the creative lead on HBO’s “Westworld” VR experience, spoke about allowing visitors to create microstories within the macrostory you want to tell. Don’t worry about narrative pacing, and don’t punish people for not advancing the story fast enough, he said. Give the person agency to start the next task. Foran uses a lot of avatar ‘dynamic stare and point’ to connect emotionally with visitors and direct their actions.
Owlchemy Labs CEO/chief scientist Alex Schwartz and CTO Devin Reimer discussed the process behind creating “Job Simulator” and the upcoming “Rick and Morty” VR experience. Everyone on the team contributes to the writing. Anyone passing the whiteboards can add elements and make edits. It is semi-structured chaos. They create an overarching story, but the tasks can be completed in any order between story pinch points. The Owlchemy team creates bundles of microstories bounded by story pinch points.
Google’s Rob Jagnow closed out the program. In their experiments with social VR, Google teams have found that personal space matters a great deal. Just as in the real world, people don’t like unwelcome invasions of their personal space. He also noted that VR has a greater potential for impactful trolling and abuse than prior media. He recommends creating personal bubbles that can’t be invaded, and rewarding prosocial behavior.