August 20, 2015
Much has been written about the more obvious issues in viewing virtual reality. Top of the list is the motion sickness that some people get, a result of the mismatch between what they see and what they feel. But there’s another issue — an eye-focusing problem dubbed “vergence-accommodation conflict” — that is specific to virtual reality and is much more difficult to overcome than motion sickness. At SIGGRAPH 2015, scientists from Stanford and UC Berkeley described potential solutions.
According to Wired, Stanford University professor Gordon Wetzstein and UC Berkeley vision scientist Martin Banks both talked about the specifics of vergence-accommodation conflict and proposed some answers. First, a definition: Vergence is the simultaneous movement of both eyes in opposite directions to maintain single binocular vision.
Closely connected to vergence, accommodation is the process by which each eye's lens "accommodates," changing shape to focus at different distances. "The visual system has developed a circuit where the two responses talk to each other," says Banks.
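The geometry behind the conflict is simple to illustrate. The sketch below uses assumed numbers (a typical 63 mm interpupillary distance) to show how much the eyes must rotate inward to fixate objects at different depths; in a headset, the screen sits at one fixed focal distance, so accommodation stays put while vergence tracks the rendered depth of each virtual object.

```python
import math

def vergence_angle_deg(distance_m, ipd_m=0.063):
    """Angle between the two lines of sight when fixating a point
    at distance_m, for eyes separated by ipd_m (63 mm is a typical
    adult interpupillary distance -- an assumed, illustrative value)."""
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

# Nearby objects demand far more vergence than distant ones, while a
# fixed-focus headset screen asks accommodation to stay constant.
for d in (0.5, 2.0, 10.0):
    print(f"{d:>5.1f} m -> {vergence_angle_deg(d):.2f} deg of vergence")
```

A half-meter fixation point requires several degrees of convergence, while a ten-meter one requires a fraction of a degree; the mismatch with a constant focal cue is the conflict the researchers describe.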
When a user dons a virtual reality headset such as Oculus Rift or Samsung Gear VR, however, the 3D image, achieved by offsetting the right and left eye images, can decouple accommodation and vergence, with dire results: fatigue, discomfort and motion sickness.
At SIGGRAPH, Wetzstein and his colleagues, including Nvidia (which is a collaborative partner in the project), showed a new head-mounted display that lessens the vergence-accommodation conflict by creating light fields, “or 3D patterns of light rays that mimic light bouncing off objects in the real world.” The headset is based on two stacked LCDs and an algorithm that divides an image between them.
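The article does not detail the algorithm, but dividing a light field across two attenuating LCD layers is commonly posed as a nonnegative factorization problem: each ray's intensity is approximated by the product of the transmittances of the front and rear pixels it passes through. The toy sketch below (assumed, simplified model; not the authors' actual code) factors a 2D "light field" matrix into a single pair of layer patterns via standard multiplicative updates.

```python
import numpy as np

rng = np.random.default_rng(0)
target = rng.random((16, 16))   # toy nonnegative target: ray (i, j) intensity

# Each ray indexed by (front pixel i, rear pixel j) is modeled as the
# product of the two layer transmittances: approx[i, j] = front[i] * rear[j].
front = np.ones(16)             # front LCD layer transmittance (assumed init)
rear = np.ones(16)              # rear LCD layer transmittance

# Multiplicative least-squares updates -- a rank-1 simplification of the
# factorizations used in layered light-field displays. Real systems
# time-multiplex several such frames to raise the effective rank.
for _ in range(200):
    front *= (target @ rear) / (front * (rear @ rear) + 1e-9)
    rear *= (target.T @ front) / (rear * (front @ front) + 1e-9)

err = np.linalg.norm(target - np.outer(front, rear)) / np.linalg.norm(target)
print(f"relative reconstruction error: {err:.3f}")
```

The multiplicative form keeps both layers nonnegative, which matches physical LCD attenuation; the leftover error is what time-multiplexed frames and higher-rank factorizations are meant to absorb.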
Others are working to solve the same problem, including Banks and AR company Magic Leap. Oculus and Samsung also want to solve it, but Wired notes that these companies' current head-mounted displays rely on stereoscopic displays, "a dead end for solving the vergence-accommodation conflict."