NVIDIA LIGHT FIELD STEREOSCOPY VR

Postby Rikk The Gaijin » Fri Jan 29, 2016 4:24 am

At the recently held VRLA Expo, NVIDIA grabbed a lot of attention by demonstrating a research prototype from its ‘Light Field Display’ project. Originally shown at last year’s SIGGRAPH, the project explores another angle on tackling the problems of motion sickness and depth perception. We contacted NVIDIA’s Brian Burke, who followed up with the following statement:

“What we showed at VRLA is a research prototype built in collaboration with Stanford University to explore how future lightfield technologies can improve the VR experience. NVIDIA regularly conducts research on future technologies to help the industry find solutions to tough visual computing challenges. The prototype was first shown at Siggraph 2015, and based on interest from the industry, we showed it again at VRLA.”

Thanks to NVIDIA’s PR team, we received more information about this research project, which we can share with you in this article. First and foremost, this was a small team effort, led by Fu-Chung Huang (computer scientist, NVIDIA), David Luebke (co-founder of NVIDIA Research), and Gordon Wetzstein, assistant professor at Stanford University.

This project takes Brewster’s stereoscope and ‘augment[s] it with modern factored light field synthesis via stacked liquid crystal panels.’ The concept aims to resolve issues on the human-vision side, most notably resolution and retinal blur. If you have tried Samsung’s Gear VR, the Oculus Rift, or the HTC Vive, you will have noticed first-hand that the physical pixels are still visible, which reduces immersion and can contribute to motion sickness.

In order to understand the proposed solution, we need to understand the problem(s) first.

Why Do We Get Motion Sickness?

Approximately 20-25% of the human population suffers from kinetosis, or motion sickness. One leading explanation is that the brain has a built-in defense against neurotoxins: it constantly cross-checks the sensory inputs it receives, and when the vestibular system in the inner ear senses one state of motion while the eyes report another, it treats the mismatch as a symptom of poisoning and responds with nausea. One of our colleagues suffers from motion sickness, and so far he has been unable to use 3D, VR, or AR glasses, since all of them induce the urge to vomit. John Carmack speaks of 20 ms of motion-to-photons latency as the magical barrier we need to reach in order to achieve ‘believable reality’, which in practice means headsets running at a minimum of 90 frames per second. However, even that does not solve the following problem: depth perception.
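
To make the 20 ms figure concrete, here is a back-of-the-envelope Python sketch of a motion-to-photons budget at 90 Hz. The individual component timings are illustrative assumptions, not measurements of any particular headset:

[code]
# Back-of-the-envelope motion-to-photons budget check.
# Component timings are illustrative assumptions, not
# measurements of any real headset.

BUDGET_MS = 20.0  # Carmack's oft-cited latency target

def frame_time_ms(refresh_hz: float) -> float:
    """Time available to render one frame at a given refresh rate."""
    return 1000.0 / refresh_hz

pipeline_ms = {
    "sensor read + fusion": 1.0,             # assumed
    "render one frame": frame_time_ms(90),   # ~11.1 ms at 90 Hz
    "scanout + pixel switching": 6.0,        # assumed
}

total = sum(pipeline_ms.values())
print(f"{total:.1f} ms total vs. {BUDGET_MS:.0f} ms budget ->",
      "OK" if total <= BUDGET_MS else "over budget")
[/code]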

How To Deal With Depth Perception?

The second issue that all of these visual technologies – 3D, VR, and AR – face is depth perception. Stereoscopic displays work by presenting each eye with a slightly different image and relying on the brain to fuse the pair into a single view with depth. A person with amblyopia (‘lazy eye’) or another disorder of binocular vision cannot fuse the two images properly, so stereoscopic 3D / VR / AR content appears flat to them. In other words, someone who cannot distinguish such content may be suffering from an undiagnosed issue with binocular vision or depth perception.

[Image]
Did you ever wonder why this warning is mandatory? ‘Objects in mirror are closer than they appear.’ Credit: Superior Mirrors.

Poor depth perception is also a major factor in automotive accidents. A driver who lacks it may misjudge the position of a vehicle to the side of or behind their own, and cause an accident without ever understanding why, with potentially fatal consequences. Sadly, candidates for driving tests are rarely screened for depth perception in the developed world, let alone in the developing one. This fuels numerous stereotypes and generalizations about drivers of different backgrounds, while little or no effort is made to correct the problem in the real world. In the world of technology, the situation is vastly different: the whole industry is trying to solve it.

NVIDIA And Stanford Collaboration

[Image]
Light Field Stereoscope HMD schematics

NVIDIA’s concept is essentially Brewster’s stereoscope brought into the 21st century. The front of the HMD houses the electronics that drive the backlight and two transparent LCDs, stacked one in front of the other.

[Image]
Light Field Stereoscope HMD in detail

As you can see in the pictures above, the Light Field Stereoscope HMD features two PCBs driving the two displays. Both PCBs are mounted on the same frame as the LED backlight, two spacers, and the two LCD panels. The project was designed to address the following challenge:

“To facilitate comfortable long-term experiences and wide-spread user acceptance, however, the vergence-accommodation conflict inherent to all stereoscopic displays will have to be solved.”
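
In short, the eyes rotate to converge on a virtual object at its apparent depth, but they must keep focusing at the fixed optical distance of the display; that mismatch is what tires the visual system. Here is a minimal numeric Python sketch of the conflict, where the 64 mm interpupillary distance and the 1.2 m focal plane are illustrative assumptions:

[code]
import math

# Minimal sketch of the vergence-accommodation conflict in a
# conventional stereoscopic HMD. All numbers are illustrative
# assumptions, not the specs of any real headset.

IPD_M = 0.064        # interpupillary distance (assumed 64 mm)
FOCAL_PLANE_M = 1.2  # fixed virtual image distance of the optics (assumed)

def vergence_angle_deg(object_dist_m: float) -> float:
    """Angle between the two eyes' lines of sight for an object straight ahead."""
    return math.degrees(2 * math.atan((IPD_M / 2) / object_dist_m))

for d in (0.3, 1.2, 5.0):
    print(f"object at {d:3.1f} m: eyes converge by {vergence_angle_deg(d):5.2f} deg, "
          f"but each eye must stay focused at {FOCAL_PLANE_M} m")
[/code]

At the focal plane the two cues agree; for nearer or farther virtual objects they diverge, and a light field display is one way to let the focus cue follow the object.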

The prototype utilizes NVIDIA GPU capabilities to the full extent, rendering the factored images on the two stacked panels with low-latency orientation tracking, and using the layered display to reproduce approximately correct retinal blur. Given that the prototype and the proposed solution are generating significant traction in the industry, both NVIDIA and Stanford are continuing the research. Implementation in a product, however, is still some way off: while 2017-18 should see the arrival of second- and third-generation VR / AR HMDs, it is doubtful they will use light field technology unless the project reaches production maturity.
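
For the curious, ‘factored light field synthesis’ boils down to solving for two layer patterns whose pixel-by-pixel product, taken along every ray, approximates the target light field. The deliberately tiny Python sketch below factors a 2D light field into two 1D layer patterns with multiplicative updates; it illustrates the principle only and is not the actual 4D GPU solver from the NVIDIA / Stanford paper:

[code]
import numpy as np

# Toy illustration of factored light field synthesis for a two-layer
# display. A ray crossing rear pixel i and front pixel j is attenuated
# by rear[i] * front[j], so the emitted light field is the outer
# product of the two layer patterns. We fit those patterns to a target
# light field with multiplicative (NMF-style) updates.

rng = np.random.default_rng(0)
target = rng.random((32, 32))   # target light field L[i, j] (made-up data)

rear = rng.random(32) + 0.1     # rear LCD pattern (kept positive)
front = rng.random(32) + 0.1    # front LCD pattern (kept positive)

for _ in range(200):
    approx = np.outer(rear, front)
    rear *= (target @ front) / (approx @ front + 1e-9)
    approx = np.outer(rear, front)
    front *= (target.T @ rear) / (approx.T @ rear + 1e-9)

err = np.linalg.norm(target - np.outer(rear, front)) / np.linalg.norm(target)
print(f"relative approximation error after 200 iterations: {err:.3f}")
[/code]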

We’re quite certain that SIGGRAPH and VRLA are just the beginning of experimentation with light fields, especially for VR and 360-degree movies, which are gaining popularity thanks to Google’s Jump open VR camera project with GoPro, as well as Nokia’s OZO, a professional-grade VR camera.