How Does VR Create the Illusion of Reality?

I’ve recently written a loose series of articles explaining certain technical aspects of virtual reality, such as what the lenses in VR headsets do, or why there is some blurriness, but I haven’t, at least not in the last few years, tackled the big question:

How do all the technical components of VR headsets, e.g., screens, lenses, tracking, etc., actually come together to create realistic-looking virtual environments? Specifically, why do virtual environments in VR look more “real” than when viewed via other media, for example panoramic video?

The reason I’m bringing this up again is that the question keeps getting asked, and that it’s really kinda hard to answer. Most attempts to answer it fall back on technical aspects, such as stereoscopy, head tracking, etc., but I find that this approach somewhat misses the point by focusing on individual components, or at least gets mired in technical details that don’t make much sense to those who have to ask the question in the first place.

I prefer to approach the question from the opposite end: not through what VR hardware produces, but through how the viewer perceives 3D objects and environments, and how the real world on the one hand, and virtual reality displays on the other, create the appropriate visual input to support that perception.
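
To make that framing a little more concrete, here is a minimal sketch of the core idea, under simplifying assumptions: each eye receives its own image, rendered from that eye’s exact tracked position, so that binocular disparity and motion parallax fall out automatically. Everything in this sketch, the pinhole projection, the numbers, the function names, is illustrative and not taken from the talk:

```python
# Minimal sketch: project world-space points onto a virtual image plane
# from each eye's exact position. All values here are illustrative.
import numpy as np

IPD = 0.064            # inter-pupillary distance in meters (typical adult value)
PLANE_DISTANCE = 1.0   # distance from each eye to the virtual image plane, meters

def eye_positions(head_pos):
    """Left/right eye positions for a tracked head at head_pos, looking down -z."""
    offset = np.array([IPD / 2.0, 0.0, 0.0])
    return head_pos - offset, head_pos + offset

def project(point, eye):
    """Perspective-project a world-space point onto a plane PLANE_DISTANCE
    in front of the given eye (simple pinhole model)."""
    rel = point - eye                    # point in eye-centered coordinates
    scale = PLANE_DISTANCE / -rel[2]     # -z is the viewing direction
    return rel[:2] * scale               # 2D position on the image plane

head = np.array([0.0, 1.7, 0.0])         # tracked head position
point = np.array([0.2, 1.5, -2.0])       # some object in the scene

left, right = eye_positions(head)
print("left eye sees it at: ", project(point, left))   # slightly different
print("right eye sees it at:", project(point, right))  # per-eye positions

# Move the head: both projections change, reproducing motion parallax.
left2, right2 = eye_positions(head + np.array([0.1, 0.0, 0.0]))
print("after moving, left eye:", project(point, left2))
```

The point of the sketch is that nothing VR-specific happens in the projection itself; the illusion comes from feeding each eye the same visual input it would receive in the real world.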

The downside with that approach is that it doesn’t lend itself to short answers. In fact, last summer, I gave a 25-minute talk about this exact topic at the 2016 VRLA Summer Expo. It may not be news, but I haven’t linked to this video from here before, and it’s probably still timely:

As an added bonus for those who don’t like watching long videos, here’s a breakdown of the entire presentation into chapters:

  • 0:00 — Introduction (skip this)
  • 1:00 — Outline
  • 1:32 — How 3D vision works in the real world
  • 3:31 — How VR simulates real-world 3D vision
  • 4:36 — Handling user movement, or: why head tracking is necessary
  • 5:45 — Embedded (real) video showing head tracking in action
  • 6:25 — How head-mounted VR simulates real-world vision
  • 6:55 — Why head-mounted displays need lenses
  • 10:30 — Lens distortion and how to correct it (a code sketch of this correction follows the list)
  • 12:12 — The need for precise calibration and configuration
  • 13:26 — Configuration parameter: eye relief
  • 13:42 — Configuration parameter: inter-pupillary distance (IPD)
  • 14:20 — Effects of mis-configuration, specifically wrong IPD
  • 15:41 — “Pupil swim” and eye tracking
  • 16:35 — Hardware requirements derived from perceptual process
  • 17:29 — Effects of high display (“motion-to-photon”) latency
  • 18:20 — Effects of “bad” VR: simulator sickness
  • 19:30 — Artificial locomotion and simulator sickness
  • 20:00 — Accommodation-vergence conflict
  • 24:24 — The End!
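
Regarding the lens-distortion chapter at 10:30: the standard approach is to pre-distort the rendered image in software so that the lens’s own (pincushion) distortion cancels it out. Below is a hedged sketch of the common radial-polynomial formulation; this is the general technique, not necessarily the exact model from the talk, and the coefficients K1 and K2 are invented for illustration (real values come from per-headset lens calibration):

```python
# Sketch of radial pre-distortion: for each display pixel, sample the
# undistorted rendered frame at a radially scaled coordinate.
import numpy as np

K1, K2 = 0.22, 0.24  # made-up radial distortion coefficients

def warp(screen_uv):
    """For a pixel at screen_uv on the display (lens-centered, normalized
    coordinates), return where to sample the undistorted rendered frame.
    Sampling further out toward the edges compresses the displayed image
    (barrel distortion); the lens's pincushion distortion then undoes it."""
    r2 = screen_uv[0]**2 + screen_uv[1]**2
    return screen_uv * (1.0 + K1 * r2 + K2 * r2**2)

print(warp(np.array([0.0, 0.0])))  # on the lens axis: unchanged
print(warp(np.array([0.6, 0.0])))  # off-axis: sample pushed outward
```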
