KeckCAVES On Mars, pt. Oh-I-lost-count

Last weekend, we had yet another professional film crew visiting us to shoot video about our involvement in NASA’s still-ongoing Mars Science Laboratory (MSL, aka Curiosity rover) mission. This time, they were here to film parts of an upcoming 90-minute special about Mars exploration for the National Geographic TV channel. Like last time, the “star” of the show was Dawn Sumner, a faculty member in the UC Davis Department of Earth and Planetary Sciences, one of the founding members of KeckCAVES, and a member of the MSL science team.

Unlike last time, we did not film in the KeckCAVES facility itself (due to the demise of our CAVE), but in the UC Davis ModLab. ModLab is part of an entirely different unit, UC Davis’s Digital Humanities initiative, but we are working closely with them on VR development. They have a nice VR environment consisting of two HTC Vive headsets and a large 4.2m x 2.4m screen with a ceiling-mounted ultra-short throw projector (see Figure 1), their VR hardware runs our VR software, and they were kind enough to let us use their space.

Figure 1: Preparation for filming in UC Davis’s ModLab, showing its 4.2m x 2.4m front-projected screen and ceiling-mounted ultra-short throw projector, and two Lighthouse base stations.

The fundamental idea here was to use several 3D models, created or reconstructed from real data sent back either by satellites orbiting Mars or by the Curiosity rover itself, as backdrops to let Dawn talk about the goals and results of the MSL mission, and her personal involvement in it. Figure 1 shows a backdrop in the literal sense of the word, i.e., a 2D picture (a photo taken by Curiosity’s mast camera) with someone standing in front of it, but that was not the idea here (we didn’t end up using that photo). Instead, Dawn spoke while wearing a VR headset and interacting with the 3D models under discussion, with a secondary view of the virtual world, rendered from the point of view of the film camera, shown on the big screen behind her. More on that later. Continue reading

Welcome New VR Users!

Apparently, there were good sales numbers for VR equipment prior to the holiday season, and therefore a host of new VR users are coming in just about now. This meta-post collects a bunch of stuff I’ve written (or presented) in the past that might be of interest to some of those new users. These questions/answers are not hardware-specific, meaning they apply to any current-generation VR system (Oculus Rift, HTC Vive, all the Windows Mixed Reality headsets, PlayStation VR, …), and go beyond basic tech questions such as “how do I plug this in, install drivers, …”.

There is one other issue for which I do not have a full article, but it’s quite important for new users: VR sickness (aka motion sickness, simulator sickness, …). Today’s VR headsets, at least the ones doing full head tracking (that means Rift/Vive et al., and not Gear VR, Oculus Go, Google Cardboard, …), should not cause VR sickness per se. These days, it is primarily caused by artificial locomotion in games or applications, as I explain in the second presentation I linked above.

The important message is: do not attempt to fight through VR sickness! If you try to tough it out, it will only get worse. Stop using VR the moment you feel the first symptoms, take a long break, and then try again if you want to continue with the application/game that made you sick. If you try to power through repeatedly, your body might learn to associate sickness with VR, and that might cause you to get sick even when merely thinking about VR, or smelling the headset, or encountering similar triggers. Just don’t do it.

That’s about it; now go ahead and enjoy your shiny new VR systems!

Want to Know More?

Here are a couple of other, more hardware-specific, topics:

Set-up Instructions for Vrui with HTC Vive Head-mounted Display

It’s been more than two years since the last time I posted set-up instructions for Vrui and HTC Vive, and a lot has changed in the meantime. While Vrui-5.0 and its major changes are still cooking, the current release of Vrui, Vrui-4.6-005, is stable and works very well with the Vive. The recent demise of our CAVE, and our move towards VR headsets until we figure out how to fix it, have driven a lot of progress in Vrui’s set-up and user experience. The rest of this article contains detailed installation and set-up instructions, starting from where my previous step-by-step guide, “An Illustrated Guide to Connecting an HTC Vive VR Headset to Linux Mint 19 (“Tara”),” left off.

If you did not follow that guide and its prerequisite, “An Illustrated Guide to Installing Linux Mint 19 (“Tara”),” this one assumes that you already have:

  • a “gaming” or “VR ready” PC with a powerful Nvidia GeForce graphics card,
  • a full installation of a 64-bit Ubuntu-based Linux operating system, e.g., Ubuntu or Linux Mint, with the MATE desktop environment,
  • proprietary drivers for the Nvidia graphics card installed and working,
  • head-mounted display filtering disabled in the graphics card driver,
  • and a working installation of SteamVR.

If you use a Linux distribution that is not Ubuntu-based, such as my own favorite, Fedora, or another desktop environment such as Gnome Shell or Cinnamon, you will have to make some adjustments throughout the rest of this guide.

This guide also assumes that you have already set up your Vive virtual reality system, including its tracking base stations, and that your Vive headset is connected to your PC via HDMI and USB (I will publish a detailed illustrated guide on that part soon-ish). Continue reading

How Does VR Create the Illusion of Reality?

I’ve recently written a loose series of articles trying to explain certain technical aspects of virtual reality, such as what the lenses in VR headsets do, or why there is some blurriness, but I haven’t — or at least haven’t in a few years — tackled the big question:

How do all the technical components of VR headsets, e.g., screens, lenses, tracking, etc., actually come together to create realistic-looking virtual environments? Specifically, why do virtual environments in VR look more “real” than when viewed via other media, for example panoramic video?

The reason I’m bringing this up again is that the question keeps getting asked, and that it’s really kinda hard to answer. Most attempts to answer it fall back on technical aspects, such as stereoscopy, head tracking, etc., but I find that this approach somewhat misses the point by focusing on individual components, or at least gets mired in technical details that don’t make much sense to those who have to ask the question in the first place.

I prefer to approach the question from the opposite end: not through what VR hardware produces, but instead through how the viewer perceives 3D objects and/or environments, and how either the real world on the one hand, or virtual reality displays on the other, create the appropriate visual input to support that perception.

The downside of that approach is that it doesn’t lend itself to short answers. In fact, last summer, I gave a 25-minute talk about this exact topic at the 2016 VRLA Summer Expo. It may not be news, but I haven’t linked this video from here before, and it’s probably still timely:

Continue reading

Projection and Distortion in Wide-FoV HMDs

There is an on-going, but already highly successful, Kickstarter campaign for a new VR head-mounted display with a wide (200°) field of view (FoV): Pimax 8k. As I have not personally tried this headset — only its little brother, Pimax 4k, at the 2017 SVVR Expo — I cannot discuss and evaluate all the campaign’s promises. Instead, I want to focus on one particular issue that’s causing a bit of confusion and controversy at the moment.

Early reviewers of Pimax 8k prototypes noticed geometric distortion, such as virtual objects not appearing in the correct places and shifting under head movement, and the campaign responded by claiming that these distortions “could be fixed by improved software or algorithms” (paraphrased). The ensuing speculation about the causes of, and potential fixes for, this distortion has mostly been based on wrong assumptions and misunderstandings of how geometric projection for wide-FoV VR headsets is supposed to work. Adding fuel to the fire, the campaign released a frame showing “what is actually rendered to the screen” (see Figure 1), causing further confusion. The problem is that the frame looks obviously distorted, but that this obvious distortion is not what the reviewers were complaining about. On the contrary, this is what a frame rendered to a high-FoV VR headset should look like. At least, if one ignores lenses and lens distortion, which is what I will continue to do for now.

Figure 1: Frame as rendered to one of the Pimax 8k’s screens, according to the Kickstarter campaign. (Probably not 100% true, as this appears to be a frame submitted to SteamVR’s compositor, which subsequently applies lens distortion correction.)
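To see why a geometrically correct wide-FoV frame has to look stretched when viewed flat on a monitor: a planar perspective projection places a ray at off-axis angle θ at an image-plane coordinate proportional to tan θ, so the on-screen distance covered by one degree of visual angle grows as 1/cos²θ towards the edges. Here is a minimal sketch of that relationship; the angles and numbers are purely illustrative and not tied to the Pimax 8k or any other specific headset:

```cpp
// Minimal sketch: screen position and local stretch of a planar perspective
// projection as a function of off-axis viewing angle. Purely illustrative; the
// numbers are not tied to the Pimax 8k or any other specific headset.
#include <cmath>
#include <cstdio>

int main() {
    const double pi = 3.14159265358979323846;
    for (double thetaDeg = 0.0; thetaDeg <= 80.0; thetaDeg += 10.0) {
        double theta = thetaDeg * pi / 180.0;
        double screenX = std::tan(theta);  // position on a unit-distance image plane
        double stretch = 1.0 / (std::cos(theta) * std::cos(theta));  // screen distance per degree, relative to the center
        std::printf("theta = %4.0f deg   screen x = %6.3f   stretch vs. center = %7.3f\n",
                    thetaDeg, screenX, stretch);
    }
    return 0;
}
```

Since tan θ diverges as θ approaches 90°, a single planar projection cannot even cover a 180° field of view, let alone 200°, which is part of what the full article gets into.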

Continue reading

3D Camera Calibration for Mixed-Reality Recording

Mixed-reality recording, i.e., capturing a user inside of and interacting with a virtual 3D environment by embedding their real body into that virtual environment, has finally become the accepted method of demonstrating virtual reality applications through standard 2D video footage (see Figure 1 for a mixed-reality recording made in VR’s stone age). The fundamental method behind this recording technique is to create a virtual camera whose intrinsic parameters (focal length, lens distortion, …) and extrinsic parameters (position and orientation in space) exactly match those of the real camera used to film the user; to capture a virtual video stream from that virtual camera; and then to composite the virtual and real streams into a final video.

Figure 1: Ancient mixed-reality recording from inside a CAVE, captured directly on a standard video camera without any post-processing.
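To make the “matching intrinsic parameters” part concrete, here is a minimal sketch (not Vrui’s or SteamVR’s actual API) of one common way to turn a real camera’s pinhole intrinsics, i.e., focal lengths and principal point in pixels as produced by a typical camera calibration, into an OpenGL-style projection matrix for the virtual camera. The convention below assumes the principal point is measured from the lower-left image corner, and the numbers in main() are hypothetical placeholders rather than an actual calibration result:

```cpp
// Minimal sketch (not Vrui's or SteamVR's actual API): building an OpenGL-style,
// column-major projection matrix for the virtual camera from the real camera's
// pinhole intrinsics. Lens distortion and the extrinsic (pose) part are omitted.
#include <array>
#include <cstdio>

using Mat4 = std::array<double, 16>;  // column-major, element (row, col) = [col * 4 + row]

Mat4 projectionFromIntrinsics(double fx, double fy,  // focal lengths in pixels
                              double cx, double cy,  // principal point in pixels, from the lower-left corner
                              double width, double height,
                              double zNear, double zFar) {
    Mat4 p{};  // all zeros
    p[0]  = 2.0 * fx / width;
    p[5]  = 2.0 * fy / height;
    p[8]  = 1.0 - 2.0 * cx / width;            // horizontal principal-point offset
    p[9]  = 1.0 - 2.0 * cy / height;           // vertical principal-point offset
    p[10] = -(zFar + zNear) / (zFar - zNear);  // depth range mapping
    p[11] = -1.0;                              // perspective divide
    p[14] = -2.0 * zFar * zNear / (zFar - zNear);
    return p;
}

int main() {
    // Hypothetical numbers standing in for an actual calibration result:
    Mat4 p = projectionFromIntrinsics(1400.0, 1400.0, 960.0, 540.0, 1920.0, 1080.0, 0.1, 100.0);
    for (int row = 0; row < 4; ++row)
        std::printf("%9.4f %9.4f %9.4f %9.4f\n", p[row], p[row + 4], p[row + 8], p[row + 12]);
    return 0;
}
```

The extrinsic half is analogous: the virtual camera’s model-view matrix is set from the real camera’s measured position and orientation, and a virtual frame rendered with both matrices then lines up with the real footage, ready for compositing.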

Continue reading

Accommodation and Vergence in Head-mounted Displays

Why do virtual objects close to my face appear blurry when wearing a VR headset? My vision is fine!

And why does the real world look strange immediately after a long VR session?

These are another two (related ones) of those frequently asked questions about VR and head-mounted displays (HMDs) that I promised to address a while back.

Here’s the short answer: In all currently-available HMDs, the screens creating the virtual imagery are at a fixed optical distance from the user. But our eyes have evolved to automatically adjust their optical focus based on the perceived distance to objects, virtual or real, that they are looking at. So when a virtual object appears to be mere inches in front of the user’s face, but the screens showing images of that object are — optically — several meters away, the user’s eyes will focus on the wrong distance, and as a result, the virtual object will appear blurry (the same happens, albeit less pronounced, when a virtual object appears to be very far away). This effect is called accommodation-vergence conflict, and besides being a nuisance, it can also cause eye strain or headaches during prolonged VR sessions, and can cause vision problems for a short while after such sessions.
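To put rough numbers on that “wrong distance,” the mismatch can be expressed in diopters (the reciprocal of distance in meters), which is how focus demand is usually quantified. The sketch below assumes a 2 m optical screen distance and a 63 mm inter-pupillary distance purely for illustration; neither number is the spec of any particular headset:

```cpp
// Sketch of the accommodation-vergence mismatch, in diopters, for virtual objects
// at various distances when the screens sit at a fixed optical distance. The 2 m
// focal distance and 63 mm IPD are illustrative assumptions, not real HMD specs.
#include <cmath>
#include <cstdio>

int main() {
    const double pi = 3.14159265358979323846;
    const double focalDistance = 2.0;  // assumed optical distance of the screens' virtual image, in meters
    const double ipd = 0.063;          // assumed inter-pupillary distance, in meters

    const double objectDistances[] = {0.25, 0.5, 1.0, 2.0, 10.0};
    for (double d : objectDistances) {
        // Vergence: both eyes rotate inward to aim at the virtual object at distance d.
        double vergenceAngleDeg = 2.0 * std::atan2(ipd / 2.0, d) * 180.0 / pi;
        // Accommodation: the eyes want to focus at d, but the optics force focus at focalDistance.
        double mismatchDiopters = std::fabs(1.0 / d - 1.0 / focalDistance);
        std::printf("object at %5.2f m: vergence angle %5.2f deg, focus mismatch %4.2f D\n",
                    d, vergenceAngleDeg, mismatchDiopters);
    }
    return 0;
}
```

The output matches the short answer: the conflict is largest for virtual objects right in front of the face and comparatively mild for distant ones.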

Now for the long answer.

Continue reading

Boom, Headshot

There was an article on Medium yesterday: My First Virtual Reality Groping. In it, a first-time VR/HTC Vive user describes how she was virtually groped by another player inside an online multi-player VR game, within the first three minutes of her first such endeavor, and how it ruined her experience and deeply disturbed her.

I do not know what to call player “BigBro442’s” behavior, but I do know that it is highly inappropriate, and toxic for VR as a whole. This, people, is why we can’t have nice things. This is far from the first instance of virtual harassment or VR griefing that I’ve heard of, but it’s the one that got me thinking, because of this comment on the article:

This is reality. The best we can do is educate, starting with articles like this.

No. That is not true. We can do better than that. Unlike reality, where someone might be assaulted inside their own home, or in some dark back alley, with no witnesses around or evidence left behind, this is virtual reality, which only exists as a sequence of ones and zeros on some Internet server. That server has absolute knowledge of anything that goes on anywhere inside the virtual world it maintains, like an omniscient Big Brother. If virtual harassment happens in virtual reality, maybe virtual reality needs virtual courts.

Here is a not-so-modest proposal, off the top of my head, using SteamVR/Steam as example platforms:

  • Any server maintaining a virtual world potentially used by more than one person at the same time keeps a ring buffer of each connected user’s avatar state for the last, say, five minutes. That’s not overly demanding: sampling a head tracker and two hand trackers at, say, 30 Hz, over five minutes, results in approx. 750 kB of data total, per user (see the sketch after this list).
  • The client user interface of any shared virtual environment contains a button in some easily accessible standard place, say in SteamVR’s overlay, to file a harassment complaint.
  • If a user (“Alice”) files a complaint, several things happen. Most importantly, the server immediately dumps the avatar state ring buffers of all connected (or recently connected) players to a file. Second, Alice is immediately charged a small fee, say $5, on the credit card associated with her Steam account. This is a micro-transaction, an existing Steam feature. The fee’s purpose is to discourage another form of harassment, namely filing frivolous complaints against innocent users.
  • Files generated by complaints, with personally identifying information redacted, will be reviewed by a peer group of humans. This might be done by appointed moderators, or might even be crowd-sourced.
  • If review determines that behavior contained in the 5-minute replay violates community standards, Alice will be refunded the fee she was charged, and offending user Bob’s Steam account will be temporarily suspended, say for one day on the first offense, starting either immediately or the next time Bob attempts to log in. And I mean Bob’s entire Steam account is suspended, not just his access to one particular server or shared VR application: Bob’s on time-out and can go read a book.
  • If review determines that the complaint was without merit, nothing happens to accused user Bob, and Alice is not refunded her fee. If Alice disagrees, she can raise the stakes by re-filing the same complaint for another $5 fee, the total $10 then being refundable or not, etc.
  • If review cannot reach agreement, or review does not happen within a reasonable time frame, Alice is refunded her fee.
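To back up the “not overly demanding” claim in the first bullet, here is a minimal sketch of such a per-user ring buffer. The per-sample layout (a position and a quaternion orientation for the head and each hand controller) is my own assumption; with it, three tracked devices sampled together at 30 Hz for five minutes come out to roughly 750 kB per user, matching the ballpark above:

```cpp
// Minimal sketch of a per-user avatar-state ring buffer covering the last five
// minutes, as proposed in the first bullet above. The per-sample layout (position
// plus quaternion orientation per tracked device) is an assumption made here to
// reproduce the ~750 kB ballpark; it is not taken from any existing server.
#include <cstddef>
#include <cstdio>
#include <vector>

struct DevicePose {
    float position[3];     // device position in meters
    float orientation[4];  // device orientation as a quaternion
};

struct AvatarSample {
    DevicePose head, leftHand, rightHand;  // one 30 Hz snapshot of a user's avatar
};

class AvatarRingBuffer {
public:
    AvatarRingBuffer(double seconds, double sampleRateHz)
        : samples_(static_cast<std::size_t>(seconds * sampleRateHz)), next_(0), filled_(false) {}

    // Called once per sample period with the user's current tracker poses.
    void push(const AvatarSample& s) {
        samples_[next_] = s;
        next_ = (next_ + 1) % samples_.size();
        if (next_ == 0) filled_ = true;
    }

    // Returns the buffered samples, oldest first, e.g. when a complaint is filed.
    std::vector<AvatarSample> dump() const {
        std::size_t count = filled_ ? samples_.size() : next_;
        std::size_t start = filled_ ? next_ : 0;
        std::vector<AvatarSample> out;
        for (std::size_t i = 0; i < count; ++i)
            out.push_back(samples_[(start + i) % samples_.size()]);
        return out;
    }

    std::size_t byteSize() const { return samples_.size() * sizeof(AvatarSample); }

private:
    std::vector<AvatarSample> samples_;
    std::size_t next_;
    bool filled_;
};

int main() {
    AvatarRingBuffer buffer(5.0 * 60.0, 30.0);  // five minutes at 30 Hz
    std::printf("per-user buffer: %.0f kB for five minutes of head + both hands at 30 Hz\n",
                buffer.byteSize() / 1000.0);
    return 0;
}
```

Dumping the buffer when a complaint is filed then amounts to serializing the contents returned by dump() to a file on the server.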

Okay, so this is ridiculous, right? Not from a technical feasibility point of view, which I think I laid out above, but from an organizational and cultural point of view. One might say that it is a severe regulatory overreach, a violation of the freedom and the very fundamental principles of online gaming, and that the idea of community review is ludicrous on the face of it.

Well, I might have agreed — until recently, that is, when I stumbled across this. Holy Moly! What’s that? Multi-player game servers retaining state data of all players, which can be dumped to a permanent file as evidence for later peer review by a number of appointed or self-appointed judges, with crowd-sourced verdicts and suspensions or bans handed out to cheaters, and judges being rewarded or punished for good or bad judgment? And it works?

If cheating in Counter-Strike is a big enough deal to create a system like this, would it be so outrageous to apply the same basic idea to harassment in shared virtual reality, which, due to VR’s strong sense of immersion and presence, arguably has a larger negative impact on the harassed than losing a round of CS?

Discuss.

Technology Transfer

I found out today that HTC now ships a tool to measure users’ inter-pupillary distances with new Vive VR headsets. When I say “tool,” I mean a booklet with instructions in many languages, and a ruler printed along one edge of each page:

Figure 1: IPD measurement chart shipped by HTC with new Vives. Image courtesy of reddit user DanielDC88, image source.

I thought this was great on multiple levels. For one, measuring the user’s IPD and entering it into the VR software, either manually or through a sensor on a physical IPD adjustment knob or slider on the headset, as in both Vive and Oculus Rift, is an important component of creating convincing VR displays. The more people get used to that, the better.
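To show where that number ends up, here is a deliberately stripped-down sketch of how a configured IPD typically enters stereo rendering: each eye’s viewpoint is offset by half the IPD to either side of the tracked head position before the per-eye views are rendered. Real VR runtimes hand applications complete per-eye transforms instead, so treat this as an illustration of the principle, not Vive’s or Vrui’s actual code:

```cpp
// Simplified sketch: deriving per-eye viewpoints from a tracked head position and a
// configured IPD. Real VR runtimes provide complete eye-to-head transforms instead;
// this only illustrates why the IPD value matters for correct scale and depth.
#include <cstdio>

struct Vec3 { double x, y, z; };

// Offset the head position by +/- half the IPD along the head's "right" direction.
Vec3 eyePosition(const Vec3& headPos, const Vec3& headRight, double ipdMeters, bool leftEye) {
    double offset = (leftEye ? -0.5 : 0.5) * ipdMeters;
    return { headPos.x + headRight.x * offset,
             headPos.y + headRight.y * offset,
             headPos.z + headRight.z * offset };
}

int main() {
    Vec3 headPos   = {0.0, 1.7, 0.0};  // hypothetical head position, 1.7 m above the floor
    Vec3 headRight = {1.0, 0.0, 0.0};  // hypothetical head orientation, facing down -z
    double ipd = 0.0635;               // a measured IPD of 63.5 mm, entered by the user

    Vec3 l = eyePosition(headPos, headRight, ipd, true);
    Vec3 r = eyePosition(headPos, headRight, ipd, false);
    std::printf("left eye : (%.4f, %.4f, %.4f)\n", l.x, l.y, l.z);
    std::printf("right eye: (%.4f, %.4f, %.4f)\n", r.x, r.y, r.z);
    return 0;
}
```

If the configured IPD is wrong, those two viewpoints are too close together or too far apart, and the whole virtual world appears subtly too large or too small.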

On the second level, I was proud. On April 9, 2014, I wrote an article on this here blog titled “How to Measure Your IPD,” which describes this exact method of using a mirror and a ruler. It even became one of my more popular articles (the fifth most popular article, actually, with 33,952 views as of today). I was a little less proud when I looked at my own article again just now, and realized that my diagrams were absolutely hideous compared to those in HTC’s booklet. Oh well. Continue reading

Keeping VR users from hurting themselves

Just the other day, I jumped on the wayback machine and posted an article about our work in immersive tele-collaboration, featuring research (and a video) from about four years ago. The shame! I figured it would be excusable that one time, and I would never do it again. Oh well, here we go.

Keeping VR users from hurting themselves

… or their expensive VR equipment.

It’s a pretty big deal. Virtual Reality, especially its head-mounted implementation, is quite good at overriding its users’ sense of place and space. “Presence,” or the feeling of bodily being in a place where one knows one is not, is a powerful and compelling experience, but it has a downside: users experiencing it lose touch with their real physical environments. Exhibit A: Figure 1 (granted, there are some concerns that the following video clip was staged, but let’s pretend it’s for reals).

Figure 1: When instinct takes over. Source: imgur

To prevent this kind of thing from happening — at least in most cases — Valve implemented a system called “Chaperone” into the SteamVR run-time framework that runs their and HTC’s Vive VR headset (and potentially other headsets, through Valve’s OpenVR layer). Continue reading
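As a rough illustration of the underlying idea (and emphatically not Valve’s actual implementation; real Chaperone bounds are drawn by the user during room setup and need not be rectangular), a chaperone-style system boils down to checking the tracked headset and controller positions against the configured play area every frame and fading in a warning once anything gets too close. A minimal sketch for an axis-aligned rectangular play area:

```cpp
// Rough illustration of a chaperone-style proximity warning for an axis-aligned,
// rectangular play area. Not Valve's actual Chaperone code; real bounds are drawn
// by the user during room setup, and the warning is rendered as a grid in-headset.
#include <algorithm>
#include <cmath>
#include <cstdio>

struct PlayArea {
    double halfWidth;  // half extent along x, in meters
    double halfDepth;  // half extent along z, in meters
};

// Distance from a tracked device at (x, z) to the nearest wall; negative means outside.
double distanceToBoundary(const PlayArea& area, double x, double z) {
    return std::min(area.halfWidth - std::fabs(x), area.halfDepth - std::fabs(z));
}

// Warning opacity: 0 when well inside, ramping up to 1 at (or beyond) the boundary.
double warningOpacity(double distance, double fadeStart) {
    double t = distance / fadeStart;
    if (t < 0.0) t = 0.0;
    if (t > 1.0) t = 1.0;
    return 1.0 - t;
}

int main() {
    PlayArea area = {1.5, 2.0};  // hypothetical 3 m x 4 m play area
    const double devicePositions[][2] = {{0.0, 0.0}, {1.0, 0.5}, {1.4, 1.9}, {1.6, 0.0}};
    for (const auto& p : devicePositions) {
        double d = distanceToBoundary(area, p[0], p[1]);
        std::printf("device at (%4.1f, %4.1f): %5.2f m from boundary, warning opacity %.2f\n",
                    p[0], p[1], d, warningOpacity(d, 0.75));
    }
    return 0;
}
```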