Vrui on Oculus Rift DK2

I know, the Oculus Rift DK2 is obsolete equipment, but nonetheless — there are a lot of them still out there, it’s still a decent VR headset for seated applications, I guess they’re getting cheaper on eBay now, and I put in all the work back then to support it in Vrui, so I might as well describe how to use it. If nothing else, the DK2 is a good way to watch DVD movies, or panoramic mono- or stereoscopic videos, in VR.

Figure 1: Using an Oculus Rift DK2 headset with a pair of Vive controllers — because why not?

Continue reading

Boom, Headshot

There was an article on Medium yesterday: My First Virtual Reality Groping. In it, a first-time VR/HTC Vive user describes how she was virtually groped by another player inside an online multi-player VR game, within the first three minutes of her first such endeavor, and how it ruined her experience and deeply disturbed her.

I do not know what to call player “BigBro442’s” behavior, but I do know that it is highly inappropriate, and toxic for VR as a whole. This, people, is why we can’t have nice things. This is far from the first instance of virtual harassment or VR griefing that I’ve heard of, but it’s the one that got me thinking because of this comment on the article:

This is reality. The best we can do is educate, starting with articles like this.

No. That is not true. We can do better than that. Unlike reality, where someone might be assaulted inside their own home, or in some dark back alley, with no witnesses around or evidence left behind, this is virtual reality, which only exists as a sequence of ones and zeros on some Internet server. That server has absolute knowledge of anything that goes on anywhere inside the virtual world it maintains, like an omniscient Big Brother. If virtual harassment happens in virtual reality, maybe virtual reality needs virtual courts.

Here is a not-so-modest proposal, off the top of my head, using SteamVR/Steam as example platforms:

  • Any server maintaining a virtual world potentially used by more than one person at the same time keeps a ring buffer of each connected user’s avatar state for the last, say, five minutes. That’s not overly demanding: sampling a head tracker and two hand trackers at, say, 30 Hz, over five minutes, results in approximately 750 kB of data total, per user (see the sketch after this list).
  • The client user interface of any shared virtual environment contains a button in some easily accessible standard place, say in SteamVR’s overlay, to file a harassment complaint.
  • If a user (“Alice”) files a complaint, several things happen. Most importantly, the server immediately dumps the avatar state ring buffers of all connected (or recently connected) players to a file. Second, Alice is immediately charged a small fee, say $5, on the credit card associated with her Steam account. This is a micro-transaction, an existing Steam feature. The fee’s purpose is to discourage another form of harassment, namely filing frivolous complaints against innocent users.
  • Files generated by complaints, with personally identifying information redacted, will be reviewed by a peer group of humans. This might be done by appointed moderators, or might even be crowd-sourced.
  • If review determines that behavior contained in the 5-minute replay violates community standards, Alice will be refunded the fee she was charged, and offending user Bob’s Steam account will be temporarily suspended, say for one day on the first offense, starting either immediately or the next time Bob attempts to log in. And I mean Bob’s entire Steam account is suspended, not just his access to one particular server or shared VR application: Bob’s on time-out and can go read a book.
  • If review determines that the complaint was without merit, nothing happens to accused user Bob, and Alice is not refunded her fee. If Alice disagrees, she can raise the stakes by re-filing the same complaint for another $5 fee, the total $10 then being refundable or not, etc.
  • If review cannot reach agreement, or review does not happen within a reasonable time frame, Alice is refunded her fee.
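
To put the storage estimate in the first bullet into concrete terms, here is a minimal C++ sketch of what such a per-user ring buffer might look like. The struct layout and class are my own illustration under the assumptions above (three tracked poses at 30 Hz for five minutes), not part of any existing SteamVR feature:

#include <cstddef>
#include <vector>

// One avatar sample: a timestamp plus position and orientation
// (as a unit quaternion) for the head and the two hand controllers.
struct Pose {
    float position[3];
    float orientation[4];
};

struct AvatarSample {
    double timestamp;
    Pose head, leftHand, rightHand;
};

// Per-user ring buffer holding the last five minutes of avatar state
// sampled at 30 Hz, i.e., 30*60*5 = 9000 samples.
class AvatarRingBuffer {
public:
    static const std::size_t capacity = 30*60*5;
    void push(const AvatarSample& sample) {
        if (samples.size() < capacity)
            samples.push_back(sample);
        else
            samples[next] = sample;
        next = (next+1)%capacity;
    }
private:
    std::vector<AvatarSample> samples;
    std::size_t next = 0;
};

// At roughly 96 bytes per sample, 9000 samples come to well under 1 MB
// per user, the same ballpark as the estimate above.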

Okay, so this is ridiculous, right? Not from a technical feasibility point of view, which I think I laid out above, but from an organizational and cultural point of view. One might say that it is a severe regulatory overreach, a violation of the freedom and the very fundamental principles of online gaming, and that the idea of community review is ludicrous on the face of it.

Well, I might have agreed — until recently, that is, when I stumbled across this. Holy Moly! What’s that? Multi-player game servers retaining state data of all players, which can be dumped to a permanent file as evidence for later peer review by a number of appointed or self-appointed judges, with crowd-sourced verdicts and suspensions or bans handed out to cheaters, and judges being rewarded or punished for good or bad judgment? And it works?

If cheating in Counter-Strike is a big enough deal to create a system like this, would it be so outrageous to apply the same basic idea to harassment in shared virtual reality, which, due to VR’s strong sense of immersion and presence, arguably has a larger negative impact on the harassed than losing a round of CS?

Discuss.

Technology Transfer

I found out today that HTC now ships a tool to measure users’ inter-pupillary distances with new Vive VR headsets. When I say “tool,” I mean a booklet with instructions in many languages, and a ruler printed along one edge of each page:

Figure 1: IPD measurement chart shipped by HTC with new Vives. Image courtesy of reddit user DanielDC88, image source.

I thought this was great on multiple levels. For one, measuring the user’s IPD and entering it into the VR software, either manually or through a sensor on a physical IPD adjustment knob or slider on the headset, as in both Vive and Oculus Rift, is an important component of creating convincing VR displays. The more people get used to that, the better.
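
To make concrete what the software does with that number: a renderer derives the two virtual eye positions from the tracked head pose and the configured IPD, roughly like the following sketch (my own illustration, not Vrui’s or SteamVR’s actual code):

struct Vec3 {
    double x, y, z;
};

// Place the two virtual eyes half an IPD to either side of the tracked
// head position, along the head's unit "right" vector. If the configured
// IPD does not match the user's real one, the rendered stereo separation
// is wrong, and perceived depth and scale suffer.
void computeEyePositions(const Vec3& head, const Vec3& right, double ipd,
                         Vec3& leftEye, Vec3& rightEye) {
    double h = ipd*0.5;
    leftEye  = { head.x - right.x*h, head.y - right.y*h, head.z - right.z*h };
    rightEye = { head.x + right.x*h, head.y + right.y*h, head.z + right.z*h };
}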

On the second level, I was proud. On April 9, 2014, I wrote an article on this here blog titled “How to Measure Your IPD,” which describes this exact method of using a mirror and a ruler. It even became one of my more popular articles (the fifth most popular article, actually, with 33,952 views as of today). I was a little less proud when I looked at my own article again just now, and realized that my diagrams were absolutely hideous compared to those in HTC’s booklet. Oh well. Continue reading

Vive la Vrui!

It has been way too long since I last publicly released a new version of the Vrui VR toolkit. The main issue was that I had been chasing evolving hardware, from the Oculus Rift DK1 to the Oculus Rift DK2, and now to HTC’s Vive. During that long stretch of time, I was never happy with the state of support of any of these devices.

That’s finally changed. I have been working on full native support for HTC’s Vive head-mounted display over the last few months (with the first major breakthrough in May), and I think it’s working really well. There are still a lot of improvements to make and sharp edges to sand off, but I feel it is worthwhile releasing the software as it is now to get some early testing done. So without much further ado, here is Vrui-4.2-004.

Figure 1: Vrui’s ClusterJello toy application running on an HTC Vive head-mounted display. Recorded using a second-generation Microsoft Kinect camera (Kinect-for-Xbox-One).
Continue reading

Keeping VR users from hurting themselves

Just the other day, I jumped on the wayback machine and posted an article about our work in immersive tele-collaboration, featuring research (and a video) from about four years ago. The shame! I figured it would be excusable that one time, and I would never do it again. Oh well, here we go.

Keeping VR users from hurting themselves

… or their expensive VR equipment.

It’s a pretty big deal. Virtual Reality, especially its head-mounted implementation, is quite good at overriding its users’ sense of place and space. “Presence,” or the feeling of bodily being in a place where one knows one is not, is a powerful and compelling experience, but it has a downside: users experiencing it lose touch with their real physical environments. Exhibit A: Figure 1 (granted, there are some concerns that the following video clip was staged, but let’s pretend it’s for reals).

Figure 1: When instinct takes over. Source: imgur

To prevent this kind of thing from happening — at least in most cases — Valve implemented a system called “Chaperone” into the SteamVR run-time framework that runs their and HTC’s Vive VR headset (and potentially other headsets, through Valve’s OpenVR layer). Continue reading

Remote Collaborative Immersive Visualization

I spent the last couple of days at the first annual meeting of “The Higher Education Campus Alliance for Advanced Visualization” (THE CAAV), where folks managing or affiliated with advanced visualization centers such as KeckCAVES came together to share their experiences. During the talks, I saw slides showing Vrui’s Collaboration Infrastructure pop up here and there, and generally remote collaboration was a big topic of discussion. During breaks, I showed several people the following video on my smartphone (yes, I finally joined the 21st century), and afterwards realized that I had never written a post about this work, as most of it predates this blog. So here we go.

Continue reading

Lighthouse tracking examined

To my surprise and delight, I recently found out that Valve has been releasing Linux versions of most of their SteamVR/OpenVR run-time/SDK for a while now (OpenVR just hit version 1.0.0, go get it while it’s fresh). This is great news: it will allow me to port Vrui and all Vrui applications to the Vive headset and its tracked controllers in one fell swoop.

But before diving into developing a Lighthouse tracking driver plug-in for Vrui’s input device abstraction layer, I decided to cobble together a small testing utility to get a feel for OpenVR’s internal driver interface, and for the Lighthouse tracking system’s overall tracking quality.
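
As a rough illustration of the kind of small testing utility I mean (this sketch uses OpenVR’s public client API rather than the internal driver interface mentioned above, and assumes SteamVR is already running), one can simply poll the poses of all tracked devices in a loop and log their positions:

#include <cstdio>
#include <chrono>
#include <thread>
#include <openvr.h>

int main() {
    vr::EVRInitError error = vr::VRInitError_None;
    // A background application is enough for reading poses; no rendering needed.
    vr::IVRSystem* system = vr::VR_Init(&error, vr::VRApplication_Background);
    if (error != vr::VRInitError_None) {
        std::fprintf(stderr, "VR_Init failed with error %d\n", int(error));
        return 1;
    }
    vr::TrackedDevicePose_t poses[vr::k_unMaxTrackedDeviceCount];
    for (int i = 0; i < 1000; ++i) {
        // Query the current (non-predicted) poses of all tracked devices:
        system->GetDeviceToAbsoluteTrackingPose(vr::TrackingUniverseStanding, 0.0f,
                                                poses, vr::k_unMaxTrackedDeviceCount);
        for (unsigned int device = 0; device < vr::k_unMaxTrackedDeviceCount; ++device)
            if (poses[device].bPoseIsValid) {
                // The last column of the 3x4 pose matrix is the device position:
                const vr::HmdMatrix34_t& m = poses[device].mDeviceToAbsoluteTracking;
                std::printf("device %u: %f %f %f\n", device, m.m[0][3], m.m[1][3], m.m[2][3]);
            }
        std::this_thread::sleep_for(std::chrono::milliseconds(10));
    }
    vr::VR_Shutdown();
    return 0;
}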

Figure 1: The Lighthouse 6-DOF tracking system, disassembled (source).

Continue reading

On the road for VR: The White House, Washington DC

Through a complex chain of circumstances, we got ourselves invited to demonstrate the Augmented Reality Sandbox at the White House Water Summit on March 22, coinciding with the United Nations’ World Water Day 2016, as part of the National Science Foundation’s presence (NSF funded initial development of the AR Sandbox through an Informal Science Education grant).

Figure 1: Mark I standard-issue AR Sandbox in the Indian Treaty Room in the Eisenhower Executive Office Building, pre-exhibition.

Continue reading

Optical Properties of Current VR HMDs

With the first commercial version of the Oculus Rift (Rift CV1) now trickling out of warehouses, and Rift DK2, HTC Vive DK1, and Vive Pre already being in developers’ hands, it’s time for a more detailed comparison between these head-mounted displays (HMDs). In this article, I will look at these HMDs’ lenses and optics in the most objective way I can, using a calibrated fish-eye camera (see Figures 1, 2, and 3).

Figure 1: Picture from a fisheye camera, showing a checkerboard calibration target displayed on a 30″ LCD monitor.

Figure 2: Same picture as Figure 1, after rectification. The purple lines were drawn into the picture by hand to show the picture’s linearity after rectification.

Figure 3: Rectified picture from Figure 2, re-projected into stereographic projection to simplify measuring angles. Concentric purple circles indicate 5-degree increments away from the projection center point.
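
For reference, the two projections involved can be written compactly (my notation, not from the article), with f the nominal focal length and θ the angle between an incoming ray and the camera’s optical axis:

r_{\mathrm{rectilinear}}(\theta) = f \tan\theta \qquad \text{(rectified image, Figure 2)}

r_{\mathrm{stereographic}}(\theta) = 2 f \tan\frac{\theta}{2} \qquad \text{(reprojection, Figure 3)}

The stereographic radius stays finite even for rays far off-axis, so a wide-angle lens’ entire field of view fits into a single image, and every 5-degree step in θ lands on a circle of easily computed radius, which is what the concentric purple circles in Figure 3 mark.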

Continue reading

Oculus Rift DK2’s tracking update rate

I’ve been involved in some arguments about the inner workings of the Oculus Rift’s and HTC/Valve Vive’s tracking systems recently, and while I don’t want to get into any of that right now, I just did a little experiment.

The tracking update rate of the Oculus Rift DK2, meaning the rate at which Oculus’ tracking driver sends different position/orientation estimates to VR applications, is 1000 Hz. To be precise, the updates arrive in pairs: the driver sends one position/orientation estimate, follows it with a second one almost immediately, and then waits about 2 ms before sending the next pair, 500 times per second.

This is not surprising at all, given my earlier observation that the DK2 samples its internal IMU at a rate of 1000 Hz, and sends data packets containing 2 IMU samples each to the host at a rate of 500 Hz. The tracking driver is then kind enough to process these samples individually, and pass updated tracking data to applications after it’s done processing each one. That second part is maybe a bit superfluous, but I’ll take it.
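
The measurement itself is straightforward; a minimal sketch of the kind of timing loop that produces a dump like the one below might look as follows (waitForTrackingUpdate is a hypothetical stand-in for whatever tracking API is being polled, and this is not the actual test application):

#include <chrono>
#include <cstdio>

// Hypothetical stand-in for the tracking API under test; assumed to block
// until the driver publishes a new position estimate, then return it.
bool waitForTrackingUpdate(double position[3]);

int main() {
    double position[3];
    std::chrono::steady_clock::time_point last = std::chrono::steady_clock::now();
    while (waitForTrackingUpdate(position)) {
        std::chrono::steady_clock::time_point now = std::chrono::steady_clock::now();
        double interval = std::chrono::duration<double>(now - last).count();
        last = now;
        // Print the time since the previous update and the new (x, y, z) position:
        std::printf("%g: %g, %g, %g\n", interval, position[0], position[1], position[2]);
    }
    return 0;
}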

Here is a (very short excerpt of a) dump from the test application I wrote:

0.00199484: -0.0697729, -0.109664, -0.458555
6.645e-06 : -0.0698003, -0.110708, -0.458532
0.00199313: -0.069828 , -0.111758, -0.45851
6.012e-06 : -0.0698561, -0.112813, -0.458488
0.00200075: -0.0698847, -0.113875, -0.458466
6.649e-06 : -0.0699138, -0.114943, -0.458445
0.0019885 : -0.0699434, -0.116022, -0.458427
5.915e-06 : -0.0699734, -0.117106, -0.45841
0.0020142 : -0.070004 , -0.118196, -0.458393
5.791e-06 : -0.0700351, -0.119291, -0.458377
0.00199589: -0.0700668, -0.120392, -0.458361
6.719e-06 : -0.070099 , -0.121499, -0.458345
0.00197487: -0.0701317, -0.12261 , -0.45833
6.13e-06  : -0.0701651, -0.123727, -0.458314
0.00301248: -0.0701991, -0.124849, -0.458299
5.956e-06 : -0.0702338, -0.125975, -0.458284
0.00099399: -0.0702693, -0.127107, -0.458269
5.971e-06 : -0.0703054, -0.128243, -0.458253
0.0019938 : -0.0703423, -0.129384, -0.458238
5.938e-06 : -0.0703799, -0.130529, -0.458223
0.00200243: -0.0704184, -0.131679, -0.458207
7.434e-06 : -0.0704576, -0.132833, -0.458191
0.0019831 : -0.0704966, -0.133994, -0.458179
5.957e-06 : -0.0705364, -0.135159, -0.458166
0.00199577: -0.0705771, -0.136328, -0.458154
5.974e-06 : -0.0706185, -0.137501, -0.458141

The first column is the time interval between each row and the previous row, in seconds. The second through fourth columns are the reported (x, y, z) position of the headset.

I hope this puts the myth to rest that the DK2 only updates its tracking data when it receives a new frame from the tracking camera, which is 60 times per second, and confirms that the DK2’s tracking is based on dead reckoning with drift correction. Now, while it is possible that the commercial version of the Rift does things differently, I don’t see a reason why it should.

PS: If you look closely, you’ll notice an outlier in rows 15 and 17: the first interval is 3ms, and the second interval is only 1ms. One sample missed the 1000 Hz sample clock, and was delivered on the next update.
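
As an aside, to make "dead reckoning with drift correction" concrete, here is a very schematic position-only sketch of the general idea (my own illustration, not Oculus’ actual sensor fusion code): every 1000 Hz IMU sample advances the pose by integration, and every 60 Hz camera fix pulls the integrated position back towards an absolute reference so drift cannot accumulate.

// Schematic dead-reckoning state: position and velocity only. A real tracker
// also integrates gyroscope samples into an orientation, and uses that
// orientation to remove gravity from the accelerometer readings first.
struct TrackerState {
    double position[3];
    double velocity[3];
};

// Advance the state by one IMU sample (linear acceleration with gravity
// already removed), at the IMU's 1000 Hz rate, i.e., dt = 0.001 s:
void integrateImuSample(TrackerState& state, const double accel[3], double dt) {
    for (int i = 0; i < 3; ++i) {
        state.velocity[i] += accel[i]*dt;
        state.position[i] += state.velocity[i]*dt;
    }
}

// Whenever the 60 Hz tracking camera delivers an absolute position fix, pull
// the dead-reckoned position a fraction of the way towards it:
void correctDrift(TrackerState& state, const double cameraPosition[3], double gain) {
    for (int i = 0; i < 3; ++i)
        state.position[i] += gain*(cameraPosition[i] - state.position[i]);
}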