How to Track Glowing Balls in 3D

Note: I started writing this article in June 2017, because people kept asking me about details of the PS Move tracking algorithm I implemented for the video in Figure 1. But I never finished it because I couldn’t find the time to do all the measurements needed for a thorough error analysis, and also because the derivation of the linear system at the core of the algorithm really needed some explanatory diagrams, and those take a lot of work. So the article stayed on the shelf. I’m finally publishing it today, without error analysis or diagrams, because people are still asking me about details of the algorithm, more than four years after I published the video. 🙂

This one is long overdue. Back in 2015, on September 30th to be precise, I uploaded a video showing preliminary results from a surprisingly robust optical 3D tracking algorithm I had cooked up specifically to track PS Move controllers using a standard webcam (see Figure 1).

Figure 1: A video showing my PS Move tracking algorithm, and my surprised face.

During discussion of that video, I promised to write up the algorithm I used, and to release source code. But, as sometimes happens, I didn’t do either. I was just reminded of that by an email I received from one of the PS Move API developers. So, almost two years late, here is a description of the algorithm. Given that PSVR is now being sold in stores, that PS Move controllers are more widespread than ever, and that the algorithm is interesting in its own right, it might still be useful. Continue reading

3D Camera Calibration for Mixed-Reality Recording

Mixed-reality recording, i.e., capturing a user inside of, and interacting with, a virtual 3D environment by embedding their real body into that virtual environment, has finally become the accepted method of demonstrating virtual reality applications through standard 2D video footage (see Figure 1 for a mixed-reality recording made in VR’s stone age). The fundamental method behind this recording technique is to create a virtual camera whose intrinsic parameters (focal length, lens distortion, …) and extrinsic parameters (position and orientation in space) exactly match those of the real camera used to film the user; to capture a virtual video stream from that virtual camera; and then to composite the virtual and real streams into a final video.

Figure 1: Ancient mixed-reality recording from inside a CAVE, captured directly on a standard video camera without any post-processing.
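
To make the “exactly match” part a bit more concrete, here is a minimal sketch (my own illustration, not code from any actual recording pipeline) of how a real camera’s calibrated intrinsics could be turned into an OpenGL-style projection matrix for the virtual camera. fx, fy, cx, and cy are the focal lengths and principal point in pixels, as reported by a standard camera calibration tool; the function name and frustum conventions are my own assumptions:

void intrinsicsToProjection(double fx,double fy,double cx,double cy,
                            double width,double height,
                            double zNear,double zFar,double m[16])
	{
	/* Off-axis view frustum on the near plane, derived from the calibrated principal point
	   (image y axis flipped so that it points up, as OpenGL expects): */
	double l=-cx*zNear/fx;
	double r=(width-cx)*zNear/fx;
	double b=-(height-cy)*zNear/fy;
	double t=cy*zNear/fy;
	
	/* Fill a column-major OpenGL projection matrix, glFrustum-style: */
	for(int i=0;i<16;++i)
		m[i]=0.0;
	m[0]=2.0*zNear/(r-l);
	m[5]=2.0*zNear/(t-b);
	m[8]=(r+l)/(r-l);
	m[9]=(t+b)/(t-b);
	m[10]=-(zFar+zNear)/(zFar-zNear);
	m[11]=-1.0;
	m[14]=-2.0*zFar*zNear/(zFar-zNear);
	}

Lens distortion cannot be expressed as a projection matrix and has to be handled separately, typically by undistorting the real video frames before compositing; the extrinsic parameters go into the model-view matrix as usual.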

Continue reading

Keeping VR users from hurting themselves

Just the other day, I jumped on the wayback machine and posted an article about our work in immersive tele-collaboration, featuring research (and a video) from about four years ago. The shame! I figured it would be excusable that one time, and I would never do it again. Oh well, here we go.

Keeping VR users from hurting themselves

… or their expensive VR equipment.

It’s a pretty big deal. Virtual Reality, especially its head-mounted implementation, is quite good at overriding its users’ sense of place and space. “Presence,” or the feeling of bodily being in a place where one knows one is not, is a powerful and compelling experience, but it has a downside: users experiencing it lose touch with their real physical environments. Exhibit A: Figure 1 (granted, there are some concerns that the following video clip was staged, but let’s pretend it’s for reals).

Figure 1: When instinct takes over. Source: imgur

To prevent this kind of thing from happening — at least in most cases — Valve implemented a system called “Chaperone” into the SteamVR run-time framework that runs their and HTC’s Vive VR headset (and potentially other headsets, through Valve’s OpenVR layer). Continue reading

Lighthouse tracking examined

To my surprise and delight, I recently found out that Valve has been releasing Linux versions of most of their SteamVR/OpenVR run-time/SDK for a while now (OpenVR just hit version 1.0.0, go get it while it’s fresh). This is great news: it will allow me to port Vrui and all Vrui applications to the Vive headset and its tracked controllers in one fell swoop.

But before diving into developing a Lighthouse tracking driver plug-in for Vrui’s input device abstraction layer, I decided to cobble together a small testing utility to get a feel for OpenVR’s internal driver interface, and for the Lighthouse tracking system’s overall tracking quality.

Figure 1: The Lighthouse 6-DOF tracking system, disassembled (source).

Continue reading

Oculus Rift DK2’s tracking update rate

I’ve been involved in some arguments about the inner workings of the Oculus Rift’s and HTC/Valve Vive’s tracking systems recently, and while I don’t want to get into any of that right now, I just did a little experiment.

The tracking update rate of the Oculus Rift DK2, meaning the rate at which Oculus’ tracking driver sends different position/orientation estimates to VR applications, is 1000 Hz. However, the updates arrive in pairs spaced 2ms apart: the driver updates the position/orientation, then updates it again almost immediately afterwards, and does so 500 times per second.

This is not surprising at all, given my earlier observation that the DK2 samples its internal IMU at a rate of 1000 Hz, and sends data packets containing 2 IMU samples each to the host at a rate of 500 Hz. The tracking driver is then kind enough to process these samples individually, and pass updated tracking data to applications after it’s done processing each one. That second part is maybe a bit superfluous, but I’ll take it.

Here is a (very short excerpt of a) dump from the test application I wrote:

0.00199484: -0.0697729, -0.109664, -0.458555
6.645e-06 : -0.0698003, -0.110708, -0.458532
0.00199313: -0.069828 , -0.111758, -0.45851
6.012e-06 : -0.0698561, -0.112813, -0.458488
0.00200075: -0.0698847, -0.113875, -0.458466
6.649e-06 : -0.0699138, -0.114943, -0.458445
0.0019885 : -0.0699434, -0.116022, -0.458427
5.915e-06 : -0.0699734, -0.117106, -0.45841
0.0020142 : -0.070004 , -0.118196, -0.458393
5.791e-06 : -0.0700351, -0.119291, -0.458377
0.00199589: -0.0700668, -0.120392, -0.458361
6.719e-06 : -0.070099 , -0.121499, -0.458345
0.00197487: -0.0701317, -0.12261 , -0.45833
6.13e-06  : -0.0701651, -0.123727, -0.458314
0.00301248: -0.0701991, -0.124849, -0.458299
5.956e-06 : -0.0702338, -0.125975, -0.458284
0.00099399: -0.0702693, -0.127107, -0.458269
5.971e-06 : -0.0703054, -0.128243, -0.458253
0.0019938 : -0.0703423, -0.129384, -0.458238
5.938e-06 : -0.0703799, -0.130529, -0.458223
0.00200243: -0.0704184, -0.131679, -0.458207
7.434e-06 : -0.0704576, -0.132833, -0.458191
0.0019831 : -0.0704966, -0.133994, -0.458179
5.957e-06 : -0.0705364, -0.135159, -0.458166
0.00199577: -0.0705771, -0.136328, -0.458154
5.974e-06 : -0.0706185, -0.137501, -0.458141

The first column is the time interval between each row and the previous row, in seconds. The second to fourth columns are the reported (x, y, z) position of the headset.
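
The test program itself is nothing fancy: poll the driver as fast as possible, and print a line whenever the reported position changes. Here is a minimal sketch of that idea (not my actual code; Position and pollHeadPose are placeholders for whatever type and call return the driver’s most recent tracking estimate):

#include <cstdio>
#include <chrono>

struct Position
	{
	double x,y,z;
	};

/* Poll the given tracking function in a tight loop and print the time between
   successive changes of the reported position, followed by the new position
   (same format as the dump above): */
template <class PollFunction>
void logTrackingIntervals(PollFunction&& pollHeadPose)
	{
	typedef std::chrono::steady_clock Clock;
	Position last=pollHeadPose();
	Clock::time_point lastTime=Clock::now();
	for(;;)
		{
		Position current=pollHeadPose();
		if(current.x!=last.x||current.y!=last.y||current.z!=last.z)
			{
			Clock::time_point now=Clock::now();
			double interval=std::chrono::duration<double>(now-lastTime).count();
			std::printf("%g: %g, %g, %g\n",interval,current.x,current.y,current.z);
			last=current;
			lastTime=now;
			}
		}
	}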

I hope this puts to rest the myth that the DK2 only updates its tracking data when it receives a new frame from the tracking camera, which happens only 60 times per second, and confirms that the DK2’s tracking is based on dead reckoning with drift correction. Now, while it is possible that the commercial version of the Rift does things differently, I don’t see a reason why it should.

PS: If you look closely, you’ll notice an outlier in rows 15 and 17: the first interval is 3ms, and the second interval is only 1ms. One sample missed the 1000 Hz sample clock, and was delivered on the next update.

On the road for VR: Redwood City, California

Last Friday I made a trek down to the San Francisco peninsula, to visit and chat with a few other VR folks: Cyberith, SVVR, and AltspaceVR. In the process, I also had the chance to try a couple of VR devices I hadn’t seen before.

Cyberith Virtualizer

Virtual locomotion, and its nasty side effect, simulator sickness, are a pretty persistent problem and a timely topic now that consumer VR is just around the corner. Many enthusiasts want to use VR to explore large virtual worlds, as in taking a stroll through the frozen tundra of Skyrim or the irradiated wasteland of Fallout, but as it turns out, that’s one of the hardest things to do right in VR.

Figure 1: Cyberith Virtualizer, driven by an experienced user (Tuncay Cakmak). Yes, you can jump and run, with some practice.

Continue reading

On the road for VR: Microsoft HoloLens at Build 2015, San Francisco

I have briefly mentioned HoloLens, Microsoft’s upcoming see-through Augmented Reality headset, in a previous post, but today I got the chance to try it for myself at Microsoft’s “Build 2015” developers’ conference. Before we get into the nitty-gritty, a disclosure: Microsoft invited me to attend Build 2015, meaning they waived my registration fee, and they gave me, like all other attendees, a free HP Spectre x360 notebook (on which I’m typing right now because my vintage 2008 MacBook Pro finally kicked the bucket). On the downside, I had to take Amtrak and BART to downtown San Francisco twice, because I wasn’t able to get a one-on-one demo slot on the first day, and got today’s 10am slot after some finagling and calling in of favors. I guess that makes us even. 😛

So, on to the big question: is HoloLens real? Given Microsoft’s track record with product announcements (see 2009’s Project Natal trailer and especially the infamous Milo “demo”), there was some well-deserved skepticism regarding the HoloLens teaser released in January, and even the on-stage demo that was part of the Build 2015 keynote:

The short answer is: yes, it’s real, but… Continue reading

Hacking the Oculus Rift DK2, part IV

Note: This is part 4 of a four-part series. [Part 1] [Part 2] [Part 3]

I caved and uploaded a snapshot of the current optical tracking sources, including a pre-release snapshot of upcoming Vrui-3.2-001 (please don’t use it outside of the tracking project; it’s bound to change some before it’s really released), to github: http://github.com/Doc-Ok/OpticalTracking.

Update on Vrui / Oculus Rift DK2

I’ve been getting a lot of questions about using the Rift DK2 under Linux with Vrui recently, so I figured I’d post a little progress report here instead of answering them individually.

The good news is that I have the DK2 working to the level of the DK1, i.e., I have orientational tracking, lens distortion correction, and chromatic aberration correction. I also have low persistence, but that came for free.

What I don’t have, and most probably won’t have until an official Linux SDK drops, is positional tracking. In order to replicate the work a team of computer vision experts at Oculus have been doing for the last year or so, I’d need a few clones and a time machine. That said, I am working on combining the DK1/DK2’s built-in IMU with other external tracking systems, such as Intersense IS-900 or NaturalPoint OptiTrack. That’s a much easier (but still tricky) problem, and would allow using the Rift as a headset for large-area VR. Probably not interesting for home users, but being able to walk around freely in an 18’x10’x7′ volume opens up entirely different VR applications.
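
To give an idea of what that combination involves, here is a heavily simplified position-only sketch (a toy example of my own, not the actual Vrui code): dead reckoning integrates the IMU’s gravity-compensated accelerometer at full rate, and every time the external tracker delivers a slower, but drift-free, absolute position, the dead-reckoned estimate gets nudged towards it:

struct Vec3
	{
	double x,y,z;
	};

/* Toy position filter: dead reckoning on IMU accelerometer samples, with
   drift correction from an external (optical or ultrasonic) tracker: */
class FusedPosition
	{
	private:
	Vec3 pos; // Current position estimate in meters
	Vec3 vel; // Current velocity estimate in m/s
	double gain; // How strongly an external sample pulls the estimate (0..1)
	
	public:
	FusedPosition(const Vec3& initialPos)
		:pos(initialPos),vel{0.0,0.0,0.0},gain(0.1)
		{
		}
	
	/* Called at IMU rate (e.g., 1000 Hz) with gravity-compensated linear
	   acceleration in m/s^2 and the time step in seconds: */
	void integrateImu(const Vec3& accel,double dt)
		{
		vel.x+=accel.x*dt; vel.y+=accel.y*dt; vel.z+=accel.z*dt;
		pos.x+=vel.x*dt; pos.y+=vel.y*dt; pos.z+=vel.z*dt;
		}
	
	/* Called at the external tracker's rate (e.g., 60 Hz) with an absolute
	   position measurement; pulls the dead-reckoned estimate towards it
	   instead of snapping, to avoid visible jumps: */
	void correct(const Vec3& measured)
		{
		pos.x+=(measured.x-pos.x)*gain;
		pos.y+=(measured.y-pos.y)*gain;
		pos.z+=(measured.z-pos.z)*gain;
		}
	
	const Vec3& current(void) const
		{
		return pos;
		}
	};

A real implementation also has to correct the velocity estimate, rotate the accelerometer samples into the tracker’s coordinate frame using the IMU’s orientation, and weigh the correction by the relative noise of the two sources, but the basic structure is the same.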

I’m currently working hard on the next release of the Vrui toolkit (version 3.2-001), which will have at least the level of DK2 support that I have internally now (combined tracking might or might not make it, but that can already be faked, see 3D Video Capture With Three Kinects).

The reason why I’m not releasing right now is that I’m still trying to optimize the “user experience” by integrating the ideas I described in A Trip Down the Graphics Pipeline. The idea is that plugging in a Rift and starting a Vrui application should just work. I have most of that going; the only issue is telling OpenGL to sync to the vertical retrace on the Rift’s display, no matter what. Right now that can only be done via an environment variable, and I’m looking for the right place in Vrui to set that variable from inside a program. It’s a work-around until Nvidia exposes that functionality via their NV-CONTROL X extension, or, even better, via a GLX extension (are you listening, Nvidia?). Or, why not change the implementation of GLX_SGI_video_sync, which is already bound to a display and drawable, such that it always syncs to the first video controller servicing that drawable? Wouldn’t even require a specification change. Just an idea.
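
For the curious, the work-around boils down to setting the relevant driver variables from inside the program before the first OpenGL context is created. A sketch of what I mean, using Nvidia’s __GL_SYNC_TO_VBLANK and __GL_SYNC_DISPLAY_DEVICE Linux driver settings (the "DFP-1" device name is just an example; the real name depends on how the Rift is connected):

#include <stdlib.h>

/* Force the Nvidia driver to sync buffer swaps to the Rift's display, no
   matter which screen the window ends up on; must run before the first
   OpenGL context is created: */
void forceVsyncToRift(void)
	{
	setenv("__GL_SYNC_TO_VBLANK","1",1); // Always sync to the vertical retrace...
	setenv("__GL_SYNC_DISPLAY_DEVICE","DFP-1",1); // ...of this display device
	}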

And last but not least, once I got the DK2 and its low-persistence screen working, I realized how cavalier I’ve been about low-level timing issues in Vrui. With screen-based VR and LCD-based HMDs it has simply never been an issue before, but now it’s pretty obvious. Good thing is, I think I have a handle on it.

In summary: it’ll be a little bit longer, but I’m on it. Will I be able to release before Oculus does their Linux SDK? Sure hope so! And just in case you think I’ve been sitting on my hands for the last six months: there are already about 300 large and small changes between 3.1-002 and 3.2-001.

And here is today’s unrelated picture:

Figure 1: New adventures in real estate speculation.

On the road for VR: Oculus Connect, Hollywood

After some initial uncertainty, and accidentally raising a stink on reddit, I did manage to attend Oculus Connect last weekend after all. I guess this is what a birthday bash looks like when the feted is backed by Facebook and gets to invite 1200 of his closest friends… and yours truly! It was nice to run into old acquaintances and meet new VR geeks, and it is still an extremely weird feeling to be approached by people who introduce themselves as “fans.” There were talks and panels, but I skipped most of those to take in demos and mingle instead; after all, I can watch a talk on YouTube from home just fine. Oh, and there was also new mobile VR hardware to check out, and a big surprise. Let’s talk VR hardware. Continue reading