3D Camera Calibration for Mixed-Reality Recording

Mixed-reality recording, i.e., capturing a user inside of, and interacting with, a virtual 3D environment by embedding their real body into that environment, has finally become the accepted method of demonstrating virtual reality applications through standard 2D video footage (see Figure 1 for a mixed-reality recording made in VR’s stone age). The fundamental technique behind such recordings is to create a virtual camera whose intrinsic parameters (focal length, lens distortion, …) and extrinsic parameters (position and orientation in space) exactly match those of the real camera filming the user; to capture a virtual video stream from that virtual camera; and then to composite the virtual and real streams into a final video.
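
Concretely, if the real camera has been calibrated as a pinhole camera, its measured intrinsics map directly to an OpenGL projection matrix for the virtual camera. Here is a minimal sketch; the names and sign conventions are illustrative (a top-left image origin flips the signs of the principal-point terms), and lens distortion is ignored, as it would typically be applied as a post-render warp on the virtual frame:

```cpp
// Sketch: build an OpenGL projection matrix for the virtual camera from
// pinhole intrinsics measured off the real camera. Assumes a bottom-left
// image origin; lens distortion is not modeled here.

#include <array>

using Mat4 = std::array<float, 16>; // column-major, as OpenGL expects

Mat4 projectionFromIntrinsics(float fx, float fy, // focal lengths in pixels
                              float cx, float cy, // principal point in pixels
                              float w, float h,   // image size in pixels
                              float zNear, float zFar)
{
	Mat4 m = {}; // all zeros
	m[0] = 2.0f * fx / w;                     // x scale
	m[5] = 2.0f * fy / h;                     // y scale
	m[8] = 1.0f - 2.0f * cx / w;              // principal point offset in x
	m[9] = 1.0f - 2.0f * cy / h;              // principal point offset in y
	m[10] = -(zFar + zNear) / (zFar - zNear); // depth range mapping
	m[11] = -1.0f;                            // perspective divide by -z
	m[14] = -2.0f * zFar * zNear / (zFar - zNear);
	return m;
}
```

The extrinsic parameters then become the modelview matrix: the inverse of the real camera’s pose in the tracking system’s coordinate frame.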

Figure 1: Ancient mixed-reality recording from inside a CAVE, captured directly on a standard video camera without any post-processing.


Vrui on Oculus Rift DK2

I know, the Oculus Rift DK2 is obsolete equipment, but nonetheless — there are still a lot of them out there, it’s a decent VR headset for seated applications, I guess they’re getting cheaper on eBay now, and I put in all the work back then to support it in Vrui, so I might as well describe how to use it. If nothing else, the DK2 is a good way to watch DVD movies, or panoramic mono- or stereoscopic videos, in VR.

Figure 1: Using an Oculus Rift DK2 headset with a pair of Vive controllers — because why not?


Vive la Vrui!

It has been way too long since I last publicly released a new version of the Vrui VR toolkit. The main issue was that I had been chasing evolving hardware, from the Oculus Rift DK1 to the Oculus Rift DK2, and now to HTC’s Vive. During that long stretch of time, I was never happy with the state of support for any of these devices.

That’s finally changed. I have been working on full native support for HTC’s Vive head-mounted display over the last few months (with the first major break-through in May), and I think it’s working really well. There are still a lot of improvements to make and sharp edges to sand off, but I feel it is worthwhile releasing the software as it is now to get some early testing done. So without much further ado, here is Vrui-4.2-004.

Figure 1: Vrui’s ClusterJello toy application running on an HTC Vive head-mounted display. Recorded using a second-generation Microsoft Kinect camera (Kinect-for-Xbox-One).

Keeping VR users from hurting themselves

Just the other day, I jumped on the wayback machine and posted an article about our work in immersive tele-collaboration, featuring research (and a video) from about four years ago. The shame! I figured it would be excusable that one time, and I would never do it again. Oh well, here we go.

Keeping VR users from hurting themselves

… or their expensive VR equipment.

It’s a pretty big deal. Virtual Reality, especially its head-mounted implementation, is quite good at overriding its users’ sense of place and space. “Presence,” or the feeling of bodily being in a place where one knows one is not, is a powerful and compelling experience, but it has a downside: users experiencing it lose touch with their real physical environments. Exhibit A: Figure 1 (granted, there are some concerns that the following video clip was staged, but let’s pretend it’s for reals).

Figure 1: When instinct takes over. Source: imgur

To prevent this kind of thing from happening — at least in most cases — Valve implemented a system called “Chaperone” into the SteamVR run-time framework that runs their and HTC’s Vive VR headset (and potentially other headsets, through Valve’s OpenVR layer).
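
The core idea is straightforward; the following is an illustrative sketch of the concept only, not Valve’s actual implementation: fade in a warning grid as the tracked headset approaches the boundary of a configured play area.

```cpp
// Illustrative sketch of a chaperone-style warning (not Valve's code):
// compute the opacity of a boundary grid from the headset's distance to
// the edge of a rectangular play area centered at the origin.

#include <algorithm>
#include <cmath>

struct PlayArea
{
	float halfX, halfZ; // half extents of the play area in meters
};

// Returns grid opacity in [0, 1]: 0 well inside the area, 1 at the edge.
float chaperoneAlpha(const PlayArea& area, float headX, float headZ,
                     float warnDistance) // start fading this far from the edge
{
	float dx = area.halfX - std::abs(headX); // distance to nearest x edge
	float dz = area.halfZ - std::abs(headZ); // distance to nearest z edge
	float d = std::min(dx, dz);              // distance to closest boundary
	return 1.0f - std::clamp(d / warnDistance, 0.0f, 1.0f);
}
```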

Remote Collaborative Immersive Visualization

I spent the last couple of days at the first annual meeting of “The Higher Education Campus Alliance for Advanced Visualization” (THE CAAV), where folks managing or affiliated with advanced visualization centers such as KeckCAVES came together to share their experiences. During the talks, I saw slides showing Vrui’s Collaboration Infrastructure pop up here and there, and generally remote collaboration was a big topic of discussion. During breaks, I showed several people the following video on my smartphone (yes, I finally joined the 21st century), and afterwards realized that I had never written a post about this work, as most of it predates this blog. So here we go.


For Science!

I’ve been busy finalizing the upcoming 4.0 release of the Vrui VR toolkit (it looks like I will have full support for Oculus Rift DK2 just before it is obsoleted by the commercial version, haha), and needed a short break.

So I figured I’d do something I’ve never done before in VR, namely, watch a full-length theatrical movie. I’m still getting DVDs from Netflix like it’s 1999, and I had “Avengers: Age of Ultron” at hand. The only problem was that I didn’t have a VR-enabled movie player.

Well, how hard can that be? Not hard at all, as it turns out. I installed the development packages for the xine multimedia framework, browsed through their hacker’s guide, figured out where to intercept audio buffers and decoded video frames, and three hours later I had a working prototype. A few hours more, and I had a user interface, full DVD menu navigation, a scrub bar, and subtitles. In 737 lines of code, a big chunk of which is debugging output to trace the control and data flow of the xine library. So yeah, libxine is awesome.
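
For the curious, the skeleton of such a player is small. Here is a rough sketch of the xine hookup using xine-lib’s “raw” video output driver, which hands decoded frames and overlays (DVD menu highlights, subtitles) to callbacks; this is a hedged reconstruction of the approach, not VruiXine’s actual code, and error handling, audio interception, and the UI are omitted:

```cpp
// Sketch: receive decoded video frames from xine-lib via the "raw" video
// output driver. A VR player would upload the frames into OpenGL textures
// and render them on a virtual screen.

#include <xine.h>

static void rawFrameCb(void* userData, int frameFormat,
                       int width, int height, double aspect,
                       void* data0, void* data1, void* data2)
{
	// For XINE_VORAW_RGB, data0 points to packed RGB pixels.
}

static void rawOverlayCb(void* userData, raw_overlay_t* overlays, int numOverlays)
{
	// RGBA overlays (subtitles, DVD menu highlights) arrive here.
}

int main(void)
{
	xine_t* xine = xine_new();
	xine_init(xine);

	raw_visual_t visual = {};
	visual.supported_formats = XINE_VORAW_RGB; // request RGB frames
	visual.raw_output_cb = rawFrameCb;
	visual.raw_overlay_cb = rawOverlayCb;

	xine_video_port_t* video =
		xine_open_video_driver(xine, "raw", XINE_VISUAL_TYPE_RAW, &visual);
	xine_audio_port_t* audio = xine_open_audio_driver(xine, "auto", 0);

	xine_stream_t* stream = xine_stream_new(xine, audio, video);
	xine_open(stream, "dvd:/");
	xine_play(stream, 0, 0);

	/* ... event loop, DVD menu navigation, shutdown ... */
	return 0;
}
```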

Then it was time to pull the easy chair into the office, start VruiXine, put on the Rift, map DVD navigation controls to the handy SteelSeries Stratus XL Bluetooth gamepad they were giving away at Oculus Connect 2, and relax (see Figure 1).

Figure 1: The title menu of the “Avengers: Age of Ultron” DVD in a no-frills VR movie player (VruiXine). Fancy virtual environments are left as an exercise for the reader.


Zero-latency Rendering

I finally managed to get the Oculus Rift DK2 fully supported in my Vrui VR toolkit, and while there are still some serious issues, such as getting the lens distortion formulas and internal HMD geometry exactly right, I’ve already noticed something really neat.

I have a bunch of graphically simple applications that run at ridiculous frame rates (some get several thousand fps on an Nvidia GeForce GTX 770), and with some new rendering configuration options in Vrui 4.0 I can disable vsync and render directly into the display window’s front buffer. In other words, I can let these applications “race the beam.”
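
In GLX/OpenGL terms, the recipe looks roughly like this; a sketch of the general mechanism, not Vrui’s actual configuration code:

```cpp
// Sketch: disable vsync and render into the front buffer to "race the beam."

#include <GL/glx.h>

void setupFrontBufferRendering(Display* dpy, GLXDrawable drawable)
{
	// Disable vsync via GLX_EXT_swap_control, so rendering is never
	// throttled to the display's vertical retrace:
	PFNGLXSWAPINTERVALEXTPROC swapIntervalEXT =
		(PFNGLXSWAPINTERVALEXTPROC)glXGetProcAddress(
			(const GLubyte*)"glXSwapIntervalEXT");
	swapIntervalEXT(dpy, drawable, 0);

	// Render directly into scan-out memory instead of the back buffer:
	glDrawBuffer(GL_FRONT);
}

void renderLoop()
{
	for(;;)
	{
		// ...poll the very latest tracking data and draw the frame...
		glFlush(); // no buffer swap; the video controller picks up
		           // finished pixels as the scan-out beam reaches them
	}
}
```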

There are two main results of disabling vsync and rendering into the front buffer: For one, the CPU and graphics card get really hot (so this is not something you want to do naively). But second, let’s assume that some application can render 1,000 fps. This means that every millisecond, a new complete video frame is rendered into video scan-out memory, where it gets picked up by the video controller and sent across the video link immediately. In other words, almost every line of the Rift’s display gets a “fresh” image, based on the most up-to-date tracking data, and that image is flashed to the user’s retina without further delay. Put differently, total motion-to-photon latency for the entire screen is now down to around 1 ms. And the result of that is by far the most solid VR I’ve ever seen.

Not entirely useful, but pretty cool nonetheless.

Messing around with 3D video

We had a couple of visitors from Intel this morning, who wanted to see how we use the CAVE to visualize and analyze Big Data™. But I also wanted to show them some aspects of our 3D video / remote collaboration / tele-presence work, and since I had just recently implemented a new multi-camera calibration procedure for depth cameras (more on that in a future post), and the alignment between the three Kinects in the IDAV VR lab’s capture space is now better than it has ever been (including my previous 3D Video Capture With Three Kinects video), I figured I’d try something I hadn’t done before, namely remotely interacting with myself (see Figure 1).

Figure 1: How to properly pat yourself on the back using time-delayed 3D video.


Fighting black smear

Now that I’ve gotten my Oculus Rift DK2 (mostly) working with Vrui under Linux, I’ve encountered the dreaded artifact often referred to as “black smear.” While pixels on OLED screens have very fast switching times — orders of magnitude faster than LCD pixels — they still can’t switch from on to off and back instantaneously. This leads to a problem that’s hardly visible when viewing a normal screen, but very visible in a head-mounted display due to a phenomenon called “vestibulo-ocular reflex.”

Basically, our eyes have built-in image stabilizers: if we move our head, this motion is detected by the vestibular apparatus in the inner ear (our “sense of equilibrium”), and our eyes automatically move the opposite way to keep our gaze fixed on a fixed point in space (interestingly, this even happens with the eyes closed, or in total darkness).

Figure 1: Black smear. It’s kinda like that.


Update on Vrui / Oculus Rift DK2

I’ve been getting a lot of questions about using the Rift DK2 under Linux with Vrui recently, so I figured I’d post a little progress report here instead of answering them individually.

The good news is that I have the DK2 working to the level of the DK1, i.e., I have orientational tracking, lens distortion correction, and chromatic aberration correction. I also have low persistence, but that came for free.

What I don’t have, and most probably won’t have until an official Linux SDK drops, is positional tracking. In order to replicate the work a team of computer vision experts at Oculus have been doing for the last year or so, I’d need a few clones and a time machine. That said, I am working on combining the DK1/DK2’s built-in IMU with other external tracking systems, such as InterSense IS-900 or NaturalPoint OptiTrack. That’s a much easier (but still tricky) problem, and would allow using the Rift as a headset for large-area VR. Probably not interesting for home users, but being able to walk around freely in an 18′×10′×7′ volume opens up entirely different VR applications.
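
To sketch the basic idea behind such a combination (a toy complementary filter, not the actual fusion code in progress): the IMU provides smooth, high-rate orientation by integrating angular velocity, and each update from the slower absolute tracker pulls the integrated state back to cancel accumulated drift.

```cpp
// Toy complementary filter illustrating IMU / external-tracker fusion.
// All names are illustrative.

#include <cmath>

struct Quat { float w, x, y, z; };

static Quat normalize(Quat q)
{
	float n = std::sqrt(q.w * q.w + q.x * q.x + q.y * q.y + q.z * q.z);
	q.w /= n; q.x /= n; q.y /= n; q.z /= n;
	return q;
}

// Per IMU sample (e.g. 1000 Hz): integrate angular velocity (rad/s).
Quat integrateGyro(Quat q, float wx, float wy, float wz, float dt)
{
	// dq/dt = 0.5 * q * (0, wx, wy, wz)
	Quat dq;
	dq.w = 0.5f * (-q.x * wx - q.y * wy - q.z * wz);
	dq.x = 0.5f * ( q.w * wx + q.y * wz - q.z * wy);
	dq.y = 0.5f * ( q.w * wy - q.x * wz + q.z * wx);
	dq.z = 0.5f * ( q.w * wz + q.x * wy - q.y * wx);
	q.w += dq.w * dt; q.x += dq.x * dt; q.y += dq.y * dt; q.z += dq.z * dt;
	return normalize(q);
}

// Per external tracker update (e.g. 100 Hz): nudge the integrated
// orientation toward the absolute measurement with a small gain k.
Quat correctDrift(Quat q, Quat measured, float k) // k small, e.g. 0.02f
{
	// Pick the sign that represents the shorter rotation:
	float dot = q.w * measured.w + q.x * measured.x
	          + q.y * measured.y + q.z * measured.z;
	float s = dot < 0.0f ? -1.0f : 1.0f;
	// nlerp is good enough for small corrections:
	q.w += (s * measured.w - q.w) * k;
	q.x += (s * measured.x - q.x) * k;
	q.y += (s * measured.y - q.y) * k;
	q.z += (s * measured.z - q.z) * k;
	return normalize(q);
}
```

Position works the same way, with the tracker’s absolute position measurements correcting a state extrapolated from the IMU between updates.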

I’m currently working hard on the next release of the Vrui toolkit (version 3.2-001), which will have at least the level of DK2 support that I have internally now (combined tracking might or might not make it, but that can already be faked, see 3D Video Capture With Three Kinects).

The reason why I’m not releasing right now is that I’m still trying to optimize the “user experience” by integrating the ideas I described in A Trip Down the Graphics Pipeline. The idea is that plugging in a Rift and starting a Vrui application should just work. I have most of that going; the only issue is telling OpenGL to sync to the vertical retrace on the Rift’s display, no matter what. Right now that can only be done via environment variable, and I’m looking for the right place in Vrui to set that variable from inside a program. It’s a work-around until Nvidia expose that functionality via their NV-CONTROL X extension, or, even better, via a GLX extension (are you listening, Nvidia?). Or, why not change the implementation of GLX_SGI_video_sync, which is already bound to a display and drawable, such that it always syncs to the first video controller servicing that drawable? Wouldn’t even require a specification change. Just an idea.
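
For reference, the work-around in question looks roughly like this. __GL_SYNC_TO_VBLANK and __GL_SYNC_DISPLAY_DEVICE are documented environment variables of Nvidia’s Linux driver; the device name below is a placeholder for whichever output the Rift is connected to, and the variables must be set before the OpenGL context is created:

```cpp
// Sketch: force Nvidia's driver to vsync to a specific display device by
// setting its environment variables before any OpenGL context exists.
// "DFP-1" is a placeholder for the Rift's actual output port.

#include <cstdlib>

void syncToRiftDisplay()
{
	setenv("__GL_SYNC_TO_VBLANK", "1", 1);          // enable vsync...
	setenv("__GL_SYNC_DISPLAY_DEVICE", "DFP-1", 1); // ...on this device
}
```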

And last but not least, once I got the DK2 and its low-persistence screen working, I realized how cavalier I’ve been about low-level timing issues in Vrui. With screen-based VR and LCD-based HMDs it has simply never been an issue before, but now it’s pretty obvious. Good thing is, I think I have a handle on it.

In summary: it’ll be a little bit longer, but I’m on it. Will I be able to release before Oculus does their Linux SDK? Sure hope so! And just in case you think I’ve been sitting on my hands for the last six months: there are already about 300 large and small changes between 3.1-002 and 3.2-001.

And here is today’s unrelated picture:

Figure 1: New adventures in real estate speculation.