About okreylos

I am a research computer scientist at the University of California, Davis. My research areas are scientific visualization, particularly in immersive ("virtual reality") environments, human/computer interaction in immersive environments, and 3D computer graphics. My primary work is software development, from architecture through design to implementation and coding. I am the primary software developer for the UC Davis W.M. Keck Center for Active Visualization in the Earth Sciences (KeckCAVES). Some of my released packages are Vrui (a VR development toolkit), CollaborationInfrastructure (a tele-collaboration plug-in for Vrui), Kinect (a driver suite to capture 3D video from Microsoft Kinect cameras), LiDAR Viewer (a visualization package for very large 3D point clouds), 3D Visualizer (a system for interactive visual analysis of 3D volumetric data), Nanotech Construction Kit (an interactive molecular design program), and SARndbox (an augmented reality sandbox). I also dabble in VR hardware, in the sense that I take existing custom or commodity hardware components (3D TVs, head-mounted displays, projectors, tracking systems, Wiimotes, Kinect cameras, ...) and build fully integrated immersive environments out of them. This includes a fair share of driver development to support hardware that either doesn't have drivers, or whose vendor-supplied drivers are not up to par.

Accommodation and Vergence in Head-mounted Displays

Why do virtual objects close to my face appear blurry when wearing a VR headset? My vision is fine!

And why does the real world look strange immediately after a long VR session?

These are another two (related ones) of those frequently asked questions about VR and head-mounted displays (HMDs) that I promised to address a while back.

Here’s the short answer: In all currently-available HMDs, the screens creating the virtual imagery are at a fixed optical distance from the user. But our eyes have evolved to automatically adjust their optical focus based on the perceived distance to objects, virtual or real, that they are looking at. So when a virtual object appears to be mere inches in front of the user’s face, but the screens showing images of that object are — optically — several meters away, the user’s eyes will focus on the wrong distance, and as a result, the virtual object will appear blurry (the same happens, albeit less pronounced, when a virtual object appears to be very far away). This effect is called accommodation-vergence conflict, and besides being a nuisance, it can also cause eye strain or headaches during prolonged VR sessions, and can cause vision problems for a short while after such sessions.
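To put rough numbers on that conflict: accommodation demand is commonly expressed in diopters (inverse meters), and the mismatch is the difference between the demand at the virtual object’s apparent distance and the demand at the screens’ fixed optical distance. Here is a minimal sketch; the 63 mm IPD, 2 m screen distance, and 0.25 m object distance are illustrative assumptions, not any specific headset’s specs:

```c++
// Sketch: vergence angle and accommodation conflict for a virtual object
// viewed in an HMD whose screens sit at a fixed optical distance.
// All numbers below are illustrative assumptions.
#include <cmath>
#include <cstdio>

int main() {
    const double ipd = 0.063;    // inter-pupillary distance in meters
    const double screens = 2.0;  // optical distance of the HMD's screens in meters
    const double object = 0.25;  // apparent distance of a virtual object in meters

    // Total vergence angle: each eye rotates inward by atan((ipd/2)/d).
    double vergenceDeg = 2.0 * std::atan2(ipd * 0.5, object) * 180.0 / 3.14159265358979;

    // Accommodation demand in diopters (1/m):
    double demanded = 1.0 / object;   // where the eyes want to focus
    double provided = 1.0 / screens;  // where the optics force them to focus

    std::printf("vergence angle: %.1f deg\n", vergenceDeg);         // ~14.4 deg
    std::printf("conflict: %.2f diopters\n", demanded - provided);  // 3.50 D
}
```

A conflict of several diopters, as in this example, is well beyond the roughly half-diopter mismatch the visual system tolerates comfortably, which is why near-field virtual objects look blurry.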

Now for the long answer.

Continue reading

VR medical visualization with 3D Visualizer

Now that Vrui is working on the HTC Vive (at least until the next SteamVR update breaks ABI again), I can finally go back and give Vrui-based applications some tender loving care. First up is 3D Visualizer, an application to visualize and, more importantly, visually analyze three-dimensional volumetric data sets (see Figure 1).

Figure 1: Analyzing a CAT scan with 3D Visualizer on the HTC Vive. Cat included.

Continue reading

A HoloArticle

Here is an update on my quest to stay on top of all things “holo”: HoloLamp and RealView “Live Holography.” While the two have really nothing to do with each other, both claim the “holo” label with varying degrees of legitimacy, and both happened to pop up recently.

HoloLamp

At its core, HoloLamp is a projection mapping system somewhat similar to the AR Sandbox, i.e., a combination of a set of cameras scanning a projection surface and a viewer’s face, and a projector drawing a perspective-correct image, from the viewer’s point of view, onto said projection surface. The point of HoloLamp is to project images of virtual 3D objects onto arbitrary surfaces, to achieve effects like the Millennium Falcon’s holographic chess board in Star Wars: A New Hope. Let’s see how it works, and how it falls short of this goal.

Creating convincing virtual three-dimensional objects via projection is a core technology of virtual reality, specifically the technology that is driving CAVEs and other screen-based VR displays. To create this illusion, a display system needs to know two things: the exact position of the projection surface in 3D space, and the position of the viewer’s eyes in the same 3D space. Together, these two provide just the information needed to set up the correct perspective projection. In CAVEs et al., the position of the screen(s) is fixed and precisely measured during installation, and the viewer’s eye positions are provided via real-time head tracking.
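As a concrete illustration, here is a minimal sketch of the standard off-axis frustum computation used by CAVE-style displays, following Robert Kooima’s well-known “Generalized Perspective Projection” formulation. The Vec3 type and the corner-based screen description are my own illustrative choices, not Vrui’s actual API:

```c++
// Off-axis frustum for a tracked viewer and a fixed planar screen, after
// Kooima's "Generalized Perspective Projection". Illustrative sketch only.
#include <cmath>

struct Vec3 { double x, y, z; };
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x};
}
static Vec3 unit(Vec3 v) {
    double l = std::sqrt(dot(v, v));
    return {v.x / l, v.y / l, v.z / l};
}

// pa, pb, pc: the screen's lower-left, lower-right, and upper-left corners;
// pe: the tracked eye position; n: near plane distance.
void offAxisFrustum(Vec3 pa, Vec3 pb, Vec3 pc, Vec3 pe, double n,
                    double& l, double& r, double& b, double& t) {
    Vec3 vr = unit(sub(pb, pa));   // screen right axis
    Vec3 vu = unit(sub(pc, pa));   // screen up axis
    Vec3 vn = unit(cross(vr, vu)); // screen normal, pointing at the viewer

    Vec3 va = sub(pa, pe), vb = sub(pb, pe), vc = sub(pc, pe);
    double d = -dot(va, vn);       // eye-to-screen-plane distance

    // Frustum extents, scaled from the screen plane to the near plane;
    // pass them to glFrustum(l, r, b, t, n, far), rotate the view so the
    // screen axes align with the world axes, and translate by -pe.
    l = dot(vr, va) * n / d;
    r = dot(vr, vb) * n / d;
    b = dot(vu, va) * n / d;
    t = dot(vu, vc) * n / d;
}
```

Head tracking makes pe a moving target, so these extents have to be recomputed every frame, which is exactly why the screen geometry and the viewer’s eye positions are the two indispensable inputs.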

As one goal of HoloLamp is portability, it cannot rely on pre-installation and manual calibration. Instead, HoloLamp scans and creates a 3D model of the projection surface when turned on (or asked to do so, I guess). It does this by projecting a sequence of patterns, and observing the perspective distortion of those patterns with a camera looking in the projection direction. This is a solid and well-known technology called structured-light 3D scanning, and can be seen in action at the beginning of this HoloLamp video clip. To extract eye positions, HoloLamp uses an additional set of cameras looking upwards to identify and track the viewer’s face, probably using off-the-shelf face tracking algorithms such as the Viola-Jones filter. Based on that, the software can project 3D objects using one or more projection matrices, depending on whether the projection surface is planar or not.
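To give a feel for the face-tracking side, here is a minimal sketch of off-the-shelf Viola-Jones face detection with OpenCV, the kind of building block an upward-looking camera system could plausibly use. This is generic OpenCV usage, not HoloLamp’s actual code:

```c++
// Sketch: Viola-Jones face detection on a live camera with OpenCV.
// The cascade file ships with OpenCV's standard distribution.
#include <opencv2/objdetect.hpp>
#include <opencv2/imgproc.hpp>
#include <opencv2/videoio.hpp>
#include <vector>

int main() {
    cv::CascadeClassifier faces("haarcascade_frontalface_default.xml");
    cv::VideoCapture camera(0);
    cv::Mat frame, gray;
    while (camera.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        std::vector<cv::Rect> found;
        faces.detectMultiScale(gray, found);
        // A real system would now estimate 3D eye positions from the 2D
        // face rectangle, e.g. via the camera's intrinsics and an assumed
        // real-world face size, and feed them into the projection setup.
    }
    return 0;
}
```

The result looks very convincing when shot through a regular video camera: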

Continue reading

Vrui on Oculus Rift DK2

I know, the Oculus Rift DK2 is obsolete equipment, but nonetheless — there are a lot of them still out there, it’s still a decent VR headset for seated applications, I guess they’re getting cheaper on eBay now, and I put in all the work back then to support it in Vrui, so I might as well describe how to use it. If nothing else, the DK2 is a good way to watch DVD movies, or panoramic mono- or stereoscopic videos, in VR.

Figure 1: Using an Oculus Rift DK2 headset with a pair of Vive controllers — because why not?

Continue reading

Boom, Headshot

There was an article on Medium yesterday: My First Virtual Reality Groping. In it, a first-time VR/HTC Vive user describes how she was virtually groped by another player inside an online multi-player VR game, within the first three minutes of her first such endeavor, and how it ruined her experience and deeply disturbed her.

I do not know what to call the behavior of player “BigBro442,” but I do know that it is highly inappropriate, and toxic for VR as a whole. This, people, is why we can’t have nice things. This is far from the first instance of virtual harassment or VR griefing that I’ve heard of, but it’s the one that got me thinking, because of this comment on the article:

This is reality. The best we can do is educate, starting with articles like this.

No. That is not true. We can do better than that. Unlike reality, where someone might be assaulted inside their own home, or in some dark back alley, with no witnesses around or evidence left behind, this is virtual reality, which only exists as a sequence of ones and zeros on some Internet server. That server has absolute knowledge of anything that goes on anywhere inside the virtual world it maintains, like an omniscient Big Brother. If virtual harassment happens in virtual reality, maybe virtual reality needs virtual courts.

Here is a not-so-modest proposal, off the top of my head, using SteamVR/Steam as example platforms:

  • Any server maintaining a virtual world potentially used by more than one person at the same time keeps a ring buffer of each connected user’s avatar state for the last, say, five minutes. That’s not overly demanding: sampling a head tracker and two hand trackers at, say, 30 Hz over five minutes yields 3 × 30 × 300 = 27,000 samples; at 28 bytes per sample (a position and an orientation quaternion as single-precision floats), that is approx. 750 kB of data total, per user (see the sketch after this list).
  • The client user interface of any shared virtual environment contains a button in some easily accessible standard place, say in SteamVR’s overlay, to file a harassment complaint.
  • If a user (“Alice”) files a complaint, several things happen. Most importantly, the server immediately dumps the avatar state ring buffers of all connected (or recently connected) players to a file. Second, Alice is immediately charged a small fee, say $5, on the credit card associated with her Steam account. This is a micro-transaction, an existing Steam feature. The fee’s purpose is to discourage another form of harassment, namely filing frivolous complaints against innocent users.
  • Files generated by complaints, with personally identifying information redacted, will be reviewed by a peer group of humans. This might be done by appointed moderators, or might even be crowd-sourced.
  • If review determines that behavior contained in the 5-minute replay violates community standards, Alice will be refunded the fee she was charged, and offending user Bob’s Steam account will be temporarily suspended, say for one day on the first offense, starting either immediately or the next time Bob attempts to log in. And I mean Bob’s entire Steam account is suspended, not just his access to one particular server or shared VR application: Bob’s on time-out and can go read a book.
  • If review determines that the complaint was without merit, nothing happens to accused user Bob, and Alice is not refunded her fee. If Alice disagrees, she can raise the stakes by re-filing the same complaint for another $5 fee, the total $10 then being refundable or not, etc.
  • If review cannot reach agreement, or review does not happen within a reasonable time frame, Alice is refunded her fee.
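For concreteness, here is a minimal sketch of such a per-user avatar state ring buffer, sized to the numbers above. All names are hypothetical; nothing like this is part of any actual Steam or SteamVR API:

```c++
// Hypothetical per-user avatar state ring buffer: 5 minutes of a head
// tracker and two hand trackers sampled at 30 Hz, dumped on demand when
// a complaint is filed.
#include <array>
#include <cstddef>

struct TrackerSample {
    float position[3];    // tracker position in world space
    float orientation[4]; // tracker orientation as a quaternion
};                        // 7 floats = 28 bytes

struct AvatarState {
    TrackerSample head, leftHand, rightHand; // 3 x 28 = 84 bytes
};

class AvatarRingBuffer {
public:
    static const std::size_t rate = 30;                 // samples per second
    static const std::size_t capacity = rate * 5 * 60;  // 5 minutes = 9000 samples

    void push(const AvatarState& sample) {
        buffer[next] = sample;
        next = (next + 1) % capacity;
        if (count < capacity) ++count;
    }

    // Dump the buffered samples, oldest first, when a complaint is filed:
    template <typename Sink>
    void dump(Sink&& sink) const {
        std::size_t start = (next + capacity - count) % capacity;
        for (std::size_t i = 0; i < count; ++i)
            sink(buffer[(start + i) % capacity]);
    }

private:
    std::array<AvatarState, capacity> buffer;
    std::size_t next = 0, count = 0;
};
// 9000 samples x 84 bytes = 756,000 bytes, i.e. the approx. 750 kB per
// user estimated above.
```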

Okay, so this is ridiculous, right? Not from a technical feasibility point of view, which I think I laid out above, but from an organizational and cultural point of view. One might say that it is a severe regulatory overreach, a violation of the freedom and the very fundamental principles of online gaming, and that the idea of community review is ludicrous on the face of it.

Well, I might have agreed — until recently, that is, when I stumbled across this. Holy Moly! What’s that? Multi-player game servers retaining state data of all players, which can be dumped to a permanent file as evidence for later peer review by a number of appointed or self-appointed judges, with crowd-sourced verdicts and suspensions or bans handed out to cheaters, and judges being rewarded or punished for good or bad judgment? And it works?

If cheating in Counter-Strike is a big enough deal to create a system like this, would it be so outrageous to apply the same basic idea to harassment in shared virtual reality, which, due to VR’s strong sense of immersion and presence, arguably has a larger negative impact on the harassed than losing a round of CS?

Discuss.

Technology Transfer

I found out today that HTC now ships a tool to measure users’ inter-pupillary distances with new Vive VR headsets. When I say “tool,” I mean a booklet with instructions in many languages, and a ruler printed along one edge of each page:

Figure 1: IPD measurement chart shipped by HTC with new Vives. Image courtesy of reddit user DanielDC88, image source.

I thought this was great on multiple levels. For one, measuring the user’s IPD and entering it into the VR software, either manually or through a sensor on a physical IPD adjustment knob or slider on the headset, as in both Vive and Oculus Rift, is an important component of creating convincing VR displays. The more people get used to that, the better.
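For the curious, here is roughly where that measurement ends up: in stereo rendering, each eye’s projection origin is offset by half the IPD from the tracked head position along the head’s right axis. A minimal sketch with illustrative names, not any particular SDK’s API:

```c++
// Sketch: deriving per-eye positions from a tracked head pose and the
// user's measured IPD. Illustrative names, not a real SDK's API.
struct Vec3 { double x, y, z; };

// headPos: tracked head position; headRight: unit vector pointing to the
// user's right; ipd: inter-pupillary distance in meters.
void eyePositions(Vec3 headPos, Vec3 headRight, double ipd,
                  Vec3& leftEye, Vec3& rightEye) {
    double h = ipd * 0.5;
    leftEye  = {headPos.x - headRight.x*h, headPos.y - headRight.y*h,
                headPos.z - headRight.z*h};
    rightEye = {headPos.x + headRight.x*h, headPos.y + headRight.y*h,
                headPos.z + headRight.z*h};
}
```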

On the second level, I was proud. On April 9, 2014, I wrote an article on this here blog titled “How to Measure Your IPD,” which describes this exact method of using a mirror and a ruler. It even became one of my more popular articles (the fifth most popular, actually, with 33,952 views as of today). I was a little less proud when I looked at my own article again just now, and realized that my diagrams were absolutely hideous compared to those in HTC’s booklet. Oh well.

Continue reading

Vive la Vrui!

It has been way too long since I last publicly released a new version of the Vrui VR toolkit. The main issue was that I had been chasing evolving hardware, from the Oculus Rift DK1 to the Oculus Rift DK2, and now to HTC’s Vive. During that long stretch of time, I was never happy with the state of support of any of these devices.

That’s finally changed. I have been working on full native support for HTC’s Vive head-mounted display over the last few months (with the first major breakthrough in May), and I think it’s working really well. There are still a lot of improvements to make and sharp edges to sand off, but I feel it is worthwhile releasing the software as it is now to get some early testing done. So without much further ado, here is Vrui-4.2-004.

Figure 1: Vrui’s ClusterJello toy application running on an HTC Vive head-mounted display. Recorded using a second-generation Microsoft Kinect camera (Kinect-for-Xbox-One).
Continue reading

Keeping VR users from hurting themselves

Just the other day, I jumped on the wayback machine and posted an article about our work in immersive tele-collaboration, featuring research (and a video) from about four years ago. The shame! I figured it would be excusable that one time, and I would never do it again. Oh well, here we go.

Keeping VR users from hurting themselves

… or their expensive VR equipment.

It’s a pretty big deal. Virtual Reality, especially its head-mounted implementation, is quite good at overriding its users’ sense of place and space. “Presence,” or the feeling of bodily being in a place where one knows one is not, is a powerful and compelling experience, but it has a downside: users experiencing it lose touch with their real physical environments. Exhibit A: Figure 1 (granted, there are some concerns that the following video clip was staged, but let’s pretend it’s for reals).

Figure 1: When instinct takes over. Source: imgur

To prevent this kind of thing from happening — at least in most cases — Valve implemented a system called “Chaperone” into the SteamVR run-time framework that runs their and HTC’s Vive VR headset (and potentially other headsets, through Valve’s OpenVR layer).
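Applications can also query the Chaperone setup themselves. Here is a minimal sketch of reading the configured play area size through OpenVR’s public client API, assuming a working SteamVR installation; error handling is abbreviated:

```c++
// Sketch: querying SteamVR's Chaperone play area via the OpenVR client API.
#include <openvr.h>
#include <cstdio>

int main() {
    vr::EVRInitError err = vr::VRInitError_None;
    vr::VR_Init(&err, vr::VRApplication_Background);
    if (err != vr::VRInitError_None)
        return 1;

    float sizeX = 0.0f, sizeZ = 0.0f;
    if (vr::VRChaperone()->GetPlayAreaSize(&sizeX, &sizeZ))
        std::printf("Chaperone play area: %.2f m x %.2f m\n", sizeX, sizeZ);

    vr::VR_Shutdown();
    return 0;
}
```

Continue reading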

Remote Collaborative Immersive Visualization

I spent the last couple of days at the first annual meeting of “The Higher Education Campus Alliance for Advanced Visualization” (THE CAAV), where folks managing or affiliated with advanced visualization centers such as KeckCAVES came together to share their experiences. During the talks, I saw slides showing Vrui’s Collaboration Infrastructure pop up here and there, and generally remote collaboration was a big topic of discussion. During breaks, I showed several people the following video on my smartphone (yes, I finally joined the 21st century), and afterwards realized that I had never written a post about this work, as most of it predates this blog. So here we go.

Continue reading

Lighthouse tracking examined

To my surprise and delight, I recently found out that Valve has been releasing Linux versions of most of their SteamVR/OpenVR run-time/SDK for a while now (OpenVR just hit version 1.0.0, go get it while it’s fresh). This is great news: it will allow me to port Vrui and all Vrui applications to the Vive headset and its tracked controllers in one fell swoop.

But before diving into developing a Lighthouse tracking driver plug-in for Vrui’s input device abstraction layer, I decided to cobble together a small testing utility to get a feel for OpenVR’s internal driver interface, and for the Lighthouse tracking system’s overall tracking quality.
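For reference, here is a minimal sketch of the simpler route, polling tracked device poses through OpenVR’s public client API; the testing utility described above goes through OpenVR’s internal driver interface instead, but the basic idea of sampling raw poses is the same:

```c++
// Sketch: polling Lighthouse-tracked device poses via the OpenVR client API.
#include <openvr.h>
#include <cstdio>

int main() {
    vr::EVRInitError err = vr::VRInitError_None;
    vr::VR_Init(&err, vr::VRApplication_Other);
    if (err != vr::VRInitError_None)
        return 1;

    vr::TrackedDevicePose_t poses[vr::k_unMaxTrackedDeviceCount];
    for (int frame = 0; frame < 1000; ++frame) {
        vr::VRSystem()->GetDeviceToAbsoluteTrackingPose(
            vr::TrackingUniverseStanding, 0.0f, poses, vr::k_unMaxTrackedDeviceCount);
        for (unsigned int i = 0; i < vr::k_unMaxTrackedDeviceCount; ++i) {
            if (!poses[i].bPoseIsValid)
                continue;
            const vr::HmdMatrix34_t& m = poses[i].mDeviceToAbsoluteTracking;
            std::printf("device %u: position (%.3f, %.3f, %.3f)\n",
                        i, m.m[0][3], m.m[1][3], m.m[2][3]);
        }
    }
    vr::VR_Shutdown();
    return 0;
}
```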

Figure 1: The Lighthouse 6-DOF tracking system, disassembled (source).

Continue reading