About okreylos

I am a research computer scientist at the University of California, Davis. My research areas are scientific visualization, particularly in immersive ("virtual reality") environments, human/computer interaction in immersive environments, and 3D computer graphics. My primary work is software development, from architecture and design to implementation and coding. I am the primary software developer for the UC Davis W.M. Keck Center for Active Visualization in the Earth Sciences (KeckCAVES). Some of my released packages are Vrui (a VR development toolkit), CollaborationInfrastructure (a tele-collaboration plug-in for Vrui), Kinect (a driver suite to capture 3D video from Microsoft Kinect cameras), LiDAR Viewer (a visualization package for very large 3D point clouds), 3D Visualizer (a system for interactive visual analysis of 3D volumetric data), Nanotech Construction Kit (an interactive molecular design program), and SARndbox (an augmented reality sandbox).

I also dabble in VR hardware, in the sense that I take existing custom or commodity hardware components (3D TVs, head-mounted displays, projectors, tracking systems, Wiimotes, Kinect cameras, ...) and build fully integrated immersive environments out of them. This includes a fair share of driver development to support hardware that either doesn't have drivers, or whose vendor-supplied drivers are not up to par.

On the road for VR part II: Tahoe Environmental Research Center, Incline Village, Lake Tahoe

We have been collaborating with the UC Davis Tahoe Environmental Research Center (TERC) for a long time. Back in — I think — 2006, we helped them purchase a large-screen stereoscopic projection system for the Otellini 3-D Visualization Theater and installed a set of Vrui-based KeckCAVES visualization applications for guided virtual tours of Lake Tahoe and the entire Earth. We have since worked on joint projects, primarily related to informal science education. Currently, TERC is one of the collaborators in the 3D lake science informal science education grant that spawned the Augmented Reality Sandbox.

The original stereo projection system, driven by a 2006 Mac Pro, was getting long in the tooth, and in the process of upgrading to higher-resolution and brighter projectors, we finally convinced the powers-that-be to get a top-of-the-line Linux PC instead of yet another Mac (for significant savings, one might add). While the Ubuntu OS and Vrui application set had already been pre-installed by KeckCAVES staff in the home office, I still had to go up to the lake to configure the operating system and Vrui to render to the new projectors, update all Vrui software, align the projectors, and train the local docents in using Linux and the new Vrui application versions.


Meet the LiDAR Viewer

I’ve recently realized that I urgently need to write about LiDAR Viewer, a Vrui-based interactive visualization application for massive-scale LiDAR (Light Detection and Ranging, essentially 3D laser scanning writ large) data.

Figure 1: Photo of a user viewing, and extracting features from, an aerial LiDAR scan of the Cosumnes River area in central California in a CAVE.

I’ve also realized, after going to the ILMF ’13 meeting, that I need to make a new video about LiDAR Viewer, demonstrating the rendering capabilities of the current and upcoming versions. This occurred to me when the movie I showed during my talk had a copyright notice from 2006(!) on it.


Immersive visualization for archaeological site preservation

Have I mentioned lately that VR is not dead yet, and instead thinks it’ll be going for a walk? Here’s more proof. One of KeckCAVES’ external users, Marshall Millett, archaeologist and GIS expert, is using high-resolution 3D scanning, based on LiDAR or white-light scanning, to capture and digitally preserve cultural heritage sites, such as the Maidu Indian Museum’s historic site and trail (close to Roseville, CA).

Figure 1: Danny Rey, Tribal Historic Preservation Officer, and Marcos Guerrero, Cultural Resources Manager, representatives of the United Auburn Indian Community, viewing a high-resolution 3D scan of the Maidu Historic Trail and Site in the KeckCAVES immersive visualization facility. In the background Joe Dumit of UC Davis’ Science and Technology Studies, and myself. Photo provided by Marshall Millett.

Marshall has been using KeckCAVES software, particularly LiDAR Viewer (about which I should really write a post), and also the KeckCAVES facility itself and related technology, to visualize his high-resolution 3D models at 1:1 scale, and with the ability to experience them in ways that are not normally possible (most of these sites are fragile and/or sacred, and not available to the public). As part of this work, community representatives made several visits to the KeckCAVES facility to view their digitally reconstructed historic site (see Figure 1).

Marshall presented a poster about his work at last year’s 3D Digital Documentation Summit, held July 10-12, 2012 at the Presidio, San Francisco, CA, and was just interviewed for a podcast by the National Center for Preservation Technology and Training (where, as of 02/21/2013, KeckCAVES features prominently on the front page).

First VR environment in Estonia powered by Vrui

Now here’s some good news: I mentioned recently that reports of VR’s death are greatly exaggerated, and now I am happy to announce that researchers with the Institute of Cybernetics at Tallinn University of Technology have constructed the country’s first immersive display system, and I’m proud to say it’s powered by the Vrui toolkit. The three-screen, back-projected display was entirely designed and built in-house. Its main designers, PhD student Emiliano Pastorelli and his advisor Heiko Herrmann, kindly sent several diagrams and pictures, see Figures 1, 2, 3, and 4.

Figure 1: Engineering diagram of Tallinn University of Technology’s new VR display, provided by Emiliano Pastorelli.


AR Sandbox news

The first “professionally built” AR sandbox, whose physical setup was designed and built by the fine folks from the San Francisco Exploratorium, arrived at its new home at ECHO Lake Aquarium and Science Center.

Figure 1: Picture of ECHO Lake Aquarium and Science Center’s Augmented Reality Sandbox during installation on the exhibit floor. Note the portrait orientation of the sand table with respect to the back panel, the projector tilt to make up for it, and the high placement of the Kinect camera (visible at the very top of the picture). Photo provided by Travis Cook, ECHO.


On the road for VR (sort of…): ILMF ’13, Denver, CO

I just returned from the 2013 International LiDAR Mapping Forum (ILMF ’13), where I gave a talk about LiDAR Viewer (which I haven’t previously written about here, but I really should). ILMF is primarily an event for industry exhibitors and LiDAR users from government agencies or private companies to meet. I only saw one other person from the academic LiDAR community there, and my talk stuck out like a sore thumb, too (see Figure 1).

Figure 1: Snapshot from towards the end of my talk at ILMF ’13, kindly provided by Marshall Millett. My talk was a bit off-topic for the rest of the conference, and scheduled for 8:30 in the morning, which hopefully explains the sparse audience.


… and they did!

build their own augmented reality sandboxes, that is.

We still haven’t installed the three follow-up AR sandboxes at the participating institutions of our informal science education NSF project (Tahoe Environmental Research Center, Lawrence Hall of Science, and ECHO Lake Aquarium and Science Center), but others have picked up the slack and gone ahead and built their own, based on our software and designs.

Figure 1: Augmented reality sandbox constructed by “Code Red,” Ithaca High School’s FIRST Robotics Team 639, and shown here at the school’s open house on 02/02/2013.

The newest addition to my External Installations page is “Code Red,” Ithaca High School’s FIRST Robotics Team 639, who just unveiled theirs at their school’s open house (see Figure 1), and were kind enough to send a note and some pictures, with many more “behind the scenes” pictures on their sandbox project page. There’s an article in the local newspaper with more information as well.

Together with Bold Park Community School’s, this is the second unveiled AR sandbox that I’m aware of. That doesn’t sound like much, but the software hasn’t been out for that long, and there are a few others that I know are currently in the works. And who knows how many are being built or are already completed that I’m totally unaware of; after all, this is free software. Team 639’s achievement, for one, came completely out of the blue.

Update: And I missed this Czech project (no, not that other Czech project that gave us the idea in the first place!). They built several versions of the sandbox and showed them off at hacker meets. And they say they’re currently trying to port the software to lower-power computers. Good on them!

Update 2: One more I missed, this time done by/for the Undergraduate Library at the University of Illinois, Urbana-Champaign. I don’t have any more information; but this is the YouTube video.

I should point out that these last two were news to me; I only found out about them after googling for “AR sandbox.”

So please, if you did build one and don’t mind, send me a note. 🙂 The comment box below is ready and waiting.

Is VR dead?

No, and it doesn’t even smell funny.

But let’s back up a bit. When it comes to VR, there are three prevalent opinions:

  1. It’s a dead technology. It had its day in the early nineties, and there hasn’t been anything new since. After all, the CAVE was invented in ’91 and is basically still the same, and head-mounted displays have been around even longer.
  2. It hasn’t been born yet. But maybe if we wait 10 more years, and there are some significant breakthroughs in display and computer technology, it might become interesting or feasible.
  3. It’s fringe technology. Some weirdos keep picking at it, but it hasn’t ever led to anything interesting or useful, and never will.


How head tracking makes holographic displays

I’ve talked about “holographic displays” a lot, most recently in my analysis of the upcoming zSpace display. What I haven’t talked about is how exactly such holographic displays work, what makes them “holographic” as opposed to just stereoscopic, and why that is a big deal.

Teaser: A user interacting with a virtual object inside a holographic display.
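In a nutshell, what turns a stereoscopic display into a "holographic" one is re-deriving the projection from the tracked eye position every frame, so the screen behaves like a physical window into the virtual scene. The following is my own minimal sketch of that standard technique (generalized off-axis perspective projection); the screen-corner and eye values are illustrative, and this is not code from Vrui or the post itself:

```python
import numpy as np

def off_axis_projection(pa, pb, pc, eye, near, far):
    """Build an OpenGL-style projection matrix for a fixed screen
    rectangle and an arbitrary (head-tracked) eye position.

    pa, pb, pc: lower-left, lower-right, upper-left screen corners,
    in the same coordinate system as the tracked eye position.
    """
    pa, pb, pc, eye = (np.asarray(v, float) for v in (pa, pb, pc, eye))
    vr = pb - pa; vr /= np.linalg.norm(vr)            # screen "right" axis
    vu = pc - pa; vu /= np.linalg.norm(vu)            # screen "up" axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # screen normal

    va, vb, vc = pa - eye, pb - eye, pc - eye         # eye-to-corner vectors
    d = -(va @ vn)                                    # eye-to-screen distance

    # Frustum bounds on the near plane; asymmetric when the eye is off-center
    l = (vr @ va) * near / d
    r = (vr @ vb) * near / d
    b = (vu @ va) * near / d
    t = (vu @ vc) * near / d

    # Standard glFrustum matrix from the (possibly asymmetric) bounds.
    # A full renderer would also rotate into screen space and translate
    # the eye to the origin for the view matrix.
    return np.array([
        [2*near/(r-l), 0,            (r+l)/(r-l),            0],
        [0,            2*near/(t-b), (t+b)/(t-b),            0],
        [0,            0,            -(far+near)/(far-near), -2*far*near/(far-near)],
        [0,            0,            -1,                     0]])
```

With the eye centered in front of the screen this reduces to an ordinary symmetric frustum; as soon as the head moves sideways, the frustum skews so that virtual objects hold their apparent positions in physical space, which is exactly the effect a plain stereoscopic display lacks.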


ZSpace: a turn-key holographic display

Figure 1: A marketing image for the zSpace holographic display. Image taken from http://zspace.com.

I’ve been waiting for this for such a long time: a turn-key stereoscopic display with built-in pre-calibrated head tracking and tracked input device. We’ve been in the low-cost VR business for more than four years now, but the biggest problem is that our reference design is entirely DIY. Users have to go out and buy individual components, assemble them, and then — most importantly — calibrate them with respect to each other. This calibration step is the biggest hurdle for low-cost VR’s acceptance, because the idea behind it is somewhat hard to understand for VR non-experts, and even if it’s understood, it still requires expensive non-standard tools.

The solution, of course, is simple: instead of having the display and tracking system as separate entities that need to be calibrated with respect to each other, integrate them into the same frame, and pre-calibrate them at the factory. The only thing that had to happen was for a manufacturer to step up to the plate and make it so.
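In code, the "integrate and pre-calibrate" idea boils down to a single fixed rigid transform between tracker coordinates and display coordinates. In a DIY setup the user has to measure this transform themselves; in an integrated product it ships as a factory constant. A sketch with made-up numbers (this is not zSpace's or Vrui's actual API):

```python
import numpy as np

def make_rigid(R, t):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation R
    and a translation vector t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical factory calibration: tracker origin 0.3 m above the
# display center, tracker axes rotated 180 degrees about Y relative
# to the display axes (all numbers purely illustrative).
R = np.array([[-1, 0, 0],
              [ 0, 1, 0],
              [ 0, 0, -1]], float)
T_display_from_tracker = make_rigid(R, [0.0, 0.3, 0.0])

# Every tracked head or stylus position then maps into display
# coordinates with one matrix multiply:
head_tracker = np.array([0.1, 0.0, 0.5, 1.0])   # homogeneous tracker coords
head_display = T_display_from_tracker @ head_tracker
```

The hard part of DIY low-cost VR is not applying this transform but measuring it accurately in the first place; baking it in at the factory removes that step entirely.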

Voilà, I present the zSpace holographic display (see Figure 1).
