On the road for VR part II: Tahoe Environmental Research Center, Incline Village, Lake Tahoe

We have been collaborating with the UC Davis Tahoe Environmental Research Center (TERC) for a long time. Back in — I think — 2006, we helped them purchase a large-screen stereoscopic projection system for the Otellini 3-D Visualization Theater and installed a set of Vrui-based KeckCAVES visualization applications for guided virtual tours of Lake Tahoe and the entire Earth. We have since worked on joint projects, primarily related to informal science education. Currently, TERC is one of the collaborators in the 3D lake science informal science education grant that spawned the Augmented Reality Sandbox.

The original stereo projection system, driven by a 2006 Mac Pro, was getting long in the tooth, and in the process of upgrading to higher-resolution and brighter projectors, we finally convinced the powers-that-be to get a top-of-the-line Linux PC instead of yet another Mac (for significant savings, one might add). While the Ubuntu OS and the Vrui application set had already been pre-installed by KeckCAVES staff back at the home office, I still had to go up to the lake to configure the operating system and Vrui to render to the new projectors, update all the Vrui software, align the projectors, and train the local docents in using Linux and the new Vrui application versions.

My family and I went up to the lake last Sunday, where I spent half the day getting the Nvidia graphics drivers configured to reliably drive three displays (a 1920×1080 desktop monitor as the console, plus the two 1920×1080 BenQ projectors). The problem was twofold: for one, the display is a mixed-use system. It not only runs live Vrui applications, but must also be able to show pre-rendered stereoscopic movies in side-by-side 3840×1080 format. Unlike Vrui’s run-time environment, movie players are generally not configurable at all, and pre-rendered stereo movies are especially unwieldy. So instead of using my standard setup of separate X screens for the console and each projector, I had to use a wide-desktop (TwinView) setup.

Which led to the second problem: in the past, I have found TwinView rather unreliable, as the multi-screen layout can suddenly change if projector cables are unplugged, if one or both projectors happen to be powered off when the X server starts, etc. Worst of all, users can screw up the layout by messing with the graphical nvidia-settings utility. It turns out that Nvidia has recently done some work on this, and I was finally able to get X configured to always start up with the same screen layout, no matter which cables are unplugged or where curious users click inside nvidia-settings. This was crucial, because a change in display layout forces a configuration change in Vrui that is very hard to do remotely: one needs to see the display in stereo to figure out which graphics window has to go where on the desktop. Getting TwinView to work also turned out to be quite convenient, as recent versions of the Gnome desktop react poorly to multiple X screens: panels replicate like bunnies and get plastered all over the desktop like so many browser toolbars. So I ended up with a TwinView (now a slight misnomer) triple-wide 5760×1080 desktop, with the console on the left.

Here’s the trick: forget nvidia-settings (which doesn’t seem to have kept up with the capabilities of the newest drivers, a fact that cost me some more hair before I figured it out), and do everything via xorg.conf. Follow Nvidia’s driver README to set up TwinView, and then insert additional options to make it rock solid: select which display is going to be the console, i.e., will show the login screen, panels, and menus, via TwinViewXineramaInfoOrder, and set up a custom Xinerama layout via nvidiaXineramaInfoOverride so that applications like xine or Vrui will automatically full-screen over both projectors (use xine -F to full-screen using Xinerama). To avoid problems with powered-down or unplugged projectors, use ConnectedMonitor and UseDisplayDevice, disable EDID queries, and provide custom frequency ranges and modelines. A bit more effort than clicking in nvidia-settings, but then a setup like this is not exactly vanilla, especially if it absolutely needs to “just work.”
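To give a flavor, the relevant pieces of xorg.conf end up looking roughly like the sketch below. This is only a sketch: the display device names (CRT-0 for the console monitor, DFP-0 and DFP-1 for the projectors), the frequency ranges, and the modeline are placeholders that depend on the actual hardware, and the exact option names and values should be double-checked against the README of the installed driver version.

    Section "Monitor"
        Identifier   "TheaterDisplays"
        # EDID queries are disabled below, so provide safe frequency ranges
        # and a fixed 1920x1080@60Hz modeline by hand (placeholder values):
        HorizSync    30.0 - 83.0
        VertRefresh  56.0 - 76.0
        Modeline     "1920x1080" 148.50 1920 2008 2052 2200 1080 1084 1089 1125 +hsync +vsync
    EndSection

    Section "Device"
        Identifier  "Nvidia0"
        Driver      "nvidia"
        # Pretend all three outputs are always present, even if a projector is
        # powered down or unplugged when the X server starts:
        Option      "ConnectedMonitor"  "CRT-0, DFP-0, DFP-1"
        Option      "UseDisplayDevice"  "CRT-0, DFP-0, DFP-1"
        # Ignore whatever the displays report about themselves:
        Option      "UseEDID"           "false"
    EndSection

    Section "Screen"
        Identifier    "Screen0"
        Device        "Nvidia0"
        Monitor       "TheaterDisplays"
        DefaultDepth  24
        Option        "TwinView"   "true"
        # Fixed triple-wide 5760x1080 layout: console on the left, then the
        # two projectors side by side:
        Option        "MetaModes"  "CRT-0: 1920x1080 +0+0, DFP-0: 1920x1080 +1920+0, DFP-1: 1920x1080 +3840+0"
        # Show the login screen, panels, and menus on the console:
        Option        "TwinViewXineramaInfoOrder"  "CRT-0"
        # Report the console as one Xinerama screen and the two projectors as a
        # single 3840x1080 screen, so that xine -F and Vrui windows full-screen
        # across both projectors automatically:
        Option        "nvidiaXineramaInfoOverride" "1920x1080+0+0, 3840x1080+1920+0"
    EndSection

With the layout pinned down like this, the X server comes up with the same 5760×1080 desktop no matter what is plugged in or powered on at boot time.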

Thanks to Vrui’s ridiculous level of configurability, the only problem left was to find a movie player that could be scripted to the point where nobody has to manually drag the playback window over to the projector screens and maximize it to make stereo work. This is crucial, because dragging a movie window across side-by-side stereo screens while visitors are already wearing stereo glasses will make their eyeballs explode.

It turns out that modern X software no longer honors hints about where to place its windows (-geometry, rest in peace), so neither the standard Totem video player nor VLC nor ffplay could be used. Totem, by the way, claims to support remote control via the command line, but reliably crashes every single time a remote command is issued. So much for that. I ended up going completely old-school and installing xine, whose configurability almost rivals that of Vrui. There are even options in its GUI to set default playback window positions and sizes, so I was able to write a script that starts xine in fullscreen stereo mode on both projectors, with the control GUI on the console, every time. Ideal for the purpose! Now, xine tends to crash on shutdown, but fortunately that’s only a minor annoyance.
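For illustration, the launcher script can be as simple as the sketch below (saved as, say, /home/terc/bin/play-stereo-movie.sh); the paths and file names are made up, and it assumes that xine’s default playback window position and size have already been saved via its settings GUI as described above.

    #!/bin/bash
    # Play a pre-rendered side-by-side stereo movie across both projectors.
    # Assumes xine-ui is installed and its playback window defaults (position
    # and size on the projector side of the desktop) were saved in xine's GUI.

    MOVIE="${1:-/home/terc/Movies/tahoe-flyover-sbs.mp4}"   # example path

    # -F : full-screen over the Xinerama screen spanning both projectors
    #      (the 3840x1080 region defined by nvidiaXineramaInfoOverride)
    # -p : start playback immediately, so nobody has to hunt for the Play button
    exec xine -F -p "$MOVIE"

The control GUI itself comes up on the console, where its position was saved, so docents never have to touch the projector side of the desktop.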

The second big Sunday task was aligning the two new projectors to exactly fill the display screen, and create the best possible stereo display. But that, or rather the correct approach to doing it, will be the topic for a separate post.

I finished the day by updating Vrui and all Vrui applications, in preparation for training day on Monday.

Monday morning I spent working on input device presets for the individual applications. TERC uses a wireless Logitech WingMan Extreme 3D joystick to control the Vrui applications, so that docents can hold guided virtual tours for larger audiences. The joystick-compatible navigation and interaction tools in the old version of Vrui previously installed in the theater were a little hinky, and a cause of much complaining. But I had been working on much more streamlined tools for a while, and finally had the chance to install and configure them. The catch is that these configurations are another thing that can’t be done remotely: one has to sit in the space with the joystick in hand to get the sensitivity settings just right, which is what I did that day. The new tool structure in Vrui 2 (yes, their software hadn’t been updated in that long) allowed me to put four different tools onto the joystick: a virtual mouse that lets users control the software by interacting with pop-up menus and dialog boxes; a world-in-hand 6-DOF navigation tool to move 3D objects arbitrarily in position and orientation, using all joystick axes and several buttons (ideal for the 3D Earth viewer); an arcade-style airplane “simulator” to fly over large-scale virtual terrain such as the Lake Tahoe DEM in Terrain Viewer or Crusta, or the Tahoe watershed LiDAR scan; and, mostly as an easter egg, Vrui’s built-in helicopter simulator. Users can switch between these tools at any time by pressing any of four buttons on the joystick’s base.

In retrospect, I’m convinced that the new Linux system is much more stable and reliable than the old Mac setup, and also much easier to use for its limited purpose. This display system is an appliance, not a general-purpose computer, and there’s a reason that computing appliances like TiVos are based on Linux rather than Mac OS, and it’s not just that Linux is free. It might be more difficult to set up Linux exactly right in the first place, but once it’s set up correctly, it stays that way. On the new system, the docents find a login screen when they step up to the computer. They type a simple password (which will probably end up taped to the screen, and I don’t want to know about it) and get to a desktop with several icons that they simply double-click to start any of the pre-rendered movies or the pre-installed interactive applications. They can then grab any (or even several) of the configured input devices (keyboard/mouse, joystick, and an additional 6-DOF spaceball that had been there all along but was never used), and they all just work.
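As a concrete example, each of those icons is just a standard freedesktop .desktop launcher along the lines of the sketch below; the names and paths are invented for illustration and point at the hypothetical movie launcher script from above.

    # Hypothetical launcher icon for one of the pre-rendered stereo movies
    [Desktop Entry]
    Type=Application
    Name=Lake Tahoe Flyover (3D)
    Comment=Play the pre-rendered stereo movie across both projectors
    Exec=/home/terc/bin/play-stereo-movie.sh /home/terc/Movies/tahoe-flyover-sbs.mp4
    Icon=xine
    Terminal=false

The pre-installed interactive applications would get equivalent launchers pointing at the corresponding Vrui programs instead of the movie script.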

Granted, being able to control a complex 3D application proficiently with any input device takes practice, but this is not really different from all the other demonstration equipment that the docents deal with on a daily basis. Looking at it, the joystick has fewer buttons and knobs than some of the other lab equipment standing around.

It’s still too early for detailed feedback, but I’m sure that the docents will be much more comfortable with the new system and device setup than with the old, and that the 3-D theater as a whole will be a much better experience for visitors as a result. I hope that the new software will reduce the reliance on pre-rendered stereo movies, which are really not all that impressive anymore (not talking about the content of the movies, just the medium itself).

There’s really only one thing I’m not entirely happy with: TERC’s display screen has a 4:3 aspect ratio, but the new projectors are 16:9. Instead of blowing the sides of the image out past the screen edges and filling the entire screen with 1440×1080 pixels, the decision was made to project only onto the upper three quarters of the screen, at the full 1920×1080 resolution (a 16:9 image spanning the full width of a 4:3 screen covers only (9/16)/(3/4) = 3/4 of its height). This leaves a big chunk of black at the bottom of the screen and reduces the vertical field of view. In practice, the interactive applications are not as immersive as they could be, because viewers now have to look slightly up instead of straight ahead, which makes it harder to perceive the 3D models as real environments that one can travel through. But it’s their display, and that’s that.
