In my detailed how-to guide on installing and configuring Vrui for Oculus Rift and Razer Hydra, I did not talk about installing any actual applications (because I hadn’t released Vrui-3.0-compatible packages yet). Those are out now, so here we go.
If you happen to own a Kinect for Xbox (Kinect for Windows won’t work), you might want to install the Kinect 3D Video package early on. It can capture 3D (holographic, not stereoscopic) video from one or more Kinects, and either play it back as freely-manipulable virtual holograms, or it can, after calibration, produce in-system overlays of the real world (or both). If you already have Vrui up and running, installation is trivial.
Assuming that you installed Vrui as laid out in my guide, open a terminal and enter the following:
> cd src
> wget -O - http://stout.idav.ucdavis.edu/Software/Kinect/Kinect-2.7.tar.gz | tar xfz -
> cd Kinect-2.7
> make && make install
> cd
This will download and unpack the Kinect tarball (.tar.gz archive), enter the source directory, and build and install the package. It will automatically end up in the ~/Vrui-3.0 directory, because it’s considered an add-on to Vrui itself. If you installed Vrui in a different location than ~/Vrui-3.0, replace the make command with:
> make VRUI_MAKEDIR=<path to Vrui>/share/make
> make VRUI_MAKEDIR=<path to Vrui>/share/make install
where <path to Vrui> is the full directory name where you installed Vrui, such as /usr/local/Vrui-3.0. Or you can edit the “VRUI_MAKEDIR := $(HOME)/Vrui-3.0/share/make” line in the makefile; it’s up to you.
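If you go the makefile-editing route, a one-line sed will do it. This is just a sketch (demonstrated on a scratch copy so you can see the effect; the /usr/local/Vrui-3.0 path is only an example — substitute your actual installation directory, then run the same sed against the real makefile):

```shell
# Start from a scratch copy containing the default VRUI_MAKEDIR line:
printf 'VRUI_MAKEDIR := $(HOME)/Vrui-3.0/share/make\n' > makefile.demo
# Rewrite it to point at a Vrui installed in /usr/local/Vrui-3.0 (example path):
sed -i 's|^VRUI_MAKEDIR := .*|VRUI_MAKEDIR := /usr/local/Vrui-3.0/share/make|' makefile.demo
# Show the result:
cat makefile.demo
```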
Ideally you want to calibrate your Kinect intrinsically before using it (intrinsic calibration derives the camera parameters, such as focal length, skew, and lens distortion, and the alignment projection between the depth and color streams), but it works so-so even without calibration, thanks to the factory calibration data stored in the firmware of each Kinect. But for advanced uses, calibration is explained in detail in the Kinect package’s README file, elsewhere on this blog, and in a series of YouTube videos.
There are two main applications: RawKinectViewer and KinectViewer. RawKinectViewer is mostly a calibration and testing utility. It shows the raw depth and color streams of a single Kinect side-by-side and is not only the central utility for intrinsic and extrinsic (placement and orientation of the camera in world space) calibration, but also useful for aiming a Kinect camera at a target object. RawKinectViewer doesn’t require any command line options. By default, it connects to the first Kinect device connected to the host computer. If you have multiple Kinects, you can select which one you want by passing a zero-based index on the command line, such as “RawKinectViewer 2” to connect to the third Kinect device.
KinectViewer is a full 3D viewer for real-time virtual holograms. You connect it to a Kinect camera by passing -c <device index> on the command line, where <device index> is a zero-based index like in RawKinectViewer. You can connect to multiple Kinects by passing multiple -c <device index> options. For example, to connect to the first and third Kinect camera on your computer, run:
> KinectViewer -c 0 -c 2
Being a Vrui application, KinectViewer can be run in Oculus Rift mode, with or without Razer Hydra devices, just like any other:
> KinectViewer -c 0 -c 2 -mergeConfig OculusRift.cfg -mergeConfig RazerHydra.cfg
KinectViewer’s main menu gives you access to a “streamer dialog” for each connected Kinect device. Here you can toggle the streams on/off, capture a background image for depth-based background removal, and change renderer settings.
The main menu also has a “Save Streams…” entry. This one will act as a holographic video recorder, and dump all data streams coming from all Kinects to a set of compressed files with a common prefix selected via a file selection dialog. Later on, you can play back such recorded streams by passing -f <file name prefix> on KinectViewer’s command line. You can then watch the recordings from any angle, point of view, or scale.
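For example, if you saved your streams under a prefix such as ~/KinectRecordings/Capture1 (a hypothetical prefix; use whatever you selected in the file selection dialog), you would play them back like this:

```shell
# Play back a previously recorded holographic video session.
# No Kinect hardware is needed for playback; the data comes from the files.
KinectViewer -f ~/KinectRecordings/Capture1
```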
Nanotech Construction Kit
The Nanotech Construction Kit (or NCK for short) is a great application to get your immersive modeling on. It’s a very simple application, and can be used to build very complex structures with little practice. It’s also easy to install. Again, assuming the default Vrui installation:
> cd src
> wget -O - http://stout.idav.ucdavis.edu/Software/NCK/NCK-1.9.tar.gz | tar xfz -
> cd NCK-1.9
> make INSTALLDIR=$HOME/Vrui-3.0 && make INSTALLDIR=$HOME/Vrui-3.0 install
> cd
For simplicity, this will install NCK right in Vrui’s installation directory. I normally recommend keeping Vrui applications separate from Vrui, but it’s OK here. If you installed Vrui elsewhere, or want to install NCK elsewhere, you know what to do.
To start NCK in Oculus Rift / Razer Hydra mode, type
> NanotechConstructionKit -mergeConfig OculusRift.cfg -mergeConfig RazerHydra.cfg
When the program starts, it shows nothing but an empty box drawn as wireframe. This box is the simulation domain: all building blocks (atoms or base units) will live inside this box. The first thing to do is bind 6-DOF dragging tools to one button on each Hydra handle. I generally recommend using the upper shoulder buttons (“LB” and “RB”). To bind a tool to LB: 1) press and hold LB and a tool selection menu pops up; 2) move the handle to select “6-DOF Dragger” from the “Dragger” submenu; 3) release LB. Repeat the process for RB.
Next, set up the program to create new building blocks when you press LB or RB without touching an existing building block. Press and hold either “2” on the left handle or “1” on the right handle (depending on whether you’re left- or right-handed) and the program’s main menu pops up. From the “Structural Unit Types” submenu, select “Triangle” to build fullerenes like Buckyballs or nanotubes, or “Tetrahedron” to build silica crystals like quartz or feldspar, and release the menu button to confirm. Finally, go to the main menu one more time, and select “Show Unlinked Vertices” from the “Rendering Modes” submenu.
From now on, whenever you press LB or RB in empty space, a new building block will magically appear out of thin air. To pick up and move an existing building block, move a Hydra handle so that the tip of the grey cone pokes the grey building block (it’s easiest to aim for dead center), and press and hold the assigned dragger button on that handle (LB or RB).
It’s tempting to aim for the red spheres when grabbing building blocks, but resist. Those don’t count for picking up building blocks. Instead, the red spheres are unsatisfied bonding sites. To bond two building blocks, move them so that two of their red spheres touch, and they’ll bond and snap into place. That’s really all there is to it.
One important note: the current version of the Nanotech Construction Kit runs synchronously, meaning that there will be a rendering pass after every simulation step. This is not good if your OpenGL driver is set up to synchronize with your display’s vertical retrace, because then there will be at most 60 frames, and therefore 60 simulation steps, per second. Molecules will feel like jelly. Try turning off retrace synchronization. Using the Nvidia driver, for example, open nvidia-settings, go to the “OpenGL Settings” tab, and uncheck the “Sync to VBlank” box. This will make NCK immensely more snappy. And typically there won’t be any problems with image tearing either. The 2.0 version of NCK will finally render asynchronously, so it won’t be a problem any longer.
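If you’d rather not change the global setting in nvidia-settings, the Nvidia driver also honors the __GL_SYNC_TO_VBLANK environment variable, so you can disable retrace synchronization for a single run:

```shell
# Disable sync-to-vblank for this one invocation only (Nvidia driver);
# the global "Sync to VBlank" setting stays untouched.
__GL_SYNC_TO_VBLANK=0 NanotechConstructionKit -mergeConfig OculusRift.cfg -mergeConfig RazerHydra.cfg
```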
CAVE Quake III Arena
This oldie but goodie is the unofficial “Hello, World” of VR. It’s also one of the very few Vrui applications not developed for Vrui from the ground up. Instead, I took the existing CAVE Quake III Arena application developed by VisBox’s Paul Rajlich (who, in turn, adapted Stephen Taylor’s Aftershock renderer), and replaced all the window management and user interface mechanisms with Vrui. I mentioned in a previous post that Vrui, while not a game engine per se, would make great infrastructure for one. CAVE Quake III Arena (which isn’t really a game engine either, but let’s make believe for now) was an early test case that worked out well.
To run this application, you not only need the source code, but also the game files from the original retail release of Quake III Arena. You can probably pick that up for a buck fifty from a yard sale. It’s for sale on Steam as well, but it might not work because Steam probably wraps it in DRM. You’ll need the “raw” pak0.pk3 main game file. I checked Good Old Games, but they don’t seem to have it. If you managed to score a pak0.pk3 file somehow (I still have the disk from when I bought the game back in ’99 — amazingly, the native Linux version came out just three weeks after the Windows version), here’s how to install and run CAVE Quake III Arena:
> cd src
> wget -O - http://stout.idav.ucdavis.edu/Software/CaveQuake/CaveQuake-2.1.tar.gz | tar xfz -
> cd CaveQuake-2.1
> make INSTALLDIR=$HOME/Vrui-3.0 && make INSTALLDIR=$HOME/Vrui-3.0 install
> cd
> cq3a maps/q3dm4.bsp -mergeConfig OculusRift.cfg -mergeConfig RazerHydra.cfg
The Aftershock renderer is not full-featured and somewhat buggy, and as a result some of the original levels don’t work properly. maps/q3dm4.bsp is my personal favorite of the ones that do. Some hero coder should do what I did to Paul’s CAVE Quake III Arena and take the now-released original id Software source code, and graft it on top of Vrui. That would be really neat. After 14 years, Quake III Arena is still a very fun game, and very different from modern shooters.
There are a variety of ways to get around the game levels. The default two-handed 6-DOF navigation tool mapped to the Hydra doesn’t work well at all. You don’t get the feeling of being in a real place (as “real” as a Quake III level can be, anyway) if you can pick up the world with your hands. Instead, use any of the surface-based Vrui navigation tools. But if you’ve already grabbed the world and moved it, it is probably nauseatingly tilted at this point. Simply open the main menu, and select “Reset Navigation” to fix that.
To get started, put the Hydra away for a second and use Vrui’s standard first-person navigation tool. Press “q” to activate it, which will show a minimalistic HUD with a typical FPS reticle and an overhead compass. The HUD is rendered at an appropriate position and distance in the 3D view (which is configurable, of course). It’s supposed to show up green, but there is some OpenGL state management bug deep inside the Aftershock engine that I haven’t fixed, so it shows up black at the beginning. It’ll correct itself later.
Once active, the FPS navigation tool works like, well, an FPS navigation tool. Press w to walk forward, s to backpedal, a or d to strafe, and the space bar to jump. Use the mouse to rotate your view left and right. Moving the mouse up or down will not rotate the view up or down, because that’s explicitly disabled in the OculusRift.cfg configuration file. If you want to check, open that file and find this section:
section FPSNavigationTool
	rotateFactors (48.0, 0.0)
	hudDist 144.0
	hudRadius 72.0
	hudFontSize 2.0
endsection
You can see how the second rotate factor is set to 0.0, because rotating the view up/down with the mouse while wearing an HMD will make you sick. If you want to try, change the number to 64.0, grab a bucket, and go ahead. This section, by the way, is also where you can change the HUD position, layout, and color (by adding a “hudColor (<red>, <green>, <blue>)” tag). For all the other configurable settings, check the “Vrui Tool Configuration File Settings Reference” document in Vrui’s HTML documentation.
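For instance, a green HUD might look like this (the (0.0, 1.0, 0.0) triple is an assumption based on normalized RGB components; check the tool settings reference to confirm the exact format):

```
section FPSNavigationTool
	rotateFactors (48.0, 0.0)
	hudDist 144.0
	hudRadius 72.0
	hudFontSize 2.0
	hudColor (0.0, 1.0, 0.0)
endsection
```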
The second tool to try is the “Valuator Fly and Turn” navigation tool that’s already bound to the joysticks on both Hydra handles. Turn off the FPS tool by pressing q again, and gently push forward on one of the joysticks. You’ll start gliding in the direction in which you’re pointing the handle whose joystick you pushed. Pulling the stick back will move backwards; pushing it left or right will rotate the view in that direction.
Another tool to try is the surface-aligned “Walk & Valuators” tool, which can be mapped to the joysticks on one of the Hydra handles. First you need to unbind the tool already there. Look for the small red square in the lower right-hand corner of your forward view. That’s Vrui’s “tool kill zone”; it’s used to delete unwanted tools. Hold one of the Hydra handles so that the tip of its cone appears in front of the red box, and push that handle’s joystick forward. This will show a small box with text indicating what tool is currently bound to that joystick axis. If you let go of the stick again while still holding the handle over the kill zone, the tool will be destroyed. Now you can assign a new tool by pressing down on the stick (as if it were a button) and holding it. This will open a tool selection menu; from the “Navigation” -> “Surface-aligned Navigation” submenu, select “Walk & Valuators,” and let go of the stick. This will bring up a dialog prompting you to assign the left/right function. Push the same stick to the right, and let it go again. You’re now prompted for the forward/backward function; push the stick forward and let go. Finally, you’ll see a “Jetpack” function; ignore it, and finish tool creation by pressing down on the stick again.
You can activate this new tool by pressing down on the stick. This will show a similar HUD, and if you look at your feet, you’ll see two concentric circles and a wedge. To move around, push the stick in any direction, and you’ll glide in that direction. To rotate your view, look left or right. The more you look to your left, the faster the world will rotate to your right, and vice versa. The bottom line is that whatever direction in the world you want to look will end up aligned with the ideal forward direction, i.e., facing your keyboard and monitor. The movement speed, rotation speed, and angle dead zone can all be configured via OculusRift.cfg.
The last thing you can do is create a weapon. Assign a “Locator” -> “6-DOF Locator” tool to any button on one of the Hydra handles. This will probably create a shimmering purple box that’s attached to your hand (and, incidentally, it will turn the HUDs green, and the main menu grey, as they should be). Since you don’t want a box, delete the tool again by holding the Hydra handle over the tool kill zone and pressing the same button. Every time you bind a 6-DOF locator, you will cycle through the available weapon models. After a few iterations, you might end up with a shotgun, or a BFG. Depending on the level, not all these models will be defined, which is where the purple boxes come from. Stupid bug.
The weapons don’t do anything, but they’re still fun to wave around. You can also create a more elegant weapon, for a more civilized age, by binding a “Pointer” -> “Jedi Tool” to an arbitrary button. Pressing that button afterwards will toggle the tool on/off. Sound effects to be supplied by the user.