I got interested in remote collaboration because I hate traveling, so it’s somewhat funny that I’ll be traveling all over the place in the near future to install remote collaboration-capable immersive display systems. I guess I brought it upon myself.
The first stop in the grand low-cost VR world tour was UC Santa Barbara, where I just finished installing a system following our blueprint, with some updates to account for the inexorable march of technology. Lorraine Lisiecki, in UC Santa Barbara’s Department of Earth Science, is one of the collaborators in our NSF grant on studying paleoclimate, and is leading one of the two remote sites that will be equipped with one of our “holo-phone” prototypes; the other is Woods Hole Oceanographic Institution on Cape Cod, where I’ll travel next.
Here’s a video showing what it’s like to use a “holo-phone:”
In more detail, I installed a Sharp Aquos 70″ LCD 3D TV; a NaturalPoint OptiTrack three-camera optical tracking system using their TrackingTools software; a Nintendo Wiimote as input device; a PC with an Intel Core i7 “Ivy Bridge” CPU, an Nvidia GeForce GTX 680, 8GB RAM, and a 128GB SSD, running 64-bit Fedora 17, to drive the display; and a little Dell PC running Windows 7 to run the tracking system (unfortunately, the tracking software runs neither under Wine nor in a virtual machine).
One of the biggest concerns going into this installation was how we would fit a 70″ 3D TV into a small faculty office. It’s not just that the TV itself is huge, but the tracking system cameras need to be mounted relatively far away from the TV so that their tracking volume covers the entire workspace in front of it. In the past, I have relied on high ceilings and mounted the cameras straight above the left and right edges of the TV and above the center, but that wasn’t an option here due to the low 8.5 ft ceiling in Lorraine’s office.
My solution was to push the left and right cameras further out, so that they look diagonally across the TV screen instead of straight down (see Figure 1). That caused some extra trouble later, when designing “tracking antlers” for the tracked 3D glasses and the Wiimote, but it really helped increase the tracking volume (we managed to track across the entire screen surface), and had the side benefit of slightly increasing tracking accuracy due to the larger stereo reconstruction baseline.
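For a rough sense of why the wider baseline helps, here is a quick back-of-the-envelope sketch using the standard first-order error model for stereo triangulation. The focal length, marker detection error, and baseline values are made-up stand-ins for illustration, not measurements from the actual OptiTrack cameras:

```python
# Rough sketch of why a wider camera baseline improves tracking accuracy.
# All numbers below are illustrative assumptions, not measurements from
# the actual OptiTrack setup.

def depth_error(z, baseline, focal_px, disparity_error_px):
    """First-order depth error of stereo triangulation:
    sigma_z ~ z^2 / (focal * baseline) * sigma_disparity."""
    return z**2 / (focal_px * baseline) * disparity_error_px

focal_px = 800.0   # assumed camera focal length in pixels
sigma_d  = 0.1     # assumed sub-pixel marker detection error
z        = 2.0     # distance from cameras to user, in meters

for baseline in (1.5, 2.5):  # cameras above the TV edges vs. pushed further out
    print(f"baseline {baseline} m -> depth error "
          f"{depth_error(z, baseline, focal_px, sigma_d) * 1000:.2f} mm")
```

With these (assumed) numbers, pushing the cameras out drops the reconstruction error from roughly 0.33 mm to 0.20 mm at 2 m distance; the exact values don’t matter, but the inverse relationship between baseline and depth error does.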
Another, secondary, worry was the stereo quality of the TV. I had only seen this particular TV model in a store before, and due to the environment and the low-depth, low-contrast 3D content typically shown in store demos, it is impossible to judge how well a 3D TV will actually work once installed.
Stereo quality is primarily determined by the amount of cross-talk (or “ghosting”) between the left-eye and right-eye views. Noticeable cross-talk interferes with the brain’s ability to fuse stereo pairs into 3D perception. LCD-based active-stereo 3D TVs such as this Sharp are problematic because LCD pixels are relatively slow to switch from a full-on to a full-off state. In a high-contrast stereo pair, a pixel might have to switch from full-black to full-white 120 times per second. While that doesn’t sound so bad, consider that in order to avoid cross-talk, a pixel that is white in one view and black in the other must look exactly as black as its neighboring pixel that’s black in both images, and exactly as white as its other neighboring pixel that’s white in both images. That means the pixel must complete its switch during the short time the active stereo glasses take for their own switch from opaque to transparent, or there will be a perceived brightness difference. The bottom line is that LCDs simply can’t do it, and the resulting stereo quality differs unpredictably between manufacturers, models, and potentially even individual units of the same model. The work-arounds common in 3D movies, low contrast and limited eye separation, don’t work in immersive graphics: eye separation is directly determined by the user’s eye positions and is not a free parameter, and high contrast is an important factor for effective scientific visualization.
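Some quick numbers to put the pixel switching problem in perspective; the response time below is an assumed ballpark figure, not a measured value for this particular Sharp panel:

```python
# Back-of-the-envelope numbers for the LCD switching problem described
# above. The response time is an assumed ballpark figure, not a
# measurement of this particular panel.

refresh_hz = 120                  # active stereo alternates eyes at 120 Hz
frame_ms   = 1000.0 / refresh_hz  # time each eye's image is on screen

# Full black <-> full white LCD transitions commonly take several ms;
# the "gray-to-gray" numbers in marketing material are much lower.
assumed_response_ms = 5.0

# Fraction of each frame during which the pixel still shows a mix of the
# previous (wrong-eye) image -- that residue is perceived as cross-talk.
print(f"per-eye frame time  : {frame_ms:.2f} ms")
print(f"pixel still settling: {assumed_response_ms / frame_ms:.0%} of each frame")
```

At 120 Hz each eye’s image is only on screen for about 8.3 ms; if a full black-to-white transition takes anywhere near 5 ms, the pixel spends more than half of each frame showing a blend of the two views.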
That said, the stereo quality of this particular TV turned out to be OK, but not great. It is about on the same level as other LCD 3D TVs I’ve worked with, on par with passive-stereo 3D TVs (where cross-talk is caused by imperfect polarization, not pixel switch time), and significantly worse than DLP-based 3D projectors or projection TVs, where pixel switch time is simply not an issue and what little cross-talk there is comes from the 3D glasses. In the Nanotech Construction Kit, cross-talk manifested itself as a marble-like texture on the atomic building blocks, which looked strange but actually worked OK. In applications with point- or line-based rendering, white points or lines on a black background showed borderline-worrisome cross-talk.
One unexpected issue with the 3D TV was that it refused to accept the 1080p frame-packed HDMI 1.4a video signal I tried to send it. Since the TV doesn’t support Quincunx (“checkerboard”) interleaved stereo either, I had to fall back to a top-to-bottom stereo signal, which packs both eyes’ views into a single 1080-line frame and thus halves the vertical resolution per eye, causing anisotropy effects when using point or line primitives. Not ideal, but workable, and a firmware upgrade might fix it.
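To make the resolution hit concrete, here is a trivial sketch of the per-eye resolution in the two modes; the numbers follow directly from the HDMI stereo formats mentioned above:

```python
# Effective per-eye resolution of the stereo modes mentioned above.
# Frame-packed HDMI 1.4a sends two full 1080p images per frame;
# top-to-bottom squeezes both eyes into a single 1920x1080 frame.

full_w, full_h = 1920, 1080

frame_packed = (full_w, full_h)       # full resolution per eye
top_bottom   = (full_w, full_h // 2)  # each eye gets half the lines

print("frame-packed per eye :", frame_packed)
print("top-to-bottom per eye:", top_bottom)

# Each top-to-bottom pixel gets stretched to twice its height on screen,
# so nominally round points and uniform-width lines render anisotropically.
print("vertical stretch factor:", full_h / top_bottom[1])
```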
Apart from that, there were the usual off-site installation issues, requiring several trips to the hardware or computer store to solve. There’s always a missing cable or power strip, or a small broken widget. What took the cake in this installation was the mouse that shipped with the Dell tracking PC: someone at Dell must have clipped the mouse cable with a set of pliers (see Figure 2); the broken mouse was wrapped in a cut plastic bag inside an undamaged cardboard box, together with the keyboard.
It took the first day to assemble the main PC and install Linux and all the goodies on it, and to install the tracking software on the Dell PC. Added difficulty: the tracking software insisted on calling home during installation, but getting a PC onto the network at UCSB requires intervention from IT support staff, who had left for the day. So I needed to set up ad-hoc NAT on the main PC, which only had a single network interface card. After an unsuccessful trip to Best Buy to buy a second network card, I found a way to do NAT over a single interface (whew!), sketched below.
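For the record, the single-interface trick boils down to giving the one NIC a second, private address and masquerading the tracking PC’s traffic back out the same interface it came in on. Here is a rough reconstruction as a Python sketch; the interface name, addresses, and subnet are placeholders, not the actual values used at UCSB:

```python
#!/usr/bin/env python3
# Sketch of single-interface NAT, reconstructed from general Linux
# networking -- interface name, addresses, and subnet are placeholder
# assumptions, not the actual UCSB values. Must run as root.
import subprocess

IFACE   = "em1"          # the main PC's single NIC (assumed name)
PRIVATE = "10.0.0.0/24"  # assumed private subnet for the tracking PC
ALIAS   = "10.0.0.1/24"  # main PC's address on that private subnet

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Give the single NIC a second, private address.
run(["ip", "addr", "add", ALIAS, "dev", IFACE])
# Let the kernel forward packets between the two subnets.
run(["sysctl", "-w", "net.ipv4.ip_forward=1"])
# Rewrite the tracking PC's packets to the main PC's campus address,
# sending them back out through the very same interface they arrived on.
run(["iptables", "-t", "nat", "-A", "POSTROUTING",
     "-s", PRIVATE, "-o", IFACE, "-j", "MASQUERADE"])
# The tracking PC then gets a static 10.0.0.x address and uses
# 10.0.0.1 as its default gateway.
```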
The second day was spent mounting the TV and tracking cameras on Lorraine’s office wall, with help from friendly staff (thanks Tim!), and then aligning the tracking cameras and internally calibrating the tracking system. The last part of that was calibrating the Vrui side of the system using a Leica TCR407 power total station, with help from Lorraine’s grad student Carlye. The real trouble started just when I thought I was done for the day, ranging from unexpected problems with the calibration calculations to random X server lock-ups and crashes caused by the Nvidia graphics driver. Fortunately, I found the cause of the former and a work-around for the latter.
The calibration issues were caused by the TV’s stupid default setting in which incoming native 1920×1080 HDMI images are zoomed, as for overscan, and then resampled back down to the panel’s 1920×1080 pixels. Who ever thought this was a good idea? The result was that we measured the zoomed 2D test pattern, but because 3D mode doesn’t zoom, the measured screen size didn’t match the actual displayed image.
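To illustrate how badly even a modest zoom throws off a physical measurement, here is a toy calculation; the 5% zoom factor and the screen dimensions are assumptions for illustration, not measured values for this TV:

```python
# Rough illustration of the calibration mismatch caused by the TV's
# default zoom ("overscan") mode. The 5% zoom factor is a typical
# assumed value; I don't know this panel's actual factor.

screen_w_mm, screen_h_mm = 1550.0, 870.0  # approx. visible area of a 70" 16:9 TV
zoom = 1.05                               # assumed overscan zoom factor

def pattern_point_on_screen(u, v, zoom):
    """Where a test-pattern point at normalized position (u, v), with
    (0.5, 0.5) at the center, physically lands on the zoomed screen."""
    x = (0.5 + (u - 0.5) * zoom) * screen_w_mm
    y = (0.5 + (v - 0.5) * zoom) * screen_h_mm
    return x, y

# A point meant to sit at the screen's left edge shifts by almost 4 cm:
x_zoomed, _ = pattern_point_on_screen(0.0, 0.5, zoom)
x_native, _ = pattern_point_on_screen(0.0, 0.5, 1.0)
print(f"edge point shift: {x_native - x_zoomed:.1f} mm")
```

With these assumed numbers, points near the screen edges move by almost 4 cm, which completely swamps a calibration aiming for millimeter accuracy.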
So we had to re-do the screen part of the calibration on the third day; fortunately, I hadn’t moved or taken down the Total Station yet, so we didn’t have to re-do the other two parts. The rest of the third day was spent cleaning up by moving the computers to their temporary “final” position (see Figure 3), sorting out several odds and ends, installing support scripts and launcher icons, and, most importantly, training the new users (Lorraine and Carlye, see Figure 4).
In the final analysis, it took three full days (8:15am – 10:15pm, 7:30am – 2:00am, 9:15am – 5:00pm) to build, install, and calibrate the system entirely from scratch and give the users enough initial training to get them going, without any site preparation except having the components already on site. There are a few outstanding issues, such as the Wiimote requiring a press of its reset button instead of 1+2 to connect, but those can be solved remotely in the future.
It turned out to be a very good system, with OK stereo quality, ample screen space, a very good tracking volume, and excellent tracking calibration (no more than 1mm discrepancy between tracking marker and virtual indicator over the entire space). The TrackingTools software has improved a lot since the early versions I use in my own system: after moving away from Euler angles to represent orientations, it no longer suffers from gimbal lock, and performance and accuracy have increased overall. The software is a lot more automated as well, requiring literally zero interaction after double-clicking the icon to get up and running.
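I don’t know TrackingTools’ actual internals, but the gimbal lock problem itself is easy to demonstrate: with Z-Y-X Euler angles, a pitch of 90° collapses yaw and roll onto the same physical axis, so the representation loses a degree of freedom exactly when a tracked object points straight up or down. A minimal numpy illustration:

```python
import numpy as np

# Why Euler angles are a poor internal representation for a tracker:
# at 90 degrees pitch (Z-Y-X convention), yaw and roll rotate about the
# same physical axis, so distinct angle triples yield identical orientations.

def rot_zyx(yaw, pitch, roll):
    """Rotation matrix from Z-Y-X Euler angles (radians)."""
    cy, sy = np.cos(yaw),   np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll),  np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

a = rot_zyx(np.radians(30), np.radians(90), np.radians(0))
b = rot_zyx(np.radians(0),  np.radians(90), np.radians(-30))
print(np.allclose(a, b))  # True: one degree of freedom has vanished
```

Quaternions (or rotation matrices) don’t have this degeneracy, which is presumably why the switch fixed the problem.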
Having run into a lot of new issues, I’m now hoping that knowing about them ahead of time will make the next off-site installation go more smoothly and take less time. Here’s hoping.