Archaeologists use LiDAR to find lost cities in Honduras

I wasn’t able to talk about this before, but now I guess the cat’s out of the bag. About two years ago, we helped a team of archaeologists and filmmakers use the CAVE to visualize a very large, high-resolution aerial LiDAR scan of a chunk of dense Honduran rain forest. Early analyses of the scan had found evidence of ruins hidden under the foliage, and using LiDAR Viewer in the CAVE, we were able to get a closer look. The team recently mounted an expedition and found the untouched remains of not one, but two lost cities in the jungle. Read more about it at National Geographic and The Guardian. I want to say something cool and Indiana Jones-like right now, but I won’t.

Figure 1: A “were-jaguar” effigy, likely representing a combination of a human and spirit animal, is part of a still-buried ceremonial seat, or metate, one of many artifacts discovered in a cache in ruins deep in the Honduran jungle.
Photograph by Dave Yoder, National Geographic. Full-resolution image at National Geographic.


I Can’t Ever Get Over This Mars Thing, Can I?

I have talked about KeckCAVES’ involvement in the Curiosity Mars rover mission several times before, but I just found a set of cool pictures that I have not shared yet. I recently saw a reddit thread about a VR application to walk on the Moon; one commenter asked about doing the same for Mars, and one thing led to another.

Can an application like that be done for Mars? Do we have enough data, and are the data publicly available? The answers are “yes, already done,” “kinda,” and “yes, but,” respectively.

As of my last checking, there are two main sources of topography data for Mars. The older source is an orbital laser range survey done by the Mars Orbiter Laser Altimeter (MOLA). This is essentially a planetary LiDAR scan and can be visualized using LiDAR Viewer. The two pictures I mentioned above are these (Figures 1 and 2):

Figure 1: Global visualization of Mars topography using the MOLA data set, rendered using LiDAR Viewer. Vertical scale is 5:1.

Figure 2: Close-up of global Mars topography data set (centered on the canals), showing individual laser returns as grey dots. The scan lines corresponding to individual orbital periods can clearly be identified. Vertical scale is 5:1.
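If you want to try something similar yourself, here is a rough sketch (not my actual pipeline) of how one might turn a gridded MOLA topography product into a 3D point cloud suitable for a point-cloud viewer’s plain-text importer. The file name, grid dimensions, and data layout below are assumptions about the 4 pixel/degree global MEGDR product; check the PDS label file that ships with whatever product you download before trusting them.

```python
# Minimal sketch: convert a gridded MOLA topography product into a 3D point cloud.
# Assumptions (verify against the PDS .lbl label that accompanies the file):
#   - megt90n000cb.img is the 4 pixel/degree global MEGDR grid,
#     1440 x 720 big-endian 16-bit signed integers, meters relative to the areoid.
import numpy as np

ROWS, COLS = 720, 1440          # 4 pixels per degree, whole planet
MARS_RADIUS = 3389.5e3          # approximate mean radius in meters
VERTICAL_EXAGGERATION = 5.0     # same 5:1 vertical scale as in the figures
STEP = 4                        # subsample to keep the output file manageable

topo = np.fromfile("megt90n000cb.img", dtype=">i2").reshape(ROWS, COLS)

# Cell-center latitudes (90N..90S) and east longitudes (0..360)
lat = np.deg2rad(90.0 - (np.arange(ROWS) + 0.5) / 4.0)
lon = np.deg2rad((np.arange(COLS) + 0.5) / 4.0)
lon2d, lat2d = np.meshgrid(lon, lat)

# Exaggerated radius per grid cell, then spherical-to-Cartesian conversion
r = MARS_RADIUS + VERTICAL_EXAGGERATION * topo.astype(np.float64)
x = r * np.cos(lat2d) * np.cos(lon2d)
y = r * np.cos(lat2d) * np.sin(lon2d)
z = r * np.sin(lat2d)

pts = np.column_stack([a[::STEP, ::STEP].ravel() for a in (x, y, z)])
np.savetxt("mola_points.xyz", pts, fmt="%.1f")   # plain "x y z", one point per line
```

The resulting text file is just one point per grid cell; a real pipeline would keep the full resolution and let the viewer’s preprocessor build its own spatial data structure.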


More on Desktop Embedding via VNC

I started regretting uploading my “Embedding 2D Desktops into VR” video, and the post describing it, pretty much right after I did it, because there was such an obvious thing to do, and I didn’t think of it.

Figure 1: Screenshot from video showing VR ProtoShop running simultaneously in a 3D environment created by an Oculus Rift and a Razer Hydra, and in a 2D environment using mouse and keyboard, brought into the 3D environment via the VNC remote desktop protocol.


2D Desktop Embedding via VNC

There have been several discussions on the Oculus subreddit recently about how to integrate 2D desktops or 2D applications with 3D VR environments; for example, how to check your Facebook status while playing a game in the Oculus Rift without having to take off the headset.

This is just one aspect of the larger issue of integrating 2D and 3D applications, and it reminded me that it was about time to revive the old VR VNC client that Ed Puckett, an external contractor, had developed for the CAVE a long time ago. There have been several important changes in Vrui since the VNC client was written, especially in how Vrui handles text input, which means that a completely rewritten client could use the new Vrui APIs instead of having to implement everything ad-hoc.
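The heart of any such client is simple in principle: pull the remote desktop’s framebuffer across the network via the RFB (VNC) protocol and hand it to the renderer as a texture to be mapped onto a virtual screen. The real client is written in C++ against Vrui’s APIs; the following is only a minimal Python sketch of the capture side, assuming the third-party vncdotool and Pillow packages, with a placeholder host name and password.

```python
# Minimal sketch of the capture side of a desktop-in-VR client: grab the remote
# framebuffer over VNC and turn it into a pixel array that a renderer could
# upload as a texture. "myhost:1" and the password are placeholders.
import numpy as np
from PIL import Image
from vncdotool import api

client = api.connect("myhost:1", password="secret")  # RFB handshake handled by the library
client.captureScreen("desktop.png")                  # one full framebuffer grab
client.disconnect()

pixels = np.asarray(Image.open("desktop.png"))       # H x W x channels, ready for texture upload
print(pixels.shape)

# A real client would instead request incremental framebuffer updates in a loop
# and upload only the changed rectangles into the texture of the virtual screen.
```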

Here is a video showing the new VNC client in action, embedded into LiDAR Viewer and displayed in a desktop VR environment using an Oculus Rift HMD, mouse and keyboard, and a Razer Hydra 6-DOF input device:


KeckCAVES in the News

A cluster of earthquakes always gets the news media interested in geology, at least for a short time, and Monday’s magnitude 4.4 quake in southern California, following last week’s series of north coast quakes of up to magnitude 6.9, was no different. Our local media’s go-to guy for earthquakes and other natural hazards is Dr. Gerald Bawden of the USGS in Sacramento. Gerald also happens to be one of the main users of the KeckCAVES visualization facility and KeckCAVES software, so he gave an interview to our local Fox affiliate in the CAVE, “to get out of the wind,” as he put it.

Here’s the video. Caution: ads after the jump.


An Early Easter Egg

I always love it when an image I made, or a photograph I took, pops up in an unexpected context. I have recently been drafted for a campus committee, the Campus Council for Information Technology, and the guest speaker in today’s meeting, Patrice Koehl, gave a presentation on Big Data, data analytics, the need for data centers, the lack of collaboration between computer scientists and other scientists, etc. Among his slides was this one (apologies for the lousy quality; I took the picture with my laptop’s built-in camera):

Figure 1: Snapshot from today’s CCFIT meeting, showing Patrice Koehl, one of his slides, and one of my pictures on one of his slides (indicated by red frame).


When the novelty is gone…

I just found this old photo on one of my cameras, and it’s too good not to share. It shows former master’s student Peter Gold (now in the PhD program at UT Austin) working with a high-resolution aerial LiDAR scan of the El Mayor-Cucapah fault rupture after the April 2010 earthquake (here is the full-resolution picture, for the curious).

Figure 1: Former master’s student Peter Gold in the CAVE, analyzing a high-resolution aerial LiDAR scan of the El Mayor-Cucapah fault rupture.


Oh, the places you’ll go!

Hey look! The cover of Microscopy Today features a still frame from an animated visualization I created of a CAT scan of a fragment of the meteorite that landed close to Sutter’s Mill in Northern California almost a year ago. Here’s a link to the original post I wrote back in December 2012, and because it’s really pretty, and all grown up and alone out there in the world now, here is the picture in question again:

Figure 1: X-ray CT scan of Sutter’s Mill meteorite fragment.

To quickly recap from my original post: the CAT scan of this meteorite fragment was taken at the UC Davis Center for Molecular and Genomic Imaging and then handed to me for visualization by Prof. Qing-zhu Yin of the UC Davis Department of Geology. The movies I made were to go along with the publication of Qing-zhu and his co-authors’ paper in Science.

I thought I did a really good job with the color map, given that that’s not normally my forte. The icy-blue-to-dark-blue gradient nicely brings out the fractures in the crust, and the heavy-element inclusions stand out prominently in gold (Blue and gold? UC Davis? Get it?). You can watch the full video on YouTube. I’d link to Qing-zhu’s own copy of the video, but it has cooties, I mean ads on it, eww.
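If you’re curious what such a color map might look like in code, here is a small illustrative sketch in matplotlib. The control points and hex colors are guesses in the spirit of the map described above, not the actual values used for the video, and the “CT slice” is synthetic.

```python
# Illustrative "blue and gold" color map: an icy-blue-to-dark-blue ramp for the
# bulk rock, ramping into gold at the high end for dense heavy-element inclusions.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import LinearSegmentedColormap

blue_gold = LinearSegmentedColormap.from_list(
    "blue_gold",
    [(0.00, "#eaf6ff"),   # icy blue: low density (fractures, voids)
     (0.70, "#0a2a6b"),   # dark blue: denser bulk material
     (0.85, "#b8860b"),   # transition into gold as density climbs
     (1.00, "#ffd700")])  # bright gold: heavy-element inclusions

# Fake normalized "CT slice" just to demonstrate the mapping
density = np.random.default_rng(0).random((256, 256))
plt.imshow(density, cmap=blue_gold)
plt.colorbar(label="normalized CT density")
plt.show()
```

In a volume renderer the same idea becomes a transfer function, with opacity as well as color assigned to each density value, but the color ramp is the part that gives the image its look.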

And as can be seen in a full-page ad on page 31 of the same issue of Microscopy Today, my picture, no doubt by virtue of the 3D meteorite fragment scan shown in it, was apparently one of the winners in a “coolest thing you’ve never seen” contest held by the company that made the X-ray CT scanner. My little picture is Miss September 2013. Hooray, I guess?

Low-cost VR for materials science

In my ongoing series on VR’s stubborn refusal to just get on with it and croak already, here’s an update from the materials science front. Lilian Dávila, former UC Davis grad student and now professor at UC Merced, was recently featured in a three-part series about cutting-edge digital research at UC Merced, produced by the PR arm of the University of California. Here’s the 10-minute short focusing on her use of low-cost holographic displays for interactive design and analysis of nanostructures:


Of CAVEs and Curiosity: Imaging and Imagination in Collaborative Research

On Monday, 03/04/2013, Dawn Sumner, one of KeckCAVES’ core members, gave a talk in UC Berkeley’s Art, Technology, and Culture lecture series, together with Meredith Tromble of the San Francisco Art Institute. The talk’s title was “Of CAVEs and Curiosity: Imaging and Imagination in Collaborative Research,” and it can be viewed online (1:12:55 total length; 50 minutes of talk and 25 minutes of lively discussion).

While the talk is primarily about the “Dream Vortex,” an evolving virtual reality art project led by Dawn and Meredith involving KeckCAVES hardware (CAVE and low-cost VR systems) and software, Dawn also touches on several of her past and present scientific (and art!) projects with KeckCAVES, including her work on ancient microbialites, exploration of live stromatolites in ice-covered lakes in Antarctica, our previous collaboration with performing artists, and, most recently, her leadership role with NASA’s Curiosity Mars rover mission.

The most interesting aspect of this talk, for me, was that the art project, and all the software development for it, are done by the “other” part of the KeckCAVES project: the more mathematics- and complex systems-oriented cluster around Jim Crutchfield of UC Davis’ Complexity Sciences Center and his post-docs and graduate students. In practice, this means that I saw some of the software for the first time, and also heard about some problems the developers ran into that I was completely unaware of. This is interesting because it means that the Vrui VR toolkit, on which all this software is based, is maturing from a private pet project to something that’s actually being used by parties who are not directly collaborating with me.