It’s funny: suddenly the idea of virtual sculpting, or virtual clay modeling, using 3D input devices is popping up everywhere. The developers behind the Leap Motion stated it as their inspiration for developing the device in the first place, and I recently saw a demo video; Sony has recently been showing off a sculpting demo for the upcoming PlayStation 4; and I’ve just returned from an event at the Sacramento Hacker Lab, where Intel was trying to get developers excited about their version of the Kinect, or what they call “perceptual computing.” One of the demos they showed was, you guessed it, virtual sculpting (another demo was 3D video of a person embedded into a virtual office; now where have I seen that before?).
So I decided a few days ago to dust off an old toy application (I last showed it in my 2007 Wiimote hacking video), a volumetric virtual “clay” modeler with real-time isosurface extraction for visualization, and run it with a Razer Hydra controller, which supports bimanual 6-DOF interaction, a pretty ideal setup for this sort of thing:
I left the 3D TV in mono mode to make the video look better, but that also made the program somewhat harder to use. That is not an excuse for my pitiful modeling skills, though; those are all on me. If you want to see what actually talented people can achieve with a relatively simple tool like this, watch the PS4 demo video I linked above. It’s hard to tell with their quick cuts and blending between different models, but the only tools I could make out were straight material deposition and subtraction, and maybe smoothing; I don’t think I saw dragging at all. But they appear to have color painting, which is a definite plus. I need to add that.
I didn’t know about the Intel event at the Hacker Lab when I recorded and uploaded the video; that was literally two hours before I left to go there. The reason I made the video now, besides adding my $0.02 to the conversation, is that I finally publicly released the sculpting application. It’s now part of the example programs bundled with the newest version of the Vrui VR development toolkit (version 2.7-001). I had mostly forgotten about it, but with the current buzz, maybe someone artistically inclined will pick it up and do something cool with it. To help with that, I added an option to export the current isosurface as a mesh file in PLY format, so it can be imported into 3D modeling applications such as Blender.
From the technical side, the idea behind the application is quite simple. It is a 3D analog of 2D painting programs like Photoshop or the GIMP: instead of depositing paint onto a 2D canvas, one deposits material into a 3D grid. What is actually visible on the screen is not the 3D blob of material itself, but its boundary surface: technically, the level set or isosurface of all points inside the 3D grid’s domain that have a fixed material density value c (in this particular case, c=0.5).
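To make that concrete, here is a minimal sketch (not the application’s actual code; the flat n×n×n grid layout and function names are my own for illustration) of how densities sampled at grid vertices become a continuous field via trilinear interpolation, with the sculpted solid being everything at or above the isovalue c=0.5:

```cpp
#include <cmath>
#include <vector>

// Sample the continuous density field at point (px,py,pz) by trilinearly
// interpolating the eight vertex densities of the surrounding grid cell.
// The grid is n*n*n vertex densities stored flat, index (z*n+y)*n+x.
float sampleDensity(const std::vector<float>& d, int n,
                    float px, float py, float pz)
{
    int x0 = (int)std::floor(px), y0 = (int)std::floor(py), z0 = (int)std::floor(pz);
    float fx = px-x0, fy = py-y0, fz = pz-z0;
    auto at = [&](int x, int y, int z) { return d[(z*n + y)*n + x]; };
    // Interpolate along x, then y, then z:
    float c00 = at(x0,y0  ,z0  )*(1-fx) + at(x0+1,y0  ,z0  )*fx;
    float c10 = at(x0,y0+1,z0  )*(1-fx) + at(x0+1,y0+1,z0  )*fx;
    float c01 = at(x0,y0  ,z0+1)*(1-fx) + at(x0+1,y0  ,z0+1)*fx;
    float c11 = at(x0,y0+1,z0+1)*(1-fx) + at(x0+1,y0+1,z0+1)*fx;
    float c0 = c00*(1-fy) + c10*fy;
    float c1 = c01*(1-fy) + c11*fy;
    return c0*(1-fz) + c1*fz;
}

// The solid blob is { p : density(p) >= c }; the visible isosurface is its
// boundary, where density(p) == c.
bool inside(const std::vector<float>& d, int n, float px, float py, float pz)
{
    return sampleDensity(d, n, px, py, pz) >= 0.5f; // c = 0.5
}
```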
The nice thing about representing surfaces as level sets of 3D density fields is that, once the heavy lifting of extracting the surface from the density field in real time is taken care of, the surface becomes easy to manage, and operations on it can be very intuitive. For example, it is very easy to change the topology of a level set surface, say by drilling a hole all the way through a solid blob. The connectivity of the surface is implied by the grid, which makes it trivial to fuse different parts of the surface automatically during editing operations. One can take one blob and cleave it into two or any number of pieces, take two or more pieces and join them by drawing a connection between them, and so forth.
But the real power of this sculpting paradigm comes from the operations that can be applied to level set surfaces. Instead of working with them in terms of vertices, edges, or faces, one can simply create new surfaces out of thin air by depositing material, erase existing material, smooth the surface locally by applying a diffusion or low-pass filter to the underlying 3D density field, or pull, push, or mold the surface by advecting the density field along the moving direction of an input device. These are just the four simplest operations, the ones currently implemented in my sculpting application; many more could be added.
In more detail, new material is deposited by enumerating all grid vertices that are inside the tool shape (currently a sphere of selectable radius) and setting those vertices’ density values to a constant larger than c. Material is erased the same way, but with a constant smaller than c. Doing it straight up, alas, leads to severe aliasing: the resulting surface looks faceted (see the Leap Motion and Intel demos). This is a very common misconception: most people think that level set surfaces must look faceted because they are defined on a discrete grid. Not so; the faceted look is an artifact of doing it wrong.
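The straight-up approach might look like this hypothetical sketch (same flat n×n×n grid as before, grid units, not the application’s actual code); the binary jump from 0 to 1 across a single voxel is exactly what causes the faceting:

```cpp
#include <vector>

// Naive deposition: every grid vertex inside the spherical tool gets a
// constant density above the isovalue c=0.5. The density field jumps from
// 0 to 1 within one voxel, so the extracted isosurface looks faceted.
void depositHard(std::vector<float>& d, int n,
                 float cx, float cy, float cz, float r)
{
    for (int z = 0; z < n; ++z)
        for (int y = 0; y < n; ++y)
            for (int x = 0; x < n; ++x) {
                float dx = x-cx, dy = y-cy, dz = z-cz;
                if (dx*dx + dy*dy + dz*dz <= r*r)
                    d[(z*n + y)*n + x] = 1.0f; // assign 0.0f instead to erase
            }
}
```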
The trick is to create a “fudge zone” around the boundary of the tool shape, where assigned density values are not constant, but proportional to each point’s distance from the tool’s surface. If the fudge zone is a few grid voxels wide, it effectively removes aliasing. Note the sphere I create in the video: it looks like a perfect sphere, not a sphere made from Lego bricks. Mathematically speaking, adding such a fudge zone turns the shape function of the tool from an infinite-bandwidth function into a band-limited function, which can be sampled without artifacts. Artifacts only creep back in when the tool shape shrinks to approximately the size of the grid’s voxels (which I show later in the video, when drilling very small holes).
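A sketch of the fudge-zone version, under the same assumptions as the previous snippet: densities ramp linearly from 1 to 0 across a zone of width w centered on the sphere’s boundary, passing through c=0.5 exactly at the surface. (Combining via max, so a soft brush never erodes existing material, is my own addition to the sketch.)

```cpp
#include <cmath>
#include <vector>

// Anti-aliased deposition: instead of a binary inside/outside assignment,
// blend density linearly across a "fudge zone" of width w around the
// sphere's boundary. The tool's shape function becomes band-limited and
// can be sampled onto the grid without faceting artifacts.
void depositSoft(std::vector<float>& d, int n,
                 float cx, float cy, float cz, float r, float w)
{
    for (int z = 0; z < n; ++z)
        for (int y = 0; y < n; ++y)
            for (int x = 0; x < n; ++x) {
                float dx = x-cx, dy = y-cy, dz = z-cz;
                // Signed distance from this vertex to the sphere's boundary:
                float dist = std::sqrt(dx*dx + dy*dy + dz*dz) - r;
                float v;
                if (dist <= -0.5f*w) v = 1.0f;     // deep inside: full density
                else if (dist >= 0.5f*w) v = 0.0f; // far outside: no material
                else v = 0.5f - dist/w;            // linear ramp; hits c=0.5 on the boundary
                float& dv = d[(z*n + y)*n + x];
                if (v > dv) dv = v;                // deposit via max: only add material
            }
}
```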
Oh, and if you’re wondering how the real-time isosurface extraction works: I’m not gonna tell you. Use the source, Luke!
And yet another aside: this is why I open myself up to ridicule by uploading single-take, uncut, unrehearsed, unedited videos of work-in-progress software to YouTube (this one is a prime example). User Tiago Estill de Noronha saw the clay modeling video and suggested a better way of managing the editing tool’s different modes: using the joystick’s four primary directions to select between them, in addition to the existing radio buttons. Done, added to the software, released. Thank you!