Virtual clay modeling with 3D input devices

It’s funny how the idea of virtual sculpting or virtual clay modeling using 3D input devices is suddenly popping up everywhere. The developers behind the Leap Motion stated it as their inspiration for developing the device in the first place, and I recently saw a demo video; Sony has been showing it off as a demo for the upcoming PlayStation 4; and I’ve just returned from an event at the Sacramento Hacker Lab, where Intel was trying to get developers excited about their version of the Kinect, or what they call “perceptual computing.” One of the demos they showed was (you guessed it) virtual sculpting. Another of their demos was 3D video of a person embedded into a virtual office; now where have I seen that before?

So I decided a few days ago to dust off an old toy application (I showed it last in my 2007 Wiimote hacking video), a volumetric virtual “clay” modeler with real-time isosurface extraction for visualization, and run it with a Razer Hydra controller, which supports bi-manual 6-DOF interaction, a pretty ideal setup for this sort of thing:

I left the 3D TV in mono mode so the video would look better, but that also made the program somewhat harder to use. That is not an excuse for my pitiful modeling skills; those are all on me. If you want to see what actually talented people can achieve with a relatively simple tool like this, watch the PS4 demo video I linked above again. It’s hard to tell with their quick cuts and blending between different models, but the only tools I could make out were straight material deposition and subtraction, and maybe smoothing. I don’t think I even saw dragging. But they appear to have color painting, which is a definite plus. I need to add that.

I didn’t know about the Intel Hacker Lab event when I recorded and uploaded the video; that was literally two hours before I left to go there. The reason I made the video now, besides adding my $0.02 to the conversation, is that I finally publicly released the sculpting application. It’s now part of the example programs bundled with the newest version of the Vrui VR development toolkit (version 2.7-001). I had mostly forgotten about it, but with the current buzz, maybe someone artistically inclined will pick it up and do something cool with it. To help with that, I added the option to export the current isosurface as a mesh file in PLY format, so it can be imported into 3D modeling applications such as Blender.

From the technical side, the idea behind the application is quite simple. It is a 3D analog of 2D painting programs like Photoshop or the GIMP: instead of depositing paint onto a 2D canvas, one deposits material into a 3D grid. What is then actually visible on the screen is not the 3D blob of material itself, but its boundary surface, technically the level set or isosurface of all points inside the 3D grid’s domain that have a fixed material density value c (in this particular case, c=0.5).
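
To make that concrete, here is a minimal sketch of the kind of data structure involved: a regular 3D grid of density values plus the fixed isovalue c that defines the surface. This is plain illustrative C++, not the actual code shipped with Vrui:

    // Illustrative sketch only, not the application's real data structure.
    #include <cstddef>
    #include <vector>

    class DensityGrid {
    public:
        static constexpr float isovalue = 0.5f; // The fixed density value c defining the surface

        DensityGrid(int sx, int sy, int sz)
            : nx(sx), ny(sy), nz(sz),
              values(std::size_t(sx) * sy * sz, 0.0f) {} // Start out as empty space (density 0)

        // Density value stored at a grid vertex (bounds checking omitted for brevity)
        float& operator()(int x, int y, int z) {
            return values[(std::size_t(z) * ny + y) * nx + x];
        }

        // A grid vertex is inside the material blob if its density is at or above c
        bool inside(int x, int y, int z) {
            return (*this)(x, y, z) >= isovalue;
        }

    private:
        int nx, ny, nz;            // Number of grid vertices along x, y, z
        std::vector<float> values; // One density value per grid vertex
    };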

The nice thing about representing surfaces as level sets of 3D density fields is that, once the heavy lifting of extracting the surface from the density field in real time is taken care of, the surface becomes easy to manage, and operations on it can be very intuitive. For example, it is very easy to change the topology of a level set surface, say by drilling a hole all the way through a solid blob. The connectivity of the surface is implied by the grid, which makes it trivial to fuse different parts of the surface automatically during editing operations. One can take one blob and cleave it into two or any number of pieces, take two or more pieces and join them by drawing a connection between them, and so forth.

But the real power of this sculpting paradigm comes from the operations that can be applied to level set surfaces. Instead of working with them in terms of vertices, edges, or faces, one can simply create new surfaces out of thin air by depositing material, erase existing material, smooth the surface locally by applying a diffusion or low-pass filter to the underlying 3D density field, or pull, push, or mold the surface by advecting the density field along the moving direction of an input device. These are just the four simplest operations, and the ones currently implemented in my sculpting application; many more could be added.
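
As a small illustration of the smoothing operation, here is a hedged sketch of a simple box filter applied to the density values in a region around the tool. It builds on the DensityGrid sketch above and is not the application's actual code:

    // Illustrative only: smooth the density field in a cubic region around the
    // tool position by averaging each vertex's 3x3x3 neighborhood (a box filter).
    // The region is assumed to stay at least one voxel inside the grid.
    void smoothRegion(DensityGrid& grid, int cx, int cy, int cz, int radius) {
        DensityGrid result = grid; // Read from an unmodified copy while writing the result
        for (int z = cz - radius; z <= cz + radius; ++z)
            for (int y = cy - radius; y <= cy + radius; ++y)
                for (int x = cx - radius; x <= cx + radius; ++x) {
                    float sum = 0.0f;
                    for (int dz = -1; dz <= 1; ++dz)
                        for (int dy = -1; dy <= 1; ++dy)
                            for (int dx = -1; dx <= 1; ++dx)
                                sum += grid(x + dx, y + dy, z + dz);
                    result(x, y, z) = sum / 27.0f; // Low-pass: replace by the neighborhood average
                }
        grid = result;
    }

Copying the whole grid is of course wasteful; a real implementation would only buffer the region being filtered, but the principle is the same.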

In more detail, new material is deposited by enumerating all grid vertices that are inside the tool shape, currently a sphere of selectable radius, and setting those vertices’ density values to a constant larger than c. Material is erased by doing the same, but with a constant smaller than c. Doing it straight up, alas, would lead to severe aliasing — the resulting surface would look faceted (see the Leap Motion and Intel demos). This is a very common misconception: most people think that level set surfaces must look faceted, because they are defined on a discrete grid. Not correct; the faceted look is an artifact from doing it wrong.
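
In code, that naive version might look roughly like the following sketch (again building on the illustrative DensityGrid above, and not the application's actual implementation):

    // Naive deposition as just described: every grid vertex inside a spherical
    // tool gets a constant density above c = 0.5; erasing would write a constant
    // below c (e.g. 0) instead. Clamping the loop bounds to the grid is omitted.
    void depositNaive(DensityGrid& grid, float cx, float cy, float cz, float radius) {
        float r2 = radius * radius;
        for (int z = int(cz - radius); z <= int(cz + radius); ++z)
            for (int y = int(cy - radius); y <= int(cy + radius); ++y)
                for (int x = int(cx - radius); x <= int(cx + radius); ++x) {
                    float dx = float(x) - cx, dy = float(y) - cy, dz = float(z) - cz;
                    if (dx * dx + dy * dy + dz * dz <= r2)
                        grid(x, y, z) = 1.0f; // Hard inside/outside test: this is what causes aliasing
                }
    }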

The trick is to create a “fudge zone” around the actual boundary of the tool shape, in which the assigned density values are not constant, but proportional to each point’s distance from the tool’s surface. If the fudge zone is at least a few grid voxels wide, it effectively removes aliasing. Note the sphere I create in the video; it looks like a perfect sphere, not a sphere made from Lego bricks. Mathematically speaking, adding such a fudge zone turns the shape function of the tool from an infinite-bandwidth function into a band-limited function, which can be sampled without artifacts. The artifacts only creep back in when the tool shape shrinks to approximately the same size as the grid’s voxels (which I show later in the video, when drilling very small holes).
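
Here is the same deposition sketch with such a fudge zone added. I center the ramp on the sphere’s surface so the resulting isosurface at c=0.5 still sits at the intended radius; the exact linear falloff is my own choice for illustration and not necessarily what the application uses:

    #include <algorithm> // std::clamp, std::max
    #include <cmath>     // std::sqrt

    // Anti-aliased deposition: instead of a hard inside/outside test, assign a
    // density that ramps linearly from 1 to 0 across a fudge zone of width
    // 'fudge' (in voxels), centered on the sphere's surface.
    void depositSmooth(DensityGrid& grid, float cx, float cy, float cz, float radius, float fudge) {
        float outer = radius + 0.5f * fudge; // Tool influence ends half a fudge zone past the sphere
        for (int z = int(cz - outer); z <= int(cz + outer); ++z)
            for (int y = int(cy - outer); y <= int(cy + outer); ++y)
                for (int x = int(cx - outer); x <= int(cx + outer); ++x) {
                    float dx = float(x) - cx, dy = float(y) - cy, dz = float(z) - cz;
                    float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
                    // 1 well inside the sphere, 0 well outside, exactly 0.5 on its surface:
                    float d = std::clamp(0.5f - (dist - radius) / fudge, 0.0f, 1.0f);
                    // Only ever raise the density, so overlapping strokes blend instead of carving:
                    grid(x, y, z) = std::max(grid(x, y, z), d);
                }
    }

Taking the max when depositing (and a min when erasing) keeps overlapping tool strokes from leaving seams; that detail is part of this sketch, not necessarily of the original program.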

Oh, and if you’re wondering how the real-time isosurface extraction works: I’m not gonna tell you. Use the source, Luke!

And yet another aside: here is the reason why I open myself up to ridicule by uploading single-take, uncut, unrehearsed, unedited video of work-in-progress software to YouTube (this one is a prime example). User Tiago Estill de Noronha saw the clay modeling video and suggested a better way of managing the editing tool’s different modes, using the joystick’s four primary directions to select between them, in addition to the radio button way. Done, added to the software, released, thank you!

14 thoughts on “Virtual clay modeling with 3D input devices”

  1. Pingback: Intel’s “perceptual computing” initiative | Doc-Ok.org

  2. I wanted to try virtual clay modelling with my Razer Hydra. I installed the latest version of Ubuntu, installed Vrui with your script, plugged in my Hydra, and realized that I had more work to do. I see your instructions for getting both the Hydra and the Oculus Rift cooperating, but I have a sense that I should wait until my Oculus arrives before I configure for that. Any help getting the Razer Hydra up and running with your Vrui example programs would be very appreciated.

    Thanks and Amazing work!

    • It’s pretty easy to get the Hydra working by itself. You need to make changes to the device driver configuration file, VRDevices.cfg in Vrui’s configuration directory ($HOME/Vrui-3.0/etc by default). At the top, uncomment only the line that says “deviceNames (RazerHydra)”. Further down, in the RazerHydra section, find the calibration transformation. This defines the position of the Hydra’s base station in physical space. In the default physical space used by Vrui’s desktop mode, the origin is at the center of your screen and x goes right and z goes up. So if the Hydra’s base station is on your desk, centered underneath the screen, you should change the transformation to something like “translate (0.0, 12.0, -12.0) * rotate (1.0, 0.0, 0.0), 180.0”. Then start the device driver by

      > VRDeviceDaemon

      and wait until it says “Waiting for client connection.”

      Then run any Vrui application like so:

      > AppName -mergeConfig RazerHydra.cfg

      and the virtual handles should show up when you lift them from the base station. Make the Vrui window full-screen for best effect. Once you have the basics working, adjust the calibration transformation such that the virtual handles show up close to the center of the screen when you hold the handles in a comfortable position.

      Note: the Hydra is hard to use unless you have some form of stereo display. If nothing else, get a pair of red/blue glasses and change the window type to “AnaglyphicStereo” in your RazerHydra.cfg by copying and modifying the Window section from Vrui.cfg.

      • First off, thanks for the great instructions. It’s really cool to me that you step up and support people who appreciate your work.

        I was able to open, modify, and save VRDevices.cfg. I removed the # in front of RazerHydra and set the translate to 0.0, 12.0, -12.0.

        I went to Vrui-3.0/bin and tried to execute VRDeviceDaemon by double-clicking. Nothing happened.

        I opened the Terminal and typed cd Vrui-3.0/bin
        I proceeded to try and run VRDeviceDaemon by typing (minus quotes) “> VRDeviceDaemon”. The terminal returns the directory, and the result is that the VRDeviceDaemon in the bin directory turns into an empty text file.

        I’m not sure what I’m doing wrong here.

        Thanks again!

        Do you have anything to do with MakeVR? I got to play with that at PAX a few days ago and I am deeply encouraged to dive into VR for CAD and model creation.

        Also, I saw your demo showing the integration of the Razer Hydra, Kinect, and Oculus. Fantastic! Regarding the inaccuracy of the Razer Hydra, have you tried removing the large metal weights that they put in the handles? I haven’t tried this yet, as I am still trying to get off the ground getting things working. Once I’m comfortable configuring VR devices in your system, I want to test the Razer Hydra with and without the weights. I was told by a rep at Polhemus (original designers of the magnetic motion sensor tech) that metal rings caused magnetic interference.

        Thanks again,

        Isaiah

        • I opened the Terminal and typed cd Vrui-3.0/bin
          I proceeded to try and run VRDeviceDaemon by typing (minus quotes) “> VRDeviceDaemon”. The terminal returns the directory, and the result is that the VRDeviceDaemon in the bin directory turns into an empty text file.

          Oops, bad instructions. You weren’t supposed to type the “>”; it stands for the terminal prompt, but I didn’t explain that. I made you overwrite your VRDeviceDaemon executable. No biggie; go to the Vrui source directory (src/Vrui-3.0-xxx in your home directory) and run “make install” again.

          Afterwards, run the device driver by “$HOME/Vrui-3.0/bin/VRDeviceDaemon” (no need to cd into Vrui-3.0/bin). Double-clicking it in the file manager would work, too, but you won’t see any status messages. That’s why it seemed to do nothing when you tried it; it was probably working.

        • BTW, I strongly advise against messing with the Hydra’s internals. If there are metal objects inside the handles, they will be part of factory calibration, i.e., their effect on the magnetic field will be calculated out by what Polhemus documentation refers to as “fixed metal compensation.” If you remove or change the metal parts even slightly, factory calibration will be irretrievably lost, and you essentially just turned your Hydra into a paperweight.

  3. I’m getting closer. I tried to run VRDeviceDaemon again and got this error:

    Caught exception USB::Device::open: Insufficient device permissions while initializing

    Thanks

    Isaiah

    • Even closer. I typed in sudo bash (not sure what that is, but I saw it on the web).
      After that I ran $HOME/Vrui-3.0/bin/VRDeviceDaemon.
      The terminal says it’s warming up the magnetic coils, etc., then “waiting for client connection”, and it waits and waits, apparently forever. Any ideas?

      • That’s what it’s supposed to be doing. Instead of sudo bash followed by VRDeviceDaemon, it’s easier and safer to run sudo $HOME/Vrui-3.0/bin/VRDeviceDaemon, but the effect would be the same. Once VRDeviceDaemon says “Waiting for client connection” you can start any Vrui application, and it will be able to get tracking data from the Hydra. For example, run ./bin/ShowEarthModel -mergeConfig RazerHydra.cfg (from within $HOME/src/Vrui-3.0-xxx/ExamplePrograms).

        VRDeviceDaemon is a daemon, as in background service process. It offers tracking data to clients, i.e., VR applications, but it’s not a VR application itself.

          • I haven’t heard from you, so I thought I would repost my current road block. I think I’m almost running Vrui example programs, if I can figure out why my -mergeConfig argument is being ignored. Here is what the terminal looks like when I launch an example program:

          isaiah@isaiah-D900F:~$ $HOME/src/Vrui-3.0-001/ExamplePrograms/bin/ShowEarthModel -mergeConfig RazerHydra
          Vrui::init: Ignoring -mergeConfig argument due to Misc::File: Error opening file RazerHydra in mode rt

          Thanks for the help so far.

          Isaiah

          • Oops, I had it right initially, but made a mistake in my previous comment (fixed now). Do -mergeConfig RazerHydra.cfg instead of -mergeConfig RazerHydra

  4. Getting closer! Here is what I’m getting now.

    I fire up VRDeviceDaemon and it’s waiting for client connection. This appears to be done from the Vrui-3.0/bin/ directory and not src. I do that as sudo, I guess.

    I open a new terminal and it looks like this.

    isaiah@isaiah-D900F:~$ $HOME/src/Vrui-3.0-001/ExamplePrograms/bin/ShowEarthModel -mergeConfig RazerHydra
    Vrui::init: Ignoring -mergeConfig argument due to Misc::File: Error opening file RazerHydra in mode rt

    The example program launches with no working Razer Hydra.

    Any ideas on what is going wrong?

    Thanks,

    Isaiah

  5. Thank you Sir! With your help I managed to finally get the Razer Hydra recognized and showing pointers in the VirtualSculpt example program.

    My next mission is to figure out how to solve the following things.

    My pointers are very small, and the Vrui menus that the pointers invoke are tiny. This could be because my screen resolution is too high, or because I need to scale the system to live at my resolution. Current resolution is 1920 x 1200.

    I have no idea how to actually sculpt anything. I suppose there are menus to set draw size and start drawing. I tried all of the buttons on the Hydra and managed only to find buttons that invoke menus. Most buttons create an error in the terminal. Still, I can’t read the menus well enough to make any changes with confidence. My screen looks a lot different than the screen in your demo. My screen is black with a white cage.

    The scaling is the first thing to work on. I feel like the pointers would need to be a full arm span apart in order to be on separate sides of the screen.

    Thanks again for all of the help..

    Isaiah

    • This kind of thing happens when your monitor misreports its own resolution. Due to the way Vrui works, it needs to know your display’s actual size in the physical world. In the default desktop setup, it does that by dividing the screen size in pixels by the screen resolution in dots per inch, as reported by X11. If X11 lies, the screen size will be wrong and text and icons will be too big or too small.

      You can manually set your screen’s size. Open Vrui.cfg and look for “section Screen” and “section Window.” In the former, enter your screen’s actual width and height (in inches in the default setup), and in the latter, set autoScreenSize to false. That should fix things. You can then fine-tune text sizes etc. by editing the ui… settings at the top of “section Desktop”.
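
      For example, the relevant pieces of Vrui.cfg might end up looking something like this. The numbers are placeholders for your own measurements, and I’m writing the size settings from memory, so if your Vrui.cfg spells them differently, keep its names and just change the values:

        section Screen
            # Actual physical size of the visible display area, in inches in the default setup
            width 20.4
            height 12.75
        endsection

        section Window
            # Don't derive the screen size from what X11 (incorrectly) reports
            autoScreenSize false
        endsection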
