Why Desktop Linux Sucks

Now that’s a clickbaity title, you might say, but it’s actually the title of a video I watched the other day:

Linus Torvalds is a fun speaker, so go watch the video. I’ll wait here.

That was the bait, now here is the switch: I don’t think that Linux sucks as a desktop. I have been using Linux as my desktop computing environment since SGI IRIX stopped being a thing, so maybe in 2001, and it’s fine. There have been advances, there have been serious setbacks (Gnome 3, anyone?), but overall it lets me do what I need to do and otherwise doesn’t try to get in my way. I even gave Mac OS X an honest shot when I bought a MacBook Pro in 2008, but it just felt really constraining, so I wiped it after about a month, installed Linux instead, and never regretted it. I have a Windows 10 partition on my home computer’s hard drive, but I can’t remember the last time I booted into it.

Side note: How fun it is to have a dual-boot Windows 10 partition to play video games! Hey, I have an hour of free time, let’s play something quick. Okay, shut down Linux, boot into Windows, no problem. Oh, I haven’t booted into Windows in a few weeks, so there is an OS update that needs to be installed. Oh, I can’t skip this and have to wait for it to complete before I can log in. Oh, it took 30 minutes to install the OS update. Well, I guess I’ll switch back over to Linux and try playing a game again in a few weeks or so. At which point there will be another OS update and the cycle repeats.

The Horrors of Distributing Linux Software

But back to the topic at hand. The reason I’m linking this video is that Mr. Torvalds talks about the difficulty of distributing desktop software for Linux, and on that I agree with him 100%. I have created and maintain several Linux-exclusive software packages that are used by a significant number of non-technical people, the Augmented Reality Sandbox being the main one. It’s a pretty big piece of software comprising three components: the Vrui VR toolkit, the Kinect 3D video capture package, and the AR Sandbox application itself. Vrui is a general-purpose VR toolkit that runs the gamut from tracking 6-DOF input devices and reprojecting and distortion-correcting rendered images onto an HMD’s display to high-level user interaction and UI widgets, and it does a lot more than that, too. The main point is that Vrui is system software: it is deeply tied into the operating system and relies on a large number of system libraries and interfaces, which means that packaging and releasing it in binary form is an absolute nightmare. This video spoke to me.

Continue reading

Now This Is Some Exceptional Code

I have been rewriting large chunks of Vrui recently, primarily to support a new Vulkan-based HMD graphics driver that warps and distortion-corrects rendered application frames and presents them to an HMD operating in “direct mode,” i.e., without the display being managed by the window manager. Yes, I know I’m several years late to that particular party. 🙂
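
As a quick illustration of what “warp and distortion-correct” means in practice, here is a minimal sketch of the generic two-coefficient radial polynomial model that most HMD warp shaders use; the coefficient names and the exact form are illustrative assumptions, not Vrui’s actual calibration code:

// Illustrative sketch of the common radial polynomial lens distortion
// model; k1 and k2 are placeholder coefficients, not Vrui's actual
// per-HMD calibration values.
struct Vec2 { float x, y; };

// Map a texture coordinate centered on the lens axis (so (0,0) is the
// lens center) to the position at which the warp pass samples the
// rendered application frame:
Vec2 distort(const Vec2& uv, float k1, float k2)
{
	float r2 = uv.x*uv.x + uv.y*uv.y;      // squared distance from lens center
	float scale = 1.0f + k1*r2 + k2*r2*r2; // radial scale factor
	return Vec2{uv.x*scale, uv.y*scale};
}

In a real driver, something like this runs per display pixel in the warp shader, typically once per color channel to also correct for chromatic aberration.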

While I was doing that, which involved staring at a lot of old code for extended periods, I also cleaned up some things that had been bugging me for a long time. Specifically, error handling. I like descriptive error messages, because I find they make it easier to pinpoint problems encountered by users of my software who are not themselves programmers, like, say, people who install an AR Sandbox at their location. I like it when an error message tells me what went wrong, and where it went wrong. Something like “I’m currently in method Z of class Y in namespace X, and I can’t open requested file A because of operating system error B.” In other words, I want error messages tagged with a location like “X::Y::Z,” and with parameters like a file name or OS error code. I also want to use exceptions, obviously. Unfortunately, C++’s standard exception classes don’t have methods to create exception objects with parameters, so, a very long time ago, I decided to roll my own.
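
To give a flavor of the pattern (this is a minimal sketch, not Vrui’s actual exception classes), a small printf-style factory that stamps a location tag onto a formatted message is already enough to produce the “X::Y::Z: cannot open file A” style of error described above:

#include <cstdarg>
#include <cstdio>
#include <stdexcept>
#include <string>

// Minimal sketch (not Vrui's actual API): build an exception whose what()
// string is tagged with a source location and formatted printf-style.
std::runtime_error makeError(const char* location, const char* format, ...)
{
	char message[1024];
	va_list args;
	va_start(args, format);
	vsnprintf(message, sizeof(message), format, args);
	va_end(args);
	return std::runtime_error(std::string(location) + ": " + message);
}

// Hypothetical usage from some method IO::File::open:
//   throw makeError("IO::File::open",
//                   "Cannot open file %s due to error %d",
//                   fileName, errorCode);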

Continue reading

Idle Hands etc. etc.

A friendly redditor sent me this link to a popular post on /r/funny yesterday (see Figure 1 for the picture). I might have mentioned before how it was that exact scene in the original Star Wars movie that got me into 3D computer graphics and later VR, so it got me thinking about how that particular shot would have looked if the miniature ILM used to film the trench run scene had not been flat, but had exhibited the proper scaled curvature of the Death Star.

Figure 1: Death Star and trench from attack scene in A New Hope, showing the flat miniature that was used to shoot the scene. Source.

Two hours and 153 lines of code later, here are a couple of images which are hopefully true to scale. I used 160km as the Death Star’s diameter, based on its Wookieepedia entry (Wikipedia states 120km, but I’m siding with the bigger nerds here), and I assumed the meridian trench’s width and depth to be 50m, based on the size of an X-wing fighter and shot compositions from the movie.

Side note: I don’t know how common this misconception is, but the trench featured in the trench run scenes is not the equatorial trench prominently visible in Figure 1. That one holds massive hangars (as seen in the scene where the Millennium Falcon is tractor-beamed into the Death Star) and is vastly larger than the actual trench, which is a meridian (north-south running) trench on the Death Star’s northern hemisphere, as clearly visible on-screen during the pre-attack briefing (but then, who ever pays attention in briefings?).

The images in Figures 2-6 are 3840×2160 pixels. Right-click and select “View Image” to see them at full size.

Figure 2: Meridian trench on spherical Death Star, approx. 12.5m above trench floor. Horizon distance: 1.4km.
Figure 3: Meridian trench on spherical Death Star, approx. 25m above trench floor. Horizon distance: 2km.
Figure 4: Meridian trench on spherical Death Star, approx. 37.5m above trench floor. Horizon distance: 2.5km.
Figure 5: Meridian trench on spherical Death Star, precisely at Death Star’s surface. Horizon distance: 2.8km.
Figure 6: Meridian trench on spherical Death Star, approx. 100m above Death Star’s surface. Horizon distance: 4.9km.
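
For the record, the horizon distances in the captions come straight from circle geometry: a viewpoint at height h above a sphere of radius r sees the horizon at distance d = √((r+h)² − r²). Here is a minimal sketch of that calculation, assuming heights are measured above the trench floor and the horizon is taken on the trench floor sphere; it reproduces the caption values to within rounding:

#include <cmath>
#include <cstdio>

int main()
{
	// 160km diameter Death Star with a 50m deep trench:
	const double r = 80000.0 - 50.0; // trench floor sphere radius in meters
	
	// Viewpoint heights above the trench floor; 50m is the Death Star's
	// surface, 150m is 100m above the surface:
	const double heights[] = {12.5, 25.0, 37.5, 50.0, 150.0};
	for(double h : heights)
	{
		// Distance to the tangent point on the sphere of radius r:
		double d = std::sqrt((r + h)*(r + h) - r*r);
		std::printf("h = %6.1fm -> horizon at %.0fm\n", h, d);
	}
	return 0;
}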

As can be seen from Figures 2-6, the difference between the flat miniature used in the movie and the spherical model I used is relatively minor, but noticeable (ignoring the glaring lack of greebles in my model, obviously). I noticed the lack of curvature for the first time while re-watching A New Hope when the prequels came out, but I can’t say I ever cared. Still, this was a good opportunity for some recreational coding.

How to Track Glowing Balls in 3D

Note: I started writing this article in June 2017, because people kept asking me about details of the PS Move tracking algorithm I implemented for the video in Figure 1. But I never finished it because I couldn’t find the time to do all the measurements needed for a thorough error analysis, and also because the derivation of the linear system at the core of the algorithm really needed some explanatory diagrams, and those take a lot of work. So the article stayed on the shelf. I’m finally publishing it today, without error analysis or diagrams, because people are still asking me about details of the algorithm, more than four years after I published the video. 🙂

This one is long overdue. Back in 2015, on September 30th to be precise, I uploaded a video showing preliminary results from a surprisingly robust optical 3D tracking algorithm I had cooked up specifically to track PS Move controllers using a standard webcam (see Figure 1).

Figure 1: A video showing my PS Move tracking algorithm, and my surprised face.

During discussion of that video, I promised to write up the algorithm I used, and to release source code. But as sometimes happens, I did neither. I was just reminded of that by an email I received from one of the PS Move API developers. So, almost two years late, here is a description of the algorithm. Given that PSVR is now being sold in stores, that PS Move controllers are more widespread than ever, and that the algorithm is interesting in its own right, it might still be useful. Continue reading
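
In the meantime, here is the zeroth-order idea behind tracking a ball of known physical size with a single camera: its apparent radius in the image gives you depth, and its image position gives you the viewing ray. To be clear, the sketch below is just that naive pinhole estimate, not the linear-system algorithm this post describes; for one thing, a sphere actually projects to an ellipse, not a circle, which this simple estimate ignores.

// Naive single-camera position estimate for a sphere of known radius,
// using the pinhole camera model. NOT the algorithm described in this
// post; just the textbook first-order approximation.
struct Vec3 { double x, y, z; };

Vec3 estimateBallCenter(
	double u, double v,   // detected blob center in pixels
	double rPixels,       // apparent blob radius in pixels
	double cx, double cy, // camera principal point in pixels
	double f,             // focal length in pixels
	double ballRadius)    // physical ball radius, roughly 0.022m for a PS Move
{
	double z = f*ballRadius/rPixels; // depth from apparent size
	double x = (u - cx)*z/f;         // back-project the image position
	double y = (v - cy)*z/f;
	return Vec3{x, y, z};
}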

New Adventures in Hi-Fi

I’ve been spending all of my time over the last few weeks rewriting Vrui’s collaboration infrastructure (VCI from now on) from scratch. VCI is, in a nutshell, the built-in remote collaboration / telepresence component of my VR toolkit. Or, in other words, a networked multi-player framework. The old VCI was the technology underlying videos such as this one:

Figure 1: Collaborative exploration of a 3D CAT scan of a microbial community, between a CAVE and a 3D TV with head-tracked glasses and a tracked controller.

Continue reading

A Blast From The Past

Back in the olden days, in the summer of 1996 to be precise, I was a computer science Master’s student at the University of Karlsruhe, Germany, about to take the oral exam in my specialization area: 3D computer graphics, 3D user interfaces, and geometric modeling. For reasons that are no longer entirely clear to me, I decided then that it would be a good idea to prepare for that exam by developing a 3D rendering engine, a 3D game engine, and a game, all from scratch. What resulted from that effort (which didn’t help my performance in that exam at all, by the way) was “Starglider Pro:”

In the mid to late 80s, one of my favorite games on my beloved Atari ST was the original Starglider, developed by Jez San for Rainbird Software. I finally replaced that ST with a series of PCs starting in 1993, first running DOS and later OS/2 Warp, and therefore needed something to scratch that Starglider itch. Continue reading