Someone at Oculus is Reading my Blog

I am getting the feeling that Big Brother is watching me. When I released the initial version of the Vrui VR toolkit with native Oculus Rift support, it had magnetic yaw drift correction, which the official Oculus SDK didn’t have at that point (Vrui doesn’t use the Oculus SDK at all to talk to the Rift; it has its own tracking driver that talks to the Rift’s inertial measurement unit directly via USB, does its own sensor fusion, and also does its own projection setup and lens distortion correction). A week or so later, Oculus released an updated SDK with magnetic drift correction.
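
The underlying idea is straightforward to sketch. Below is a hypothetical C++ fragment (not Vrui’s actual tracking code; the structure, names, and the correction gain are all made up for illustration) showing the general principle of magnetometer-aided yaw drift correction: dead-reckon yaw from the gyroscope, then slowly pull the estimate toward the tilt-compensated compass heading.

```cpp
#include <cmath>

// One IMU sample: gyro yaw rate (rad/s), magnetometer reading, a unit
// gravity vector from the main sensor fusion filter, and the time step:
struct ImuSample {
	float gyroYawRate;
	float mag[3];
	float gravity[3];
	float dt;
};

class YawDriftCorrector {
	float yaw = 0.0f; // Current yaw estimate in radians
	static constexpr float gain = 0.02f; // Small made-up correction gain
	static constexpr float twoPi = 6.2831853f;
public:
	float update(const ImuSample& s) {
		// Dead-reckon yaw by integrating the gyroscope rate:
		yaw += s.gyroYawRate*s.dt;

		// Tilt-compensate the magnetometer: project the magnetic field
		// vector onto the horizontal plane defined by the gravity estimate:
		float dot = s.mag[0]*s.gravity[0]+s.mag[1]*s.gravity[1]+s.mag[2]*s.gravity[2];
		float hx = s.mag[0]-dot*s.gravity[0];
		float hy = s.mag[1]-dot*s.gravity[1];
		float magYaw = std::atan2(hy, hx); // Compass heading in radians

		// Nudge the integrated yaw toward the compass heading; the small
		// gain filters out magnetometer noise but cancels long-term drift:
		float err = std::remainder(magYaw-yaw, twoPi); // Wrapped to [-pi, pi]
		yaw += gain*err;
		return yaw;
	}
};
```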

A little more than a month ago, I wrote a pair of articles investigating and explaining the internals of the Rift’s display, and how small deviations in calibration have a large effect on the perceived size of the virtual world, and the degree of “solidity” (for lack of a better word) of the virtual objects therein. In those posts, I pointed out that a single lens distortion correction formula doesn’t suffice, because lens distortion parameters depend on the position of the viewer’s eyes relative to the lenses, particularly the eye/lens distance, otherwise known as “eye relief.” And guess what: I just got an email via the Oculus developer mailing list announcing the (preview) release of SDK version 0.3.1, which lists eye relief-dependent lens correction as one of its major features.
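
To make the point concrete, here is a hypothetical C++ sketch (not the Oculus SDK’s actual implementation; the coefficient values and the 10mm/20mm calibration points are invented) of what eye relief-dependent correction amounts to: the radial distortion polynomial’s coefficients become functions of the eye/lens distance, for example by interpolating between coefficient sets calibrated at known eye relief values.

```cpp
#include <algorithm>

// Coefficients of a radial distortion polynomial r' = r*(k0 + k1*r^2 + k2*r^4):
struct DistortionCoeffs {
	float k0, k1, k2;
};

// Made-up coefficient sets, nominally calibrated at 10mm and 20mm eye relief:
const DistortionCoeffs nearRelief = {1.0f, 0.22f, 0.24f};
const DistortionCoeffs farRelief = {1.0f, 0.18f, 0.15f};

// Distort a tangent-space radius for the given eye relief by linearly
// blending the two calibrated coefficient sets (std::clamp requires C++17):
float distortRadius(float r, float eyeReliefMm) {
	float t = std::clamp((eyeReliefMm-10.0f)/10.0f, 0.0f, 1.0f);
	float k0 = nearRelief.k0+t*(farRelief.k0-nearRelief.k0);
	float k1 = nearRelief.k1+t*(farRelief.k1-nearRelief.k1);
	float k2 = nearRelief.k2+t*(farRelief.k2-nearRelief.k2);
	float r2 = r*r;
	return r*(k0 + k1*r2 + k2*r2*r2);
}
```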

Maybe I should keep writing articles on the virtues of 3D pupil tracking, and the obvious benefits of adding an inertially/optically tracked 6-DOF input device to the consumer-level Rift’s basic package, and those things will happen as well. 🙂

No, I’m not entirely serious. Magnetic yaw drift correction and position-dependent lens distortion correction are obvious features to add to the SDK (I mean, yaw drift correction is the primary reason IMUs have magnetometers in the first place). They are not banging rocks together over at Oculus HQ. But one can dream, can’t one?

And here is an unrelated picture, so that reblogging “AIs” don’t have to scrape my commenters’ avatars for thumbnails:

Figure 1: A photograph of IDAV’s 18′×9′ six-tile head-tracked stereoscopic display wall, showing as default background image a ray-traced C60 buckminsterfullerene molecule floating over a checkerboard for no particular reason.

13 thoughts on “Someone at Oculus is Reading my Blog”

  1. hehe, here’s the thing: You talked about pupil tracking, and then FACEBOOK BOUGHT OCULUS. Advertising companies (like Facebook) are very interested in gaze tracking.

  2. Aaaaw, so my avatar won’t show up now? My reason for commenting has been lost! Haha, no not really. And whoa, didn’t know Vrui had its very own solution for tracking 😮 it’s easy to just assume most things use the Oculus SDK!

    As for Oculus uhm… monitoring… your site: well, they have so many people on board now that there are probably several of them doing the monitoring! So yes, yes please, talk about features we need implemented 😉

    I’m especially interested in what you mention, a single 6-DOF hand tracker. Ever since reading Sebastien Kuntz’s article about presence, and that one tracked hand is the baseline, I’ve kind of been hoping that this is what Oculus will announce next! Placing the camera above, looking down at 45 degrees, it feels like it would be quite straightforward to also track a hand with it. Perhaps the hand would leave the tracking volume too easily, though.

    Oh, and we all know now that Big Brother is actually Facebook, and what that supposed IR camera is for! Ò_ó oooooh, the bright future. It wasn’t actually very long ago that I read 1984, and then watched the movie, haha.

    • From now on I’ll include completely off-topic pictures in posts that don’t have any otherwise, to throw off scoop.it and their ilk. 🙂

      But regarding your question: since Oculus already uses a camera to track the Rift, they are, so to speak, in the monopoly position of being the only ones who can support optically-tracked input devices without requiring a second camera. It would make no business sense not to exploit that.

      That VR doesn’t really work without a 1:1 tracked 6-DOF device is a no-brainer; the only thing that concerns me is that having a “standard device” is not good for VR as a whole at this point (it’s definitely good for Oculus). Developers need to be forced away from writing code for a single device, or set of devices from a single company, and towards using high-level middleware, or innovation will be stunted before we really figure out what the input modalities should be.

      I really need to write about that.

      • Absolutely!

        You know, as a consumer my perspective was that… if they include a controller, everyone will have it, and developers would not hesitate to add support for it! 100% attachment rate! I did back the STEM, but with PrioVR and whatever will come next… yeah, things like MiddleVR seem more attractive now 😮

        I know that what put me off MiddleVR when they used it for VR Lemmings was that it was a separate app I had to boot up before the actual demo, but perhaps it can also be integrated transparently when there is more time. Lemmings was done during a jam, so I don’t know 😛

        • Not sure how Sebastien implemented it, but as MiddleVR is, as far as I know, set up as a set of plug-ins for Unity and other game/rendering engines, having the separate app might be necessary to get those plug-ins loaded into the core engine.

          I should also write about the similarities and differences between Vrui and MiddleVR at some point, as both approach the same overall problem in different ways. It could be an instructive case study in software architecture, and MiddleVR is really neat. Oh, if only I had the time.

        • Hi,
          Actually, you don’t have to launch MiddleVR before your game. You can also create a shortcut or a default configuration file to be able to just run your exe! 🙂

      • This made me realize something: Facebook, of all things, is gonna have a camera pointing at you at all times when you are at the computer…

        I hope they’re at least polite enough to do something like putting a little LED in series with the image sensor, so that you always know when the camera can see you (AKA “all the time”), regardless of what changes are made to the firmware or driver…

  3. Why a chequerboard? Because how else would people know you’re running a Big Grunty Raytracer rather than just being a Rasterising Peasant!
