Here is an interesting innovation: the developers at Cloudhead Games, who are working on The Gallery: Six Elements, a game/experience created for HMDs from the ground up, encountered motion sickness problems caused by explicit viewpoint rotation via the analog sticks on game controllers, and came up with a creative approach to mitigate it. Instead of rotating the view smoothly, as conventional wisdom would suggest, they rotate the view discretely, in relatively large increments (around 30°). And apparently, it works. What do you know. In their explanation, they refer to the way dancers keep themselves from getting dizzy during pirouettes: they fix their heads in one direction while their bodies spin, and then rapidly whip their heads around back to the original direction. But watch them explain and demonstrate it themselves. Funny thing is, I knew that trick from ice dancers, but never thought to apply it to viewpoint rotation in VR.
This is very timely, because I have recently been involved in an ongoing discussion about input devices for VR, how they should be handled by software, why there should be a middleware standard instead of a hardware standard, and yadda yadda yadda. So I have been talking up Vrui's input model quite a bit, and now is the time to put up or shut up, and show how it can handle a new idea like this.
Vrui has a large number of navigation metaphor plug-ins, and several of those use controller-based viewpoint rotation and could potentially benefit from this new “VR Comfort Mode” (I wonder if that moniker is going to stick now). But to keep things interesting, I didn’t implement it exactly the same way Cloudhead Games did — they replaced smooth motion using an analog stick with discrete steps every time the stick is pushed to either extreme — but instead implemented it for the canonical “mouse look + WASD” navigation method from first-person games. This added several interesting wrinkles.
For one, with mouse look there is no discrete button-like event to initiate a discrete turn; instead, smooth motions to the left or right need to be translated into a sequence of discrete steps, and, very importantly, the total angle of rotation after stepping must match the total angle of rotation imposed by the user, or experienced players who can make precise turns from muscle memory would be very upset. The second wrinkle is mouse aim. When there is no 6-DOF input device for free aiming, and players do not want aiming reticles that are glued to their faces, it still must be possible to point the reticle in any arbitrary direction, even if viewpoint rotation itself is quantized. And if there is some form of compass HUD indicating viewing or pointing direction, that has to work seamlessly as well.
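That first wrinkle has a simple resolution: keep the user's raw, smoothly accumulated azimuth angle as the ground truth, and quantize only the angle that is actually applied to the view. Quantizing each small mouse delta individually would drop any motion smaller than the step size and let the total turn drift away from what the player intended. Here is a minimal sketch of the difference, with hypothetical function names that are not part of Vrui:

```cpp
#include <cmath>

// Round an angle (in degrees) to the nearest multiple of step:
double quantize(double angle, double step)
	{
	return std::floor(angle/step + 0.5)*step;
	}

// Correct approach: accumulate n small mouse deltas into a raw azimuth
// angle first, then quantize the accumulated angle for display.
double accumulatedQuantized(int n, double delta, double step)
	{
	double raw = 0.0;
	for(int i = 0; i < n; ++i)
		raw += delta;
	return quantize(raw, step);
	}

// Naive approach: quantize each delta as it arrives. Deltas smaller
// than half the step size round to zero and are lost entirely.
double perDeltaQuantized(int n, double delta, double step)
	{
	double total = 0.0;
	for(int i = 0; i < n; ++i)
		total += quantize(delta, step);
	return total;
	}
```

With a 30° step, ninety 1° mouse deltas yield a 90° turn under the accumulated scheme, exactly matching the player's muscle memory, whereas per-delta quantization yields no turn at all.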
Turns out, all this was a piece of cake. It only took about five lines of code in Vrui's FPSNavigationTool class to enable quantized azimuth rotation with a freely configurable quantization step, and due to Vrui's architecture, it now works across all Vrui applications. For posterity's sake, I recorded my very first attempt at using it in the Rift, right after compiling the modified tool:
And here is how I did it: the FPSNavigationTool class, like all other tool classes derived from SurfaceNavigationTool, applies azimuth rotation by rotating physical space around the local "up" direction by the azimuth angle selected by the tool. Now, instead of applying the azimuth angle directly, I quantize it using the simple formula qAzimuth = floor(azimuth/step + 0.5)*step. That's all it takes; the only thing left is to correct the direction of the aiming reticle and compass HUD by applying a correction rotation of azimuth-qAzimuth around the Z axis during HUD rendering, and ta-da.
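The two pieces of the change can be sketched as follows; the identifiers are illustrative and do not reproduce Vrui's actual FPSNavigationTool source:

```cpp
#include <cmath>

// Quantize the user's smoothly-accumulated azimuth angle (in degrees)
// to the nearest multiple of the configured step, e.g. 30 degrees:
double quantizeAzimuth(double azimuth, double step)
	{
	return std::floor(azimuth/step + 0.5)*step;
	}

// The view is rotated by quantizeAzimuth(...), but the aiming reticle
// and compass HUD must still reflect the raw azimuth. During HUD
// rendering, applying the residual angle azimuth - qAzimuth as a
// rotation around the local Z ("up") axis restores the true direction:
double hudCorrection(double azimuth, double step)
	{
	return azimuth - quantizeAzimuth(azimuth, step);
	}
```

For example, with a 30° step, a raw azimuth of 44° snaps the view to 30°, and the HUD is counter-rotated by the residual 14° so the reticle still points exactly where the user is aiming.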
My first impression is that this new method works. I don’t suffer noticeably from rotation-induced motion sickness, but I did feel a difference. What remains now is to implement the same minor change in the other navigation tools, where applicable, and test quantized rotation on as many people as possible.