There is an ongoing, but already highly successful, Kickstarter campaign for a new VR head-mounted display with a wide (200°) field of view (FoV): Pimax 8k. As I have not personally tried this headset — only its little brother, Pimax 4k, at the 2017 SVVR Expo — I cannot discuss and evaluate all the campaign’s promises. Instead, I want to focus on one particular issue that’s causing a bit of confusion and controversy at the moment.
Early reviewers of Pimax 8k prototypes noticed geometric distortion, such as virtual objects not appearing in the correct places and shifting under head movement, and the campaign responded by claiming that these distortions “could be fixed by improved software or algorithms” (paraphrased). The ensuing speculation about the causes of, and potential fixes for, this distortion has mostly been based on wrong assumptions and misunderstandings of how geometric projection for wide-FoV VR headsets is supposed to work. Adding fuel to the fire, the campaign released a frame showing “what is actually rendered to the screen” (see Figure 1), causing further confusion. The problem is that the frame looks obviously distorted, but that this obvious distortion is not what the reviewers were complaining about. On the contrary, this is what a frame rendered to a high-FoV VR headset should look like. At least, if one ignores lenses and lens distortion, which is what I will continue to do for now.
To understand the issue with geometric distortion, we briefly need to recap what VR is supposed to do: create a virtual environment that looks real, in some meaning of that word, to the viewer. Boiled down to its essence, this means that if any virtual object is supposed to appear at a certain position (x, y, z), the viewer should see the same object at those same coordinates. As our brains don’t naturally work in Cartesian coordinates, this requirement can be rephrased using polar coordinates: If an object is supposed to appear, from the viewer’s current position, at angles of (φ, θ) relative to the viewer’s current gaze direction, then it should appear at those same angles inside the headset. If this is done correctly, virtual objects appear in the correct places, and do not appear to shift, warp, or stretch as viewers move around or rotate their heads. If this is not done correctly, virtual objects, including the entire virtual environment, will not appear stable, causing a break of immersion or even nausea.
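To make this requirement concrete, here is a minimal sketch of what “appearing at angles (φ, θ)” means computationally. The coordinate convention and the function name are my own choices for illustration, not taken from any SDK:

```cpp
#include <cmath>
#include <cstdio>

struct Angles { double azimuth, elevation; }; // both in radians

// Viewer at the origin, gazing along -z, +x to the right, +y up.
Angles viewerRelativeAngles(double x, double y, double z)
{
    double azimuth = std::atan2(x, -z);                      // left/right of gaze
    double elevation = std::atan2(y, std::sqrt(x*x + z*z));  // above/below gaze
    return { azimuth, elevation };
}

int main()
{
    const double radToDeg = 180.0 / 3.14159265358979;
    // A point one meter ahead of and one meter to the right of the viewer
    // should appear 45 degrees to the right of the gaze direction:
    Angles a = viewerRelativeAngles(1.0, 0.0, -1.0);
    std::printf("azimuth %.1f deg, elevation %.1f deg\n",
                a.azimuth * radToDeg, a.elevation * radToDeg); // 45.0, 0.0
    return 0;
}
```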
The way to fulfill this requirement is to project virtual objects onto a headset’s screens using the headset’s internal geometry, i.e., real-world sizes of its screens, and their real-world position and orientation relative to the viewer’s eyes. If the geometry is represented accurately, and the projection done correctly, the illusion will hold.
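As an illustration of this projection principle (and only the principle; a real rendering pipeline would use projection matrices and a GPU rather than per-point ray casting), the following sketch finds where the line from an eye through a virtual point pierces a screen of known real-world pose. All types, names, and dimensions are assumptions for the example:

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Screen
{
    Vec3 origin;          // lower-left corner of the screen, in world space
    Vec3 right;           // unit vector along the screen's width
    Vec3 up;              // unit vector along the screen's height
    Vec3 normal;          // unit normal, pointing towards the viewer
    double width, height; // physical size, here in mm
};

// Returns true, and the screen position (sx, sy) in [0, 1]^2, if the
// virtual point projects onto the screen as seen from the given eye.
bool projectToScreen(const Screen& s, Vec3 eye, Vec3 point,
                     double& sx, double& sy)
{
    Vec3 dir = sub(point, eye);             // ray from the eye through the point
    double denom = dot(dir, s.normal);
    if (denom >= 0.0)                       // point is behind the screen plane
        return false;
    double t = dot(sub(s.origin, eye), s.normal) / denom;
    Vec3 hit = { eye.x + dir.x * t, eye.y + dir.y * t, eye.z + dir.z * t };
    sx = dot(sub(hit, s.origin), s.right) / s.width;
    sy = dot(sub(hit, s.origin), s.up) / s.height;
    return sx >= 0.0 && sx <= 1.0 && sy >= 0.0 && sy <= 1.0;
}

int main()
{
    // A 120mm x 68mm screen, 50mm in front of the eye, facing the viewer:
    Screen s = { { -60.0, -34.0, -50.0 }, { 1, 0, 0 }, { 0, 1, 0 },
                 { 0, 0, 1 }, 120.0, 68.0 };
    double sx, sy;
    if (projectToScreen(s, { 0, 0, 0 }, { 0, 0, -200 }, sx, sy))
        std::printf("hits screen at (%.2f, %.2f)\n", sx, sy); // (0.50, 0.50)
    return 0;
}
```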
In other words, understanding how a concrete VR headset, such as Pimax 8k, does its projection requires knowledge of its internal geometry, which is something we don’t currently have. Nonetheless, we can still investigate the principles by creating a reasonable model of the headset based on known specifications. Concretely, we know that the headset has a (claimed) combined horizontal FoV of 200°, that its display panels are 120mm wide (3840 pixels at 806 ppi), and that the two screens are angled with respect to each other. Based on that, Figure 2 shows a diagram of how the headset might look internally. While lenses are very important for head-mounted VR, I will follow the campaign’s lead and ignore them for the time being. They don’t change the principle of how projection works.
Note that this diagram is just an illustration. Pimax 8k’s binocular overlap may be smaller or larger than 80°, and therefore its per-eye FoV may be smaller or larger than 140°, and, with lenses added, the geometry might look quite different. The point is: based on this diagram, what would the rendered frames look like? Figures 3 and 4 show left/right image pairs produced by projecting an environment (a Quake III Arena level) onto screens laid out as in Figure 2.
Figure 3 is a frame where the player is standing in a narrow tunnel, back pressed against one tunnel wall, facing the opposite tunnel wall. Importantly, the straight wall behind the player’s back shows up on the left and right edges of the image pair, which proves that the total field of view is in fact larger than 180°.
Figure 4 shows what appears to be horrible geometric distortion. The health sphere in the lower right-hand corner looks like a highly elongated ellipsoid, and the left and right views of the tunnel entrance look completely different from each other. On the whole, each half of this image pair looks very similar to Figure 1. Interpreted as a stereo image, it would appear impossible to fuse these views into a coherent 3D scene. But the surprising fact is, if this image pair were actually displayed on a headset with the internal geometry from Figure 2, it would look 100% correct (but very blurry, as we are, to repeat it, forgoing lenses).
The fundamental problem here is that these images were rendered for a per-eye field of view of 140°, but are currently displayed, thumbnail-sized, inside a web browser on a regular computer monitor, which would typically have an actual field of view of less than 40°. It is this discrepancy between rendered and actual field of view that causes apparent distortion. When rendered and observed FoV match, the distortion vanishes. You’ll have to take my word for it. OK, that’s probably not going to happen, so here’s a crude experiment: I rendered a frame from the same Quake III Arena environment at 170° FoV, projected it onto my office wall at a size of 90″x50.6″, and then took a series of photographs (see Figures 5, 6, and 7 — apologies for the low image quality).
The photo in Figure 5 was taken from a distance of approximately 7′. The rendering appears highly distorted, especially the health sphere on the right, and the tunnel entrance left of center. In this case, the actual field of view of the image, seen from the camera, was approx. 56°.
The photos in Figures 6 and 7 were taken from a distance of about 4″, where the actual FoV closely matched the rendered FoV (I did the best I could free-hand). The health sphere now looks like a sphere, and the tunnel entrance looks like a real tunnel entrance.
The cause of this apparent distortion is that the mapping from pixels on a flat rectangular screen to viewer-relative polar coordinate angles is non-linear. Imagine a screen with a fixed width w and actual horizontal FoV φ, yielding an implied viewer-screen distance of d = w / (2 * tan (0.5 * φ)). Quick check: for w=90″ and φ=170°, this yields a viewer distance d = 90″ / (2 * tan 85°) = 3.94″, matching Figures 6 and 7.
Then, a pixel on the screen at position x in the range from -w/2 to +w/2 appears at angle α = tan⁻¹(x/d), which is a non-linear relationship. Turned around, this means that pixels towards the edges of the screen cover smaller angles than those in the center, or, conversely, that the same angle α covers more pixels towards the edges, which causes stretching (see Figure 8). Viewed from the correct distance d, this stretching is precisely canceled out by perspective foreshortening, due to the edges of the screen being farther away from the viewer than the center, and being viewed at an oblique angle.
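For readers who want to check the numbers, this little program (plain C++, nothing VR-specific) reproduces the implied viewing distance from above and tabulates the non-linear angle-to-position mapping; note how the last 10° step covers many times the screen space of the first:

```cpp
#include <cmath>
#include <cstdio>

int main()
{
    const double pi = 3.14159265358979;
    const double degToRad = pi / 180.0, radToDeg = 180.0 / pi;

    // Implied viewer-screen distance: d = w / (2 * tan(0.5 * phi))
    double w = 90.0;    // screen width in inches
    double phi = 170.0; // rendered horizontal FoV in degrees
    double d = w / (2.0 * std::tan(0.5 * phi * degToRad));
    std::printf("implied distance d = %.2f\"\n", d); // 3.94"

    // Seen from 7' = 84" instead, the actual FoV shrinks to about 56 degrees:
    std::printf("actual FoV at 84\" = %.1f deg\n",
                2.0 * std::atan(0.5 * w / 84.0) * radToDeg); // ~56.4 deg

    // Non-linearity: equal angular steps of 10 degrees cover ever more
    // screen space towards the edge (x = d * tan(alpha)):
    for (double alpha = 0.0; alpha <= 80.0; alpha += 10.0)
        std::printf("alpha = %2.0f deg -> x = %6.2f\"\n",
                    alpha, d * std::tan(alpha * degToRad));
    return 0;
}
```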
So, if the apparent distortion exhibited in Figure 1 is not the problem, what were the reviewers complaining about? Here is a statement I made earlier: “The way to fulfill this requirement is to project virtual objects onto a headset’s screens using the headset’s internal geometry, i.e., real-world sizes of its screens, and their real-world position and orientation relative to the viewer’s eyes. If the geometry is represented accurately, and the projection done correctly, the illusion will hold.”
The problem is the part about using the headset’s real internal geometry. The reviewers were testing the headset with VR software running on Valve’s SteamVR platform. In order to work with multiple HMD models, SteamVR receives a specification of the currently connected HMD’s internal geometry directly from that HMD’s firmware during SteamVR’s initialization. However, the current specification data format does not contain fields for screen rotation angles. As a result, SteamVR, and by extension all SteamVR applications, currently assume that an HMD’s left and right screens are always parallel to each other.
This assumption is wrong in Pimax 8k’s case, which causes two problems. The big one is that SteamVR’s internal HMD representation does not support HMDs with a total field of view of 180° or more. Consider Figure 2: the FoV for each eye is formed by the triangle defined by that eye’s pupil position and the left and right edges of its associated screen. The closer each pupil is to its screen, the bigger the FoV. In the limit, when the pupil touches the screen, the FoV approaches 180°. As there is no way to push the pupil through the screen, the FoV cannot be made any larger. If per-eye FoV is limited to less than 180°, and the two screens are parallel, the combined FoV must also be less than 180°. This in turn means that SteamVR software cannot render the full FoV required by the HMD’s screen layout, and since there is also no way for an HMD to tell SteamVR to only render to part of its screens, SteamVR will take the reduced FoV and stretch it across the real FoV, which causes visible distortion not only in the part of the periphery that extends beyond 180°, but everywhere.
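To see why the parallel-screen representation caps out below 180°, consider how SteamVR describes each eye’s frustum: by the tangents of its four boundary angles (on the driver side, this is what IVRDisplayComponent::GetProjectionRaw hands over). The tangent values in this sketch are made up for illustration, but the conclusion holds for any finite values:

```cpp
// Why parallel screens cap the combined FoV below 180 degrees: a tangent
// is always finite, so each frustum boundary is strictly less than 90
// degrees from straight ahead -- and with both screens parallel,
// "straight ahead" is the same direction for both eyes.
#include <cmath>
#include <cstdio>

int main()
{
    const double radToDeg = 180.0 / 3.14159265358979;

    // Hypothetical raw projection tangents for the left eye; note how
    // extreme the outward tangent must be to even approach 90 degrees:
    double tanLeft = -11.4; // outward boundary, ~85 deg
    double tanRight = 0.84; // inward boundary, ~40 deg

    double outward = std::atan(std::fabs(tanLeft)) * radToDeg;
    double inward = std::atan(std::fabs(tanRight)) * radToDeg;
    std::printf("per-eye FoV: %.1f deg\n", outward + inward); // 125.0

    // Mirror the frustum for the right eye: the combined horizontal FoV is
    // outward(left eye) + outward(right eye), which can never reach 180:
    std::printf("combined FoV: %.1f deg (< 180 for any finite tangents)\n",
                2.0 * outward); // 170.0
    return 0;
}
```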
Second, even if Pimax 8k had a total FoV of less than 180° (or were able to tell SteamVR to ignore parts of its displays), and SteamVR were told to render to the correct FoV, there would still be distortion, specifically, keystone distortion, because SteamVR’s projection would be based on parallel screens, while Pimax 8k’s screens are rotated with respect to each other.
The proper way to fix both of these problems is for Valve to add screen rotation angle fields to SteamVR’s HMD specification protocol, so that SteamVR can set up correct projections. This would most probably not even require changes to existing SteamVR applications, as those only receive general projection matrices, which can already represent rotated screens.
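To illustrate why applications could remain unchanged, here is a sketch of how a canted screen reduces to an ordinary asymmetric frustum composed with a fixed rotation; the application only ever consumes the resulting 4×4 matrix. The matrix layout, helper functions, and 30° cant angle are my own illustrative assumptions:

```cpp
#include <cmath>
#include <array>

using Mat4 = std::array<std::array<double, 4>, 4>; // row-major, column vectors

// Standard asymmetric ("off-axis") perspective matrix, glFrustum-style:
Mat4 frustum(double l, double r, double b, double t, double n, double f)
{
    Mat4 m = {};
    m[0][0] = 2.0 * n / (r - l); m[0][2] = (r + l) / (r - l);
    m[1][1] = 2.0 * n / (t - b); m[1][2] = (t + b) / (t - b);
    m[2][2] = -(f + n) / (f - n); m[2][3] = -2.0 * f * n / (f - n);
    m[3][2] = -1.0;
    return m;
}

// Rotation about the vertical (y) axis by the screen's cant angle:
Mat4 rotationY(double radians)
{
    Mat4 m = {};
    double c = std::cos(radians), s = std::sin(radians);
    m[0][0] = c; m[0][2] = s;
    m[1][1] = 1.0;
    m[2][0] = -s; m[2][2] = c;
    m[3][3] = 1.0;
    return m;
}

Mat4 multiply(const Mat4& a, const Mat4& b)
{
    Mat4 m = {};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            for (int k = 0; k < 4; ++k)
                m[i][j] += a[i][k] * b[k][j];
    return m;
}

int main()
{
    // Per-eye projection for a screen canted outward by a made-up 30
    // degrees: rotate eye space into the screen's frame first, then apply
    // the screen's own off-axis frustum. An application receiving the
    // product cannot tell (and need not care) that the screen is rotated.
    double cant = 30.0 * 3.14159265358979 / 180.0;
    Mat4 perEye = multiply(frustum(-1.0, 0.3, -1.0, 1.0, 0.1, 100.0),
                           rotationY(cant));
    (void)perEye;
    return 0;
}
```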
Until then, there is a work-around Pimax could use to at least confine geometric distortion to the outer edges of the periphery, beyond 170° or so. I have been ignoring lenses and lens distortion thus far, but now they come in handy. SteamVR already has a very flexible way for HMDs to communicate the optical properties of their lenses to the SteamVR run-time, concretely, the formulae needed to pre-distort rendered images to correct for the lenses’ non-linear geometric distortions and chromatic aberrations. The stretching resulting from FoV truncation, and the keystone distortion resulting from wrong screen orientation, while not in fact caused by the lenses, can still be baked into the lens distortion correction formulae and fixed that way.
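As a rough sketch of how that could look in a driver (with a made-up cant angle and rendered FoV, the actual lens and chromatic aberration correction elided, and no claim that this is Pimax’s code), the per-pixel remapping hook ComputeDistortion could re-project each physical-screen pixel into the parallel-screen image that SteamVR actually rendered:

```cpp
#include "openvr_driver.h"
#include <cmath>

class MyDisplayComponent : public vr::IVRDisplayComponent
{
    // ... all other IVRDisplayComponent methods elided ...

    vr::DistortionCoordinates_t ComputeDistortion(vr::EVREye eEye,
                                                  float fU, float fV) override
    {
        // Assumed geometry: each screen canted outward by 10 degrees, and
        // SteamVR rendering with a 110-degree horizontal FoV (made up).
        const float cant = 10.0f * 3.14159265f / 180.0f;
        const float halfFovTan =
            std::tan(0.5f * 110.0f * 3.14159265f / 180.0f);

        // Angle this physical pixel subtends, including the screen's cant
        // (the cant's sign flips between the left and the right eye):
        float sign = (eEye == vr::Eye_Left) ? -1.0f : 1.0f;
        float pixelAngle = std::atan((fU * 2.0f - 1.0f) * halfFovTan)
                         + sign * cant;

        // Where that angle falls in the parallel-screen rendered image:
        float u = (std::tan(pixelAngle) / halfFovTan) * 0.5f + 0.5f;

        vr::DistortionCoordinates_t coords;
        // Real code would additionally apply the lens and chromatic
        // aberration correction per color channel here:
        coords.rfRed[0] = coords.rfGreen[0] = coords.rfBlue[0] = u;
        coords.rfRed[1] = coords.rfGreen[1] = coords.rfBlue[1] = fV;
        return coords;
    }
};
```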
Breaking update: A new method (IVRServerDriverHost::TrackedDeviceDisplayTransformUpdated) just appeared in OpenVR’s low-level hardware interface. This seems to be hot off the press, or even still in progress, as the associated interface version number hasn’t been changed yet. I’ll need to investigate how exactly that new method is used, but it could in principle serve to address the angled-screen issue I’ve discussed here. Thanks to redditor /u/AD7GD for pointing it out.
Is there a place in the OpenVR SDK where you can tell it to render to two separate displays and treat them as one coherent unit? I know on the VIVE and Oculus this is done via HW, but it’s gotta be possible, since the InfiniteEye folks did it.
I hardly know anything about the SDK side of OpenVR. I’m working with the driver interface side only.
What is the distortion going to be like for a program like VorpX that doesn’t rely on SteamVR? It allows up to a FoV of 160 degrees on flat screens.
If a Source engine game that doesn’t use SteamVR lets you change the FoV up to 170 degrees, what is it going to look like on a Pimax 8k?
Good question. The biggest problem for something like Vorpx might be that the field of view on Pimax 8k is highly asymmetric, meaning there is a lot more view on the outside of the forward direction than on the inside. Non-VR games probably assume a symmetric view frustum. If you apply a symmetric 160° frustum to a screen that is supposed to be highly asymmetric, you will get a lot of distortion even if the total FoV matches. I do not know if Vorpx can override a game’s internals to this degree.
The same holds for the angle between the left and right screens. It must be taken into account when rendering, and again, non-VR games are not set up to handle it. It would fall to Vorpx, but it would be even harder to pull off than just the high FoV. There is no way to represent a total binocular FoV of >=180° as a standard frustum, as used for frustum culling or view-dependent rendering. This would lead to objects and textures popping in and out of view at some point inside the field of view, as many have reported from games like Fallout or Skyrim, but in Pimax 8k’s case it would be very hard to fix.
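To put a number on that last point: a standard frustum stores the tangent of its half-angle, and that tangent diverges as the half-angle approaches 90°, so no single frustum can cover a 180° or wider field of view. A quick demonstration:

```cpp
#include <cmath>
#include <cstdio>

int main()
{
    const double degToRad = 3.14159265358979 / 180.0;
    for (double halfFov = 60.0; halfFov < 90.01; halfFov += 10.0)
        std::printf("half-angle %2.0f deg -> tan = %g\n",
                    halfFov, std::tan(halfFov * degToRad));
    // tan(90 deg) is infinite (or merely astronomically large due to
    // rounding): the frustum's near-plane rectangle would have to be
    // infinitely wide to span a 180-degree field of view.
    return 0;
}
```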