Quantitative Comparison of VR Headset Fields of View

Although I’ve taken many through-the-lens pictures of several common VR headsets with a calibrated wide-angle camera, until recently I was still struggling with how to compare the resulting fields of view (FoV) quantitatively, how to put them in context, and how to visualize them appropriately. When trying to answer questions like “which VR headset has a bigger FoV?” or “by how much is headset A’s FoV bigger than headset B’s?” or “how does headset C’s FoV compare to the field of vision of the naked human eye?”, the basic question is: how does one even measure field of view in a way that is fair and allows comparison across a wide range of possible sizes? Does one report a single angle, and how does one measure it? Across the “diagonal” of the field of view? What if the field of view is not a rectangle? Does one report a pair of angles, say horizontal⨉vertical? Again, what if the field of view is not a rectangle?

Then, if FoV is measured either as a single angle or a pair of angles, how does one compare different FoVs fairly? If one headset has a 100° FoV and another has 110°, does the latter show 10% more of a virtual 3D environment? What if one has 100°⨉100° and another has 110°⨉110°? Does the latter show 21% more?

To find a reasonable answer, let’s go back to the basics: what does FoV actually measure? The general idea is that FoV measures how much of a virtual 3D environment a user can see at any given instant, meaning, without moving their head. A larger FoV value should mean that a user can see more, and, ideally, an FoV value that is twice as large should mean that a user can see twice as much.

Now, what does it mean that something “can be seen”? We can see something if light from that something reaches our eye, enters the eye through the cornea, pupil, and lens, and finally hits the retina. In principle, light travels towards our eyes from all possible directions, but only some of those directions end up on the retina due to various obstructions (we can’t see behind our heads, for example). So a reasonable measure of field of view (for one eye) would be the total number of different 3D directions from which light reaches that eye’s retina. The problem is that there is an infinite number of different directions from which light can arrive, so simple counting does not work.

Solid Angle

Another way of thinking about the problem is to place an imaginary sphere of some arbitrary radius around the viewer’s eye, such that the sphere’s center coincides with that eye’s pupil. Then there is a one-to-one correspondence between 3D directions and points on that imaginary sphere: each light ray enters the sphere at exactly one point. As a result, instead of counting 3D directions, one can measure FoV as the total area of the set of all points on the sphere that correspond to 3D directions which can be seen by the eye.

As it so happens, if the imaginary sphere’s arbitrary radius is set to one, this is precisely the definition of solid angle. If nothing can be seen, i.e., the set of all “visible” points on the sphere is empty, the area of that set is zero. If everything can be seen, the set of visible points is the full surface of the sphere, which has an area of 4π. If only half of everything can be seen, for example because the viewer is standing on an infinite plane, that viewer’s field of view is 2π, and so forth. As an aside, the surface area of a sphere of radius one is unit-less, but in order to distinguish solid angle values from other unit-less numbers, they are assigned the unit steradian, or sr for short, just as regular (2D) angles, also fundamentally unit-less, are given in radians (rad) or degrees (°).
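For intuition, a circular FoV (a spherical cap) with half-angle θ has the closed-form solid angle 2π(1−cos θ). Here is a minimal sketch that uses that formula to check the limiting cases above and to answer the earlier 100°-vs.-110° question, at least for circular fields:

```python
import math

def cap_solid_angle(full_angle_deg):
    """Solid angle in steradians of a circular FoV (a spherical cap)
    with the given full opening angle: 2*pi*(1 - cos(angle/2))."""
    half = math.radians(full_angle_deg) / 2.0
    return 2.0 * math.pi * (1.0 - math.cos(half))

print(cap_solid_angle(180.0) / math.pi)  # hemisphere: 2.0, i.e., 2*pi sr
print(cap_solid_angle(360.0) / math.pi)  # full sphere: 4.0, i.e., 4*pi sr

# The question from above: circular 110 degree FoV vs. circular 100 degree FoV
print(cap_solid_angle(110.0) / cap_solid_angle(100.0))  # ~1.19
```

So for circular fields, a 110° FoV shows about 19% more than a 100° one, not 10%: another hint that single angles are a misleading way to compare FoVs.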

In summary, solid angle is a solid way to measure FoV: it can measure fields of view of arbitrary shapes and sizes in a single number, and there is a direct linear relationship between that number and the amount of “stuff” that can be seen.

Solid Angle and VR Headset Field of View

So far we have talked about field of vision, i.e., how much of a 3D environment a “naked” eye can see. As seen below, that number is important in itself, but the real question is how to measure the field of view of VR headsets. The general idea is the same: calculate how much of a virtual 3D environment can be seen by a user. However, unlike in a real 3D environment, light from a virtual environment does not arrive at the viewer’s eye from all possible directions. Instead, it only arrives from directions that, when traced backwards from the eye, go through one of the headset’s lenses and end up on the display screen behind that lens. FoV is still calculated the same way, but now backwards: FoV is the area of all points on a unit sphere around the user’s eye that correspond to directions that end up on a screen (assuming that point of the screen is actually used to show image data by the VR pipeline, but that’s another question).

Fortunately, this area can be measured in a rather straightforward manner. Any camera works by projecting a 3D environment onto an imaging surface (a photoplate or a photosensor), specifically by assigning, to each point on the imaging surface, a 3D direction of light entering through the camera’s focal point. In a calibrated camera, this mapping from image points to 3D directions is precisely known (how it is computed is a topic for another post).
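How that mapping is computed is indeed its own topic, but as a bare-bones illustration, this is what the mapping looks like for an ideal, distortion-free pinhole camera (a real wide-angle camera like the one used for these pictures additionally needs a lens distortion model, which is omitted here):

```python
import numpy as np

def pixel_to_direction(u, v, fx, fy, cx, cy):
    """Map image pixel (u, v) to a unit 3D view direction, assuming an
    ideal pinhole camera with focal lengths (fx, fy) and principal
    point (cx, cy), all in pixel units. A real wide-angle lens would
    need an additional distortion (e.g., fisheye) model on top."""
    d = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    return d / np.linalg.norm(d)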

The approach, then, is to place such a calibrated camera, ideally one with a very wide-angle lens, in the same place where a user’s eye would be while the user is wearing a VR headset, and to take a picture of the headset’s screen through one of its lenses. One then looks at each of the picture’s pixels, determines whether a pixel shows some part of the headset’s screen, and sums up the individual solid angles of all pixels that do (that last part is a bit complicated in detail and left as an exercise for the reader). The bottom line, though, is that pictures just like the ones I’ve been taking for a long time are all that’s needed.
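As a rough sketch of that exercise, again under the simplifying assumption of an ideal pinhole camera (a calibrated fisheye camera makes the per-pixel term more involved): a pixel at off-axis angle θ subtends approximately cos³θ/(fx·fy) steradians, so FoV is the sum of that quantity over all screen-showing pixels.

```python
import numpy as np

def fov_solid_angle(screen_mask, fx, fy, cx, cy):
    """Approximate FoV in steradians from a boolean per-pixel mask
    ("does this pixel show the headset's screen?"), assuming an ideal
    pinhole camera with focal lengths (fx, fy) and principal point
    (cx, cy) in pixel units. A pixel at off-axis angle theta subtends
    roughly cos(theta)**3 / (fx * fy) sr."""
    rows, cols = np.nonzero(screen_mask)
    x = (cols + 0.5 - cx) / fx            # pixel centers on the z = 1 plane
    y = (rows + 0.5 - cy) / fy
    cos_theta = 1.0 / np.sqrt(x * x + y * y + 1.0)
    return float(np.sum(cos_theta ** 3)) / (fx * fy)
```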

Visualizing VR Headset Field of View

Calculating a single solid angle for a given headset is nice for quantitative comparisons, but a picture is often worth a thousand words. How, then, can field of view be visualized in a fair manner? After all, field of view is defined as the area of a part of a sphere’s surface, and as everybody who has ever looked at a world map knows, the surface of a sphere cannot be shown on a flat image without introducing distortions. Fortunately, there is a class of map projections that preserve area, meaning that the area of some region in the projected map is proportional to, or ideally the same as, the area of that same region on the sphere itself. Given that solid angle, or sphere area, is a fair measure of FoV, using such an area-preserving projection should result in a fair visualization: if one headset’s FoV is twice as large as that of another, its FoV will appear exactly twice as large in the picture.
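One standard member of that class is the Lambert azimuthal equal-area projection (shown here for illustration; it is not necessarily the exact projection used for this article’s figures). It sends a direction at angle θ from the forward axis to map radius 2·sin(θ/2):

```python
import numpy as np

def lambert_equal_area(d):
    """Project a unit direction vector (x, y, z), with +z being the
    forward viewing direction, to 2D map coordinates via the Lambert
    azimuthal equal-area projection: a direction at angle theta from
    +z lands at map radius 2*sin(theta/2)."""
    x, y, z = d
    theta = np.arccos(np.clip(z, -1.0, 1.0))   # angle from the forward axis
    r = 2.0 * np.sin(theta / 2.0)
    phi = np.arctan2(y, x)                     # azimuth is preserved
    return r * np.cos(phi), r * np.sin(phi)
```

To see why this preserves area: a cap of half-angle θ has sphere area 4π·sin²(θ/2), and it maps to a disk of radius 2·sin(θ/2), whose area is π·(2·sin(θ/2))², the same number.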

Putting It All Together

For this article, I measured the fields of view of three models of VR headset I happened to have at hand: HTC Vive Pro, Oculus Rift CV1, and PlayStation VR. For context, I also measured the “average” human naked-eye field of vision in the same way, based on an established and oft-cited chart (see Figure 1).

Figure 1: Chart of the visual field of the right eye, from Traquair, H.M., “An Introduction to Clinical Perimetry,” chapter 1, p. 4 (1938), via Wikipedia. The diagram is in equidistant azimuthal projection, not the area-preserving projection used elsewhere in this article.

I traced the outer limit of vision in the diagram (which includes eye movement), re-projected that outline into an area-preserving map projection (see Figure 2), and calculated its solid angle, the combined solid angle of both eyes (assuming the two fields of vision are symmetric), and the solid angle of the intersection of both eyes’ fields of view, i.e., the binocular overlap. The values are as follows: one eye: 5.2482 sr (or 1.6705π sr); both eyes: 6.5852 sr (or 2.0961π sr); overlap: 3.9112 sr (or 1.2450π sr). I quoted solid angles both as straight steradians and as multiples of π steradians, in case the latter are easier to visualize: 2π sr is a hemisphere, and 4π sr is a full sphere. Interestingly, the combined field of vision of both eyes is slightly more than a hemisphere. While field of vision differs from person to person, these values are average measurements that can be used to put the FoV values of VR headsets into context.
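A quick cross-check of those numbers (the last digit can differ slightly from the quoted values, because the steradian inputs are themselves already rounded):

```python
import math

one_eye, both_eyes, overlap = 5.2482, 6.5852, 3.9112   # sr, from above

print(f"both eyes: {both_eyes / math.pi:.4f} pi sr")   # ~2.0961
print(f"overlap:   {overlap / both_eyes:.2%}")         # ~59.39% of the combined field
print(f"{both_eyes / (2.0 * math.pi) - 1.0:.1%} beyond a hemisphere")  # ~4.8%
```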

Figure 2: Average human naked-eye field of vision, including eye movement, in an area-preserving projection. Numbers indicate azimuthal angle away from the forward direction. Red: left-eye FoV. Blue: right-eye FoV. Purple: binocular FoV (stereo overlap).

Next, I processed the aforementioned through-the-lens pictures of my three VR headsets in the same way, by tracing the outline of the visible portion of the screen, re-projecting the outline using the same area-preserving map projection, calculating the single-eye, total, and overlap solid angles (see Table 1), and creating diagrams superimposing the fields of view over the average human FoV for context (see Figures 3-5). For each headset, I only used the FoV-maximizing eye relief value, as quoted in the figures. Given that FoV depends strongly on eye relief, one should ideally take a sequence of pictures and tabulate the function FoV(eye relief).
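For the “calculating the solid angles” step, one workable approach (a sketch of the idea, not necessarily the exact implementation used here) is to represent each traced outline as a polygon of unit vectors on the sphere, fan it into triangles, and sum per-triangle solid angles via the Van Oosterom–Strackee formula:

```python
import numpy as np

def spherical_polygon_solid_angle(outline):
    """Solid angle (sr) enclosed by a polygon on the unit sphere, given
    as an (n, 3) array of unit direction vectors along its outline.
    Fans triangles out from outline[0] and sums signed per-triangle
    solid angles via the Van Oosterom-Strackee formula; assumes the
    outline is star-shaped as seen from outline[0], which holds for
    typical roughly-convex FoV outlines."""
    total = 0.0
    a = outline[0]
    for b, c in zip(outline[1:-1], outline[2:]):
        num = np.dot(a, np.cross(b, c))
        den = 1.0 + np.dot(a, b) + np.dot(b, c) + np.dot(c, a)
        total += 2.0 * np.arctan2(num, den)  # signed triangle solid angle
    return abs(total)

# Sanity check: one octant of the sphere is 4*pi/8 = pi/2 sr.
octant = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
print(spherical_polygon_solid_angle(octant))  # ~1.5708
```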

Figure 3: Maximum field of view of HTC Vive Pro (at 8mm eye relief), superimposed over average human FoV using an area-preserving projection. Numbers indicate azimuthal angle away from the forward direction. Red: left-eye FoV. Blue: right-eye FoV. Purple: binocular FoV (stereo overlap).
Figure 4: Maximum field of view of Oculus Rift CV1 (at 15mm eye relief), superimposed over average human FoV using an area-preserving projection. Numbers indicate azimuthal angle away from the forward direction. Red: left-eye FoV. Blue: right-eye FoV. Purple: binocular FoV (stereo overlap).
Figure 5: Maximum field of view of Sony PlayStation VR (at 10mm eye relief), superimposed over average human FoV using an area-preserving projection. Numbers indicate azimuthal angle away from the forward direction. Red: left-eye FoV. Blue: right-eye FoV. Purple: binocular FoV (stereo overlap).

Headset   | Single-eye FoV                 | Combined FoV                   | Overlap FoV                    | Overlap %
Human Eye | 5.2482 sr (1.6705π sr)         | 6.5852 sr (2.0961π sr)         | 3.9112 sr (1.2450π sr)         | 59.39%
Vive Pro  | 2.9300 sr (0.9327π sr, 55.83%) | 3.2076 sr (1.0210π sr, 48.71%) | 2.6524 sr (0.8443π sr, 67.82%) | 82.69%
Rift CV1  | 2.2286 sr (0.7094π sr, 42.46%) | 2.4982 sr (0.7952π sr, 37.94%) | 1.9588 sr (0.6235π sr, 50.08%) | 78.41%
PSVR      | 2.6042 sr (0.8289π sr, 49.62%) | 2.7275 sr (0.8682π sr, 41.42%) | 2.4808 sr (0.7897π sr, 63.43%) | 90.95%

Table 1: Fields of view of the average human (including eye movement) and of three VR headsets. Each FoV measurement is given in steradians and as a multiple of π steradians; for the headsets, the percentage in parentheses relates each measurement to the corresponding average-human value.
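The percentage columns of Table 1 follow directly from the steradian values; this snippet reproduces them (up to rounding, since the inputs are themselves already rounded):

```python
# Solid angles in steradians, copied from Table 1.
human = {"single": 5.2482, "combined": 6.5852, "overlap": 3.9112}
headsets = {
    "Vive Pro": {"single": 2.9300, "combined": 3.2076, "overlap": 2.6524},
    "Rift CV1": {"single": 2.2286, "combined": 2.4982, "overlap": 1.9588},
    "PSVR":     {"single": 2.6042, "combined": 2.7275, "overlap": 2.4808},
}

for name, fov in headsets.items():
    vs_human = {k: fov[k] / human[k] for k in fov}   # percent-of-human values
    stereo = fov["overlap"] / fov["combined"]        # the "Overlap %" column
    print(f"{name}: {vs_human['single']:.2%} / {vs_human['combined']:.2%} / "
          f"{vs_human['overlap']:.2%} of human FoV; stereo overlap {stereo:.2%}")
```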
