Foveated Rendering is one of the most discussed topics in Virtual Reality, promising to
power the headsets of the future. Here, Tom Sengelaub of SensoMotoric Instruments
gives his insight into the tech behind SMI’s latest project.
Tom Sengelaub is SMI’s Manager Solutions Delivery, leading a team that has pioneered the application of eye tracking in both VR and AR. He lives in Berlin and joined the company in 2008.
Innovation in technology has the biggest impact when driven by specific applications rather than pure spec sheet tuning. As an eye tracking company for 25 years, we understand that an important component of complete immersion is perception.
Foveated Rendering allows us to use our knowledge of human perception to save
significant amounts of computational power. When we look at a fully rendered scene, much of the computational power is wasted because our eyes can only take in fine details at the center of our vision. By understanding exactly where the user is looking, we can render only this area with full resolution.
Foveated Rendering exploits the human visual system when rendering a graphically
intense scene. Of our large field of view, only a small fraction is perceived in high detail.
Anything beyond 5 degrees around our center of attention gradually loses resolution,
due to the different concentrations of cones on the retina – the cells responsible for
seeing color and fine details.
The area with a particularly high density of cones, called the fovea, also corresponds to the point of fixation in our field of view. Our eyes move rapidly (up to 1000 deg/sec), calling for technology that interprets the user's point of gaze quickly and efficiently. Our fast, accurate and low latency eye tracking provides the user's gaze almost instantaneously to the rendering pipeline, so that the center of attention always matches the full resolution area in the rendered image.
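As a rough illustration of why that latency matters, here is a back-of-the-envelope sketch: during a fast eye movement, every millisecond of tracking-to-display delay lets the gaze drift away from the region rendered at full resolution. Only the 1000 deg/sec peak speed comes from the figures above; the latency values are assumptions chosen purely for illustration.

```python
# Back-of-the-envelope: how far can the gaze move during the
# tracking-to-display latency? Only the 1000 deg/sec peak saccade
# speed comes from the text; the latency values are illustrative.
PEAK_EYE_SPEED_DEG_PER_S = 1000.0

for latency_ms in (5, 10, 20):   # hypothetical end-to-end latencies
    worst_case_error_deg = PEAK_EYE_SPEED_DEG_PER_S * latency_ms / 1000.0
    print(f"{latency_ms:>2} ms latency -> up to {worst_case_error_deg:.0f} deg of gaze error")
```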
While visual acuity gradually declines from the fovea towards the periphery, it is not possible to map that continuous falloff directly onto current, mostly rasterization-based game engines. Therefore, we work with rendering zones and assign each one a proportional decrease in resolution that matches the visual acuity across the user's field of view. We found that using three rendering zones, with a relative resolution of 100% at the point of gaze and 60% and 20% for the near and far periphery, is optimal when rendering an HD scene in an HMD.
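As a minimal sketch of how such zones might be applied, the snippet below maps a point's angular distance from the tracked gaze to one of three relative resolutions. The 100%/60%/20% values come from the paragraph above; the zone radii are illustrative assumptions (the inner one loosely based on the 5 degree figure quoted earlier), and a real implementation would work in the engine's screen space rather than in visual angle.

```python
import math

# Relative resolution per zone, as described in the paragraph above.
# The angular radii are illustrative assumptions; the inner 5 degree
# radius is loosely based on the acuity figure quoted earlier.
ZONES = [
    (5.0, 1.0),    # within ~5 deg of the gaze point: 100% resolution
    (15.0, 0.6),   # near periphery: 60% (assumed radius)
    (None, 0.2),   # everything beyond that: 20%
]

def relative_resolution(gaze_deg, point_deg):
    """Return the relative shading resolution for a point in the view.

    Both arguments are (azimuth, elevation) angles in degrees; the
    geometry here is deliberately simplified for illustration."""
    dx = point_deg[0] - gaze_deg[0]
    dy = point_deg[1] - gaze_deg[1]
    eccentricity = math.hypot(dx, dy)   # angular distance from the gaze point
    for radius, resolution in ZONES:
        if radius is None or eccentricity <= radius:
            return resolution

# Gaze at the centre of the view, sample a point 12 degrees off-axis:
print(relative_resolution((0.0, 0.0), (12.0, 0.0)))   # -> 0.6
```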
The drop in resolution in a scene using Foveated Rendering is obvious until you put on
the headset. If all the parameters are chosen right, the difference between full resolution
and Foveated Rendering is imperceptible, but the benefit is a 2-4x reduction in computational cost on an HD screen, and an even larger one on the 4K, 8K or even 16K displays we'll see in future HMDs.
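To get a feel for where savings in that range could come from, here is a rough estimate of the shading work under the three zones described above. The zone screen-area fractions are assumptions for illustration only, and the relative resolutions are treated as per-axis scales, so pixel counts fall with their square; the real end-to-end gain also depends on work that foveation cannot reduce.

```python
# Rough estimate of shading work with the three zones described above.
# The screen-area fractions are assumptions for illustration only; the
# relative resolutions are treated as per-axis scales, so the number of
# shaded pixels falls with their square.
zones = [
    # (fraction of screen area, relative per-axis resolution)
    (0.15, 1.0),   # foveal zone, rendered at 100%
    (0.35, 0.6),   # near periphery, 60%
    (0.50, 0.2),   # far periphery, 20%
]

shaded_fraction = sum(area * res ** 2 for area, res in zones)
print(f"Fraction of pixels shaded: {shaded_fraction:.2f}")                # ~0.30
print(f"Approximate shading-cost reduction: {1 / shaded_fraction:.1f}x")  # ~3.4x
```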
Our thanks to Tom Sengelaub for sharing his insights and thoughts on Foveated Rendering.
Current estimates of the point at which the human eye can no longer distinguish a display from reality put the resolution requirement at around 16K per eye. At that point, the vast rendering throughput required to generate so many rapidly refreshing pixels seems very likely to benefit from, and perhaps even require, a solution like foveated rendering to lessen the load on GPUs. Rendering only what the eye needs to see, saving computational power and (for the likely evolution towards a mobile VR future) battery power, seems an almost inevitable requirement.
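For a sense of the scale involved, here is a quick calculation of the raw pixel throughput such a display would demand. The 16K-per-eye figure comes from the estimate above, while the exact pixel dimensions and the 90 Hz refresh rate are assumptions for illustration.

```python
# Raw pixel throughput for a hypothetical 16K-per-eye HMD.
# The pixel dimensions and refresh rate are assumptions for illustration.
width, height = 15360, 8640   # one common definition of "16K"
eyes = 2
refresh_hz = 90               # typical of current PC VR headsets

pixels_per_second = width * height * eyes * refresh_hz
print(f"{pixels_per_second / 1e9:.1f} billion pixels per second")   # ~23.9, before any foveation
```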
Whether Foveated Rendering is beneficial right now, and to what extent, is not quite so clear. We've tried SMI's current Oculus Rift DK2 prototype, and the system works beautifully and (almost) imperceptibly, but hard numbers on how much performance can be gained when the technology is applied to the relatively low resolution, dual display VR headsets found in the consumer Oculus Rift or HTC Vive are as yet unavailable.
Nevertheless, Foveated Rendering with fast, accurate eye tracking is a reality, and that's almost certainly a good thing for next generation VR headsets. SMI have certainly proven themselves ready for VR's future.