SMI, a company working in the field of gaze detection for over 20 years, hit CES this year with an application of their latest 250Hz eye tracking solution coupled with a holy grail for second generation virtual reality – foveated rendering.

SMI’s (SensoMotoric Instruments) history with eye tracking is lengthy, the company having been on the cutting edge of the field for over two decades now. Until now, however, their technologies have been used in any number of fields, from consumer research – studying how people’s eyes are drawn to particular products in a supermarket aisle – all the way to informing the optimal design of eyewear for sporting professionals.

At CES this year, SMI demonstrated their latest 250Hz eye tracking solution integrated with a VR headset. More importantly, however, they demonstrated the eye tracking coupled with foveated rendering, a technology that is generally regarded as vital for next generation VR experiences.

Foveated rendering is an image rendering technique born from the way we look at and process the world around us. Although human vision has a very wide total field of view, we really only focus on a very small segment of that view at any one time. Our eyes rapidly dart from point to point, drawing information from those focal points. In the world of VR that means that, at least in theory, most of the pixels used to render the virtual world to a headset at a constant resolution are largely wasted. There’s not a lot of point drawing all of those pixels in full detail when we only ever take in a small percentage of them at any one time.

Foveated rendering aims to reduce VR rendering load by using gaze detection to tell the VR application where the user is looking, and therefore which area of the view to construct at high visual fidelity. The rest of the image, which falls into our peripheral view, can then be drawn at progressively lower resolutions the further it sits from the current focal point. The technique is largely accepted as necessary if we’re ever to render an image the human eye can’t distinguish from reality, something that requires a resolution in the region of 16K per eye for a 180 degree field of view, according to Oculus’ chief scientist Michael Abrash. That’s a lot of potentially wasted pixels.
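
To put a rough number on that waste, here’s a back-of-the-envelope sketch of my own (the size of the full-detail zone is an assumed figure, and the flat, uniform pixels-per-degree model is a simplification, not anything SMI or Oculus have stated): only around 1% of a 180 degree view would actually need full-resolution rendering.

```python
# Rough, illustrative estimate of how little of a wide-FOV view sits in the
# high-detail foveal region. Assumptions (mine, not SMI's or Oculus'):
# uniform pixels-per-degree, a circular full-detail zone around the gaze
# point, and no lens distortion.
import math

def full_detail_fraction(fov_h_deg=180.0, fov_v_deg=180.0, fovea_radius_deg=10.0):
    """Fraction of the total view that needs full-resolution rendering."""
    fovea_area = math.pi * fovea_radius_deg ** 2   # deg^2
    view_area = fov_h_deg * fov_v_deg              # deg^2
    return fovea_area / view_area

print(f"Full-detail region: ~{full_detail_fraction():.1%} of the view")  # roughly 1%
```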

I met with SMI’s Christian Villwock, Director of OEM Solutions, who showed me the company’s latest technology integrated into a modified Oculus Rift DK2. SMI had replaced the lens assembly inside the headset with a custom one incorporating the tech needed to see where you’re looking. (We’ll have a deep dive on exactly how this works at a later date.)

Firstly, Villwock showed me SMI’s eye tracking solution and demonstrated its speed and accuracy. After calibration (a simple ‘look at the circles’ procedure), your profile information is stored for use by any future application, so calibration is a one-time requirement.
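
For the curious, a ‘look at the circles’ calibration generally boils down to fitting a mapping from the tracker’s raw output to on-screen positions using the known target locations. The sketch below is a generic, textbook least-squares version for illustration only; it is not a description of SMI’s proprietary calibration.

```python
# Generic gaze-calibration sketch: fit a quadratic mapping from raw
# eye-tracker output (e.g. pupil/glint coordinates) to screen coordinates
# via least squares. Illustrative only; not SMI's actual method.
import numpy as np

def _design_matrix(raw_xy):
    x, y = raw_xy[:, 0], raw_xy[:, 1]
    # Terms: [1, x, y, x*y, x^2, y^2] -> needs at least 6 calibration targets
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def fit_calibration(raw_xy, screen_xy):
    """raw_xy, screen_xy: (N, 2) arrays of matched samples taken while the
    user looked at N known calibration circles."""
    coeffs, *_ = np.linalg.lstsq(_design_matrix(raw_xy), screen_xy, rcond=None)
    return coeffs  # shape (6, 2): one column per screen axis

def estimate_gaze(coeffs, raw_xy):
    """Map new raw tracker samples to estimated on-screen gaze points."""
    return _design_matrix(raw_xy) @ coeffs
```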

The first demo comprised a scene with piles of wooden boxes in front of you. A red dot highlights your current gaze point, with different boxes highlighting when looked at. This was very quick and extremely accurate; I could very easily target and follow the edges of the boxes in question with precision. The fun bit? Once you have a box highlighted, hitting the right joypad trigger causes that box to explode high into the air. What impressed, though, was that as the boxes rose higher I could almost unconsciously and near instantly target them and continue the same trick, blasting the box higher into the air. The system was so accurate that, even when the box was a mere few pixels across at hundreds of feet in the air, I was still able to hit it as a target and continue blasting it higher. Seriously impressive stuff.

The best was yet to come though, as Villwock moved me on to their pièce de résistance: foveated rendering. Integrated into the by now well-worn Tuscany tech demo from the Oculus SDK, SMI’s version renders defined portions of the scene at varying levels of detail, arranged as concentric circles around the current gaze point. Think of it like an archery target, with the bullseye representing your focal point, rendered at 100% detail, the next segment at 60% detail, and the final segment at 20% detail.
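
As a rough illustration of that ‘archery target’ scheme, the per-pixel logic boils down to picking a resolution scale based on angular distance from the gaze point. The ring radii below are placeholder values of my own; only the 100% / 60% / 20% detail levels come from the demo as described to me.

```python
# Gaze-based level-of-detail pick mirroring the 'archery target' description.
# Ring radii are placeholder values; the 100% / 60% / 20% detail levels are
# the ones described in SMI's demo.
import math

# (max angular distance from the gaze point in degrees, resolution scale)
RINGS = [
    (10.0, 1.0),           # bullseye: full detail
    (25.0, 0.6),           # middle ring: 60% detail
    (float("inf"), 0.2),   # periphery: 20% detail
]

def detail_scale(pixel_deg, gaze_deg):
    """Resolution scale for a point at `pixel_deg`, given the current gaze
    point `gaze_deg` (both (x, y) angular positions in degrees)."""
    dist = math.hypot(pixel_deg[0] - gaze_deg[0], pixel_deg[1] - gaze_deg[1])
    for max_dist, scale in RINGS:
        if dist <= max_dist:
            return scale
    return RINGS[-1][1]
```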

There were a couple of questions that I had going into this demo.

One: Is the eye tracking to foveated rendering pipeline quick enough to track my eye, shifting that bullseye and its concentric rings of lower detail fast enough for me not to detect it? The answer is ‘yes’, it can – and it really does work well (a rough sketch of the latency budget involved follows after question two).

Two: Can I detect when foveated rendering is on or off? The answer is ‘yes’, but it’s something you really need to look for (or, as it happens, look away for). With SMI’s current system, the lower detail portions of the image sit in your peripheral vision, and for me they caused a slight shimmering at the very edge of my vision. Bear in mind, however, that this is entirely related to the field of view of the image itself and how aggressively that outer region is reduced in detail. In other words, it’s probably a solvable issue, and one that may not even be noticed by many – especially during a more action-packed VR experience.
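
On that first question, some very rough arithmetic helps explain why the pipeline can keep up. These are ballpark figures of my own, not measurements of SMI’s system: at 250Hz a new gaze sample arrives every 4ms, which is small next to the DK2’s roughly 13ms frame time, and the eye is effectively blind for tens of milliseconds around each rapid movement anyway.

```python
# Back-of-the-envelope latency budget for the gaze-to-foveation loop.
# All figures are rough, commonly cited ballpark numbers for illustration,
# not measurements of SMI's pipeline.
GAZE_SAMPLE_MS  = 1000 / 250   # 250Hz eye tracker -> new sample every 4ms
RENDER_FRAME_MS = 1000 / 75    # Oculus Rift DK2 refreshes at 75Hz (~13.3ms)
EXTRA_MS        = 5            # assumed slack for transport and scan-out

total_ms = GAZE_SAMPLE_MS + RENDER_FRAME_MS + EXTRA_MS
print(f"Estimated worst-case gaze-to-photon delay: ~{total_ms:.0f}ms")

# Saccadic suppression blunts visual sensitivity for roughly 20-100ms around
# a rapid eye movement, which is the window the renderer has to relocate the
# high-detail region before the user could notice it lagging behind.
```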

The one thing I could not gauge is, of course, the very thing this technology is designed to deliver: how much performance is gained with foveated rendering engaged versus rendering 100% of the pixels at full fidelity. That measurement will have to wait for another time, but it can’t be ignored – so I wanted to be clear about it.

So, much to my surprise, foveated rendering looks to already be within the grasp of second generation headsets. Villwock told me that SMI are discussing implementations with hardware vendors right now. It does seem clear that, if the second generation of VR headsets is ever to reach resolutions at which display artefacts become imperceptible, eye tracking is a must. SMI seem to have a solution that works right now, which puts them in a strong position as R&D ramps up for VR’s next gen in a couple of years.


Based in the UK, Paul has been immersed in interactive entertainment for the best part of 27 years and has followed advances in gaming with a passionate fervour. His obsession with graphical fidelity over the years has had him branded a ‘graphics whore’ (which he views as the highest compliment) more than once and he holds a particular candle for the dream of the ultimate immersive gaming experience. Having followed and been disappointed by the original VR explosion of the 90s, he then founded RiftVR.com to follow the new and exciting prospect of the rebirth of VR in products like the Oculus Rift. Paul joined forces with Ben to help build the new Road to VR in preparation for what he sees as VR’s coming of age over the next few years.
  • Sebastien Mathieu

    Our freshly pre-ordered, almost-acquired CV1s are already obsolete…

    • towblerone

      I have no issues at all being an early adopter. In 2-3 years I’ll sell my CV1 and buy CV2.

    • JoeD

      every fresh piece of tech is already obsolete the minute you buy it. That’s the nature of technology. Things move much too fast to have the best at any given time.

    • polysix

      CV1 is already obsolete even without new advances, thanks to sticking with a crappy gamepad and a seated experience (welcome to 2 years ago – DK2 owner here, bored of that now). Vive at least is starting down the right path from DAY 1.

      Anyway, yes can’t wait for eye tracking and wireless in gen 2/3 (along with more FOV and high res of course)

      • yag

        Actually Valve is going too fast; we need wireless headsets before even considering roomscale, or it will only be a gimmick.

        • Peter

          2 years after your comment… I just got my first headset since the DK1 (which I only used about twice), a Vive. It’s fantastic, and I’ve used it every day for the past 2 weeks (since I got it). Yes, there is a cable, but it’s minimally annoying. Roomscale is fantastic. Sending the amount of information it needs to send to your PC wirelessly would be a huge technical challenge, plus where are you going to put that heavy, fat battery pack that would surely be required?

  • Tom VR

    Incredible. Wonder if it tracks your blinking, and if it does, can blinking be used as a tool in-game…

    • towblerone

      Imagine horror games that know when your eyes are closed because you’re too afraid to look? NPCs could taunt you and then when you open your eyes, BAM! there could be some monster in your face.

  • VR Sverige

    Yeah, I just love this tech. I saw an early prototype 2 years ago from Tobii and think/hope they are pretty much on the same path as SMI. They were very secretive when asked about future tech at that time.

  • Jeri Haapavuo

    FOVE has done this already. Google it.

    • yag

      No foveated rendering with the FOVE.

  • francoislaberge

    Is foveated rendering really the best use of eye tracking? What about controlling the virtual directions of your eyes for proper convergence when focusing on closer things, did they have demos of that too?

    • brandon9271

      This is a stretch, but they could also move the lenses with servo motors so that the focal length would match the distance of the object you are focusing on. However, it would be complicated and probably not worth the trouble. Nevertheless, I’d be interested to see somebody experiment with it.

  • yag

    250Hz is a great improvement compared to the current cheap eye trackers @ 60Hz.
    But I always read that you need at least 1000Hz to do foveated rendering?…

  • Jonny Cook

    Question for the author of this article: Do you know if the Tuscany demo had auto-aiming enabled? You mentioned that you were able to hit the barrel even when it was only represented by a few pixels on the screen. Is it possible that this was due to some sort of auto-aiming mechanism, and not a good indication of the technology’s accuracy?

    • Pre Seznik

      He says he could very accurately follow the edges of the boxes with the red dot representing his gaze. That doesn’t sound like auto aim.

  • VR Geek

    I think we will see this tech in a consumer HMD this year. It might take 2-3 years to be in every HMD, but no more as it makes such a big difference in so many areas. VR needs it to the point that it will be a standard feature by the end of this year, CV2 will have it I bet.

    • Andrew Jakobs

      Don’t count on it. This year you’ll only see the Vive, the CV1 and the PSVR, and maybe some Chinese knockoffs, but no other real consumer headsets.

  • bar10dr

    What if you move your eye during a frame render? For me this only makes sense with a super high frame rate.

    • pixelblot

      That’s like saying “what if you move your head during a frame render?” Obviously it is a low-latency system: 240Hz, dude. Our monitors today refresh at 60-76Hz standard, 100-120Hz high end, 140Hz top of the line.

  • T.O.

    I’m not sold on the implications of this. Peripheral vision is sometimes necessary for watching multiple entry points in Counter-Strike, for example (obviously not the target for VR, but still). VR is obviously a great tool and has a lot of developmental practicality, but stuff becomes genuinely hard to run when it’s interactable. This technology is not necessary unless the games are hard to run, which means it’s targeted towards very detailed games, even though that market is the one that uses peripheral view the most. – “especially during a more action-packed VR experience.” I definitely have a problem with this statement. It seems disingenuous when talking about first person mediums.

    • pixelblot

      Huge implications, actually. When you only have to render 10-15% of your (tracked) viewing area at full resolution and the outer regions at lower resolution, you gain so much more performance. Better quality games, more framerate = win, not to mention direct eye contact interactivity.