Tobii, a global leader in eye-tracking, announced earlier this year that it was in talks with Sony to include its tech in the upcoming PlayStation VR2. Now the company has confirmed its eye-tracking is integrated in PSVR 2.

Update (July 1st, 2022): Tobii has officially announced that it is providing the eye-tracking technology for PSVR 2. The company says in a press statement that it will receive upfront revenue from the deal starting in 2022, and that the deal is expected to account for more than 10% of Tobii’s revenue for the year.

“PlayStation VR2 establishes a new baseline for immersive virtual reality (VR) entertainment and will enable millions of users across the world to experience the power of eye tracking,” said Anand Srivatsa, Tobii CEO. “Our partnership with Sony Interactive Entertainment (SIE) is continued validation of Tobii’s world-leading technology capabilities to deliver cutting-edge solutions at mass-market scale.”

The original article follows below:

Original Article (February 7th, 2022): Tobii released a short press statement today confirming that negotiations are ongoing, additionally noting that it’s “not commenting on the financial impact of the deal at this time.”

It was first revealed that Sony would include eye-tracking in PSVR 2 back in May 2021, along with word that it would enable foveated rendering on the next-gen headset. Foveated rendering lets the headset render the scene in high detail exactly where you’re looking and at lower detail in your peripheral vision, which essentially lets PSVR 2 save precious compute power for more and better things.
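
To give a rough sense of the idea (an illustrative sketch only, not Sony’s or Tobii’s actual implementation; the region sizes and scale factors below are invented for the example), eye-tracked foveated rendering boils down to choosing a resolution scale for each part of the screen based on its distance from the current gaze point:

```python
import math

def foveation_scale(pixel_uv, gaze_uv, inner_radius=0.10, outer_radius=0.25):
    """Pick a render-resolution scale for a screen position given the gaze point.
    Coordinates are normalized (0..1); the radii and scales are illustrative."""
    dist = math.dist(pixel_uv, gaze_uv)
    if dist < inner_radius:   # foveal region: full resolution
        return 1.0
    if dist < outer_radius:   # transition band: half resolution
        return 0.5
    return 0.25               # periphery: quarter resolution

# A corner of the screen while the user looks at the center of the view
print(foveation_scale((0.9, 0.9), (0.5, 0.5)))  # -> 0.25
```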

Founded in 2001, Tobii has become well known in the industry for its eye-tracking hardware and software stacks. The Sweden-based firm has partnered with VR headset makers over the years and can be found in a number of devices, such as HTC Vive Pro Eye, HP Reverb G2 Omnicept Edition, Pico Neo 2 Eye, Pico Neo 3 Pro Eye, and a number of Qualcomm VRDK reference designs.

It’s still unclear when PSVR 2 is slated to arrive, although it may be positioned to become the first true commercial VR headset to feature eye-tracking—that’s if PSVR 2 isn’t beaten out by Project Cambria, the rumored ‘Quest Pro’ headset from Meta which is also said to include face and eye-tracking.




Well before the first modern XR products hit the market, Scott recognized the potential of the technology and set out to understand and document its growth. He has been professionally reporting on the space for nearly a decade as Editor at Road to VR, authoring more than 4,000 articles on the topic. Scott brings that seasoned insight to his reporting from major industry events across the globe.
  • MeowMix

    TBH, I assumed the PSVR2 would be using Tobii eye tracking. Tobii is probably the only major independent eye tracking solution. The other big players (Google, META, Apple, Microsoft, etc) all have their own R&D in XR including eye tracking. META Reality Labs has a ton of eye tracking research, and they’ve acquired ET companies such as The Eye Tribe.

    That said, it is interesting SONY is considering using a 3rd party solution for their eye tracking. Maybe their inhouse solution isn’t up to the task.

    • Christian Schildwaechter

      It may simply be hard to replace long years of experience. Tobii has been working on eye tracking for a long time, has released a number of products itself or with partners, and has gotten a lot of real-world feedback. I’m not sure if Sony has released an eye-tracking product yet, or whether so far it has mostly been a lab project. We know that eye tracking is hard, partly because people’s eyes differ a lot from one another, and even Tobii still doesn’t have a solution that works for every user.

      If money or company size were the only factors for successfully implementing tracking, hand tracking on the Quest should run circles around any other solution. In reality Ultraleap’s hand tracking proves to be significantly better, even though the products of the original developer Leap Motion were largely unsuccessful and the company had to be sold in 2019 to what is now called Ultraleap for just 10% of its peak value. So a small company with little money, no success, but a lot of experience, can still produce a superior solution.

  • Lhorkan

    Well, there goes the hoped $399 out the window.

    • kool

      This seems like the opposite to me; I’m sure it’s cheaper than making your own.

  • Andrew Jakobs

    “First true commercial VR headset”
    Uhm, no it isn’t… unless all the other headsets like the Vive Pro Eye aren’t commercial headsets. Oh wait, they are…

    • Sven Viking

      Yup. “First mass-market headset” might be better.

  • Bob

    If the launch date of PSVR 2 is slated for the end of 2022 (pre-order notifications are already up on the official website), it does seem awfully late for Sony to start picking out partners to provide the eye tracking solution. Perhaps Tobii is assisting with the software side of things and helping with the tracking algorithm? There’s always pumping up stock prices too by announcing things at the right moment…

    • Andrew Jakobs

      Nah, not really. With the PS4 and PS5 they were also putting the final specs together in February and had a release in November. It might just be that they already have all the tech together and are now finalizing the actual licenses for production.

  • F*** Tobii. They have all of the eye-tracking patents and they’ve been stopping eye tracking from ending up in any reasonably priced VR headset. They’ve been playing gatekeeper to what is otherwise a simple set of tiny cameras and a little software. It’s $5 in parts! We’d have had eye tracking in the CV1 Rift if it wasn’t for their greed!

    • sfmike

      The magic of predatory capitalism.

      • Christian Schildwaechter

        Two-sided coin: it also allowed Meta to spend billions on VR and sell us the Quest 2 pretty much at production cost. Of course they don’t do this only to quickly grow the market, but also to push others out of it and make even more billions later.

        I’m not a fan of overly wide patents, esp. for software, and believe that the tendency to turn federal patent offices into profit centers is a horrible idea. But I’ve worked in clinical research and know how insanely expensive development can be, and so far I haven’t been able to come up with a better solution than time limited monopolies in the form of patents. That also applies to Tobii, who have been working on eye tracking since 2001.

        But patents aren’t supposed to exclude competitors; they come with the obligation to be licensed on “fair terms”. What is really missing is a way to force patent holders to actually apply those fair terms instead of going the Qualcomm way: basically blackmailing everybody, counting on being able to afford getting sued for longer than your customers can afford to sue you, and forcing them into very unfavorable conditions/settlements.

  • Kenny Thompson

    Lots of open questions regarding PSVR 2 eye tracking… John Carmack has been quite skeptical about eye tracking as it relates to foveated rendering… So it’s highly likely there is some hitch in the tech stack. Hope they can make it go. The promise is amazing.

    • silvaring

      So all this Nvidia talk about Dynamic Foveated Rendering being much better thanks to eye tracking is all bollocks?

      • Cless

        It probably improves performance, just don’t think of it as the Hail Mary of VR performance.

        • kool

          I wonder how much of a performance gain foveated rendering will actually achieve. Even if it’s 100% you still have to account for the extra frames and the tracking itself. I’m curious to see how far Cambria will push that old SoC. I’m also curious to see if foveated rendering, Wi-Fi 6 and cloud gaming can make VR game streaming a reality!

          • Rosko

            Well, if you try fixed foveated rendering it doesn’t gain that much. I would say much of the motivation to include this expensive tech is about the tracking data gained rather than the performance. I may be wrong; I hope I am.

  • Sven Viking

    Reposting as including a link has had the original comment stuck awaiting moderation for the past 10 hours:

    Wait, surely they can’t be just starting to negotiate deals for what tech to put into the headset at this point?? What does that mean for estimates that the headset is launching this year and possibly already in manufacturing?

    Even if they put eye-tracking cameras in there with no specific plan for software, and are only now looking for software providers to make use of that predetermined hardware, it seems crazy to me to publicly announce eye-tracked foveated rendering as a feature before having a deal to make it possible?

    Edit: SadlyItsBradley says “The hardware design is finalized. If I had to guess: Sony attempted to use their inhouse algorithms for their IR camera based eye tracking and wasn’t getting the overall reliability they wanted. So they started asking Tobii for help”

    Someone on Reddit also made the point that avoiding potential issues with Tobii’s patents might be a factor.

  • Christian Schildwaechter

    An update on the performance benefits of eye tracking, to go with the update on Tobii’s tech being used:

    In March, Android Central leaked some numbers from a Unity presentation at GDC 2022, where they achieved “up to” a 260% performance increase with eye tracking plus dynamic foveated rendering, compared to a 150% increase with only fixed foveated rendering. These were most likely best-case scenarios. When tested on the popular, very render-heavy VR Alchemy Lab demo, there was still a 130% combined improvement, while on a 4K spaceship demo the GPU improvement shrank to 14%, with a 32% CPU improvement.

    So eye tracking clearly offers performance benefits, but not as much as originally hoped, when people expected 90% less rendering to be required, which would mean eye tracking plus foveated rendering delivering 1000% of the performance of rendering everything at full resolution. YouTube videos from Tobii a few years ago showed frame rate improvements of about 40%, but it was unclear whether those were best-case or average results.

    The effect is somewhat diminished by fixed foveated rendering already providing a boost without the need for eye tracking. In Unity’s best-case PSVR2 scenario, eye tracking effectively improved frame rates by 44% on top of existing foveated rendering. Meta says that fixed foveated rendering on Quest 2 provides a 25% performance boost on average.
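
    To make the arithmetic concrete (treating the quoted percentages as frame rate multipliers, which is an assumption about how the Unity numbers were reported):

    ```python
    # "A 260% performance increase" read as 3.6x the baseline frame rate,
    # "150% with fixed foveated rendering" read as 2.5x (assumed reporting style).
    baseline = 1.0
    fixed_fr = baseline * (1 + 1.50)          # fixed foveated rendering only
    eye_tracked_dfr = baseline * (1 + 2.60)   # eye tracking + dynamic foveated rendering

    gain_on_top_of_ffr = eye_tracked_dfr / fixed_fr - 1
    print(f"{gain_on_top_of_ffr:.0%}")        # -> 44%, the figure quoted above

    # The original "1000%" hope: if only ~10% of the pixels had to be rendered,
    # a naive estimate would be ~10x the frame rate of full-resolution rendering.
    print(1 / 0.10)                           # -> 10.0, i.e. 1000%
    ```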

    • wheeler

      Even if it’s “just” 30% or 50%, that’s still huge. It’s just that people were hyped up on something that was unrealistic (off by a mere factor of 20), as with many other advances that were supposed to have happened by now (for example, varifocal headsets that should supposedly have arrived last year but are now estimated to be another 5 to 6 years out, “hopefully”). I’ll leave the attribution of that irresponsible hype as an exercise for the reader.

      • Hymen Cholo

        The percentage improvements will be more pronounced as FOV increases. PSVR 2 only has a 110-degree FOV. In a, say, 200-degree FOV system, the difference between rendering all that peripheral screen area at full resolution versus foveated should be much more substantial, and that’s when this technology will become necessary.

        • wheeler

          Perhaps at some point, but massively increasing FOV doesn’t appear to be a priority for these companies, and for good reasons. They’re still stuck on simply getting the image to look correct for the current ~100 degrees directly in front of the user (there are still tons of problems to solve there), and dramatically increasing FOV will only make those optical challenges significantly harder (never mind just reaching the hyped-up 140 degrees).

          • david vincent

            Also more FOV = more potential VR sickness

      • Christian Schildwaechter

        It has some consequences for how useful eye tracking could be in mobile headsets. With the PS5 being a computational powerhouse, the extra CPU/GPU resources needed for eye tracking probably don’t amount to much, so taking a 30% or 50% frame rate boost is a no-brainer. Mobile GPUs like the Adreno 650 in the XR2, on the other hand, are much more limited and require simpler geometry, textures and as few light sources as possible at a lower FoV, which somewhat reduces the potential for optimization via foveated rendering. And adding eye tracking carries a much higher relative computational cost there.

        When Meta showed their new Codec Avatars in May, they also presented custom hardware for eye/face tracking, notably a low-power neural processor integrated into a mobile SoC that could hold the complete encoder model adapted for reconstructing a face from partial facial scans. That is necessary because they track the eyes with cameras inside the HMD and the mouth with cameras outside the HMD, while all the “deep fake” models we have used so far for applying a trained image onto a more generic model required having the full face recorded. From their paper it wasn’t clear how much of the extra power required was for the eye tracking and how much for the avatar matching.

        So eye/face tracking will be useful on PSVR2, but it may require extra hardware on e.g. Cambria to not overstrain the already limited power budget.

        • Sven Viking

          I’d expect mobile devices to eventually include dedicated chips for eye tracking processing rather than running it on the main processor.

    • Andrew Jakobs

      Foveated rendering is just a patch to get some improvement. The best situation would be not to need foveated rendering or ‘spacewarp’-like algorithms at all. But sadly, at this point in time, GPU power (or transmission bandwidth, in the case of wireless) is just too low to go without them.
      But I’m glad it is provided on PSVR2, as it will increase the perceived visual fidelity of games instead of requiring developers to further tone down the settings.

      • Christian Schildwaechter

        Well, it is a very clever patch, as it exploits the way our retinas work. We only ever see a small part of the world in detail, and even that depends a lot on where we look. The fovea that foveated rendering is named after is actually a tiny spot in the center of the retina, about 1.5mm in diameter, tightly packed with (color-sensing) cone cells, and only there do we achieve our sharpest vision.

        If you try to make out anything that isn’t straight ahead of you, you experience what is basically natural fixed foveated rendering, with the resolution dropping off immediately. And on the retina itself, the (slower) color cones are denser towards the center, while the (faster) rod cells for brightness detection become more dominant towards the edge. Our brain just reconstructs a high-res image of the world from all the different patches.

        Even if we could see with the same sharpness in every direction, the extra rendering required for peripheral vision would be rather inefficient, as the computational cost climbs steeply with FoV. Combined with the physiology of the human eye, that extra work is pretty much completely wasted. So eye tracking with dynamic foveated rendering is a nice trick, one that can only work in XR and could allow VR games to catch up to the render requirements of pancake games on the same hardware.
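
        Back-of-the-envelope on that FoV cost, assuming a single flat projection plane and holding the angular resolution at the center of the image constant (a deliberately simplified model, not how real headset optics are laid out):

        ```python
        import math

        def relative_pixels(fov_deg, base_fov_deg=110):
            """Pixels a flat projection needs at a given horizontal FoV, relative to
            a 110-degree view, if center angular resolution is held constant.
            Simplified: square aspect ratio, single planar image surface."""
            return (math.tan(math.radians(fov_deg / 2)) /
                    math.tan(math.radians(base_fov_deg / 2))) ** 2

        for fov in (90, 110, 140, 160):
            print(f"{fov:3d} deg -> {relative_pixels(fov):.1f}x the pixels of 110 deg")
        # 90 -> ~0.5x, 140 -> ~3.7x, 160 -> ~15.8x; a planar projection blows up
        # towards 180 degrees, one reason very wide FoV headsets use canted displays.
        ```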

        And pretty much everything in real-time graphics is a cheat anyway, be it frustum culling, which is effectively a very simple form of fixed foveated rendering, occlusion culling ignoring hidden objects, or level-of-detail rendering simply replacing distant objects with low-poly versions or sprites. No GPU we have is even remotely capable of rendering even a tiny slice of reality in 3D voxel space; all our gaming and VR is basically running on a stage with flat parts painted to look real, when in reality it is just a hollow illusion.

        • Andrew Jakobs

          Yeah, I understand, but it’s not like the real world is ‘high resolution’ outside our sharp vision either. And at this point in time, eye tracking (or even full body tracking) is still not as fast as our eyes can move.

          Of course our graphics are major cheats, but in most cases you won’t notice those cheats even if you move your eyes faster than the eye tracking can follow.
          But for now I’m glad these extra cheats using eye tracking can improve the perceived fidelity of current-generation graphics.

          • Sven Viking

            Saccadic masking means it doesn’t need to be as fast as one might think for large eye movements, and the stop location can apparently be predicted significantly in advance during deceleration of the eyeball. Sounds like near-100%-reliable accuracy may be an intractable problem for now though.

      • Sven Viking

        Once GPUs become powerful enough to allow for better graphics without performance “patches”, you’ll still be able to have even better graphics by applying those patches. Even if we somehow reached a sci-fi GPU singularity where anything you wanted to render could be rendered, performance patches would provide power and thermal advantages.

    • Sven Viking

      tested on the popular, very render-heavy VR Alchemy Lab demo, there was still a 130% combined improvement, while on a 4K spaceship demo the GPU improvement shrank to 14%, with a 32% CPU improvement

      Admittedly that’d still open the opportunity for people to make games that would previously have been unreasonably render-heavy.

      Also, yes, references to it being only xx% better than Fixed Foveated Rendering tend to fail to mention that it should also (theoretically) eliminate the major disadvantage of FFR.

      • Christian Schildwaechter

        I’d expect eye tracking to become another tool that developers have to learn how to use effectively. Every new optimization has allowed developers to create more complex worlds; you just have to understand the limits and design levels accordingly. Unity has announced it will provide eye-tracking heat maps in its game engine, letting developers see where people typically look and design for that. A scene where the player searches for something on a desk might crank up the foveated rendering and add more detail to items on the desk, while a scene delivering narrative information might have to dial both back, because most players quickly start looking around the room while listening.
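
        A heat map like that is conceptually simple. As a rough sketch (not Unity’s actual tooling, just binning normalized gaze samples into a coarse grid):

        ```python
        from collections import Counter

        def gaze_heatmap(samples, grid=(16, 9)):
            """Count normalized gaze samples (u, v in 0..1) per grid cell.
            Illustrative only; a real tool would also weight by fixation duration."""
            counts = Counter()
            for u, v in samples:
                x = min(int(u * grid[0]), grid[0] - 1)
                y = min(int(v * grid[1]), grid[1] - 1)
                counts[(x, y)] += 1
            return counts

        # Example: most samples cluster on a desk in the lower middle of the view
        samples = [(0.50, 0.70), (0.52, 0.72), (0.51, 0.69), (0.10, 0.10)]
        print(gaze_heatmap(samples).most_common(1))  # -> [((8, 6), 3)]
        ```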

        As with any other tool, experience will make it more powerful over time. One of the benefits of the very long console life cycles has always been that developers learn to squeeze everything out of the hardware, making games from the end of the cycle way more impressive than launch titles. So far we don’t even know how people will react to the eye tracking being too slow. Maybe the artifacts will be too annoying, limiting its usefulness. Or maybe people won’t even notice them when they are corrected within a few milliseconds, and we can crank it way up.

        Last November a Sony patent turned up describing a type of foveated rendering that reduces geometry complexity instead of the number of pixels rendered. That might solve some problems with artifacts, or prove too difficult to implement. We’ll know in a few years, and by then I’d expect eye tracking to provide average improvements better than what Unity currently gets in its best-case PSVR2 scenarios.

        • Sven Viking

          Yeah there are at least two main things that could make artifacts noticeable — inaccuracy/latency of the eye tracking or the method used for rendering the low-quality areas of the frame. Your peripheral vision might not require high resolution but there are a lot of problems it could notice easily, like crawling or shimmering artifacts. Should all get better with time.

        • silvaring

          Interesting observations re: scene details being tuned to the objects themselves. And do we really think that data harvesting and profiling won’t be a big part of algorithms that predict where we will look next? It’s such a slippery slope… because clearly the eye-tracking hardware in PSVR2 will simply not be capable enough to deal with the rapid eye movements / saccades of human vision operating normally. Instead of improving the hardware and then rolling it out in a safe, privacy-respecting manner, they will probably fall back on cheaper software implementations using big data. Not sure how I feel about this, tbh.

          • Christian Schildwaechter

            Privacy will be a big issue. Not only will eye tracking allow the creation of profiles of the triggers a particular user reacts to, it will also allow designing scenes specifically to manipulate users into doing things without them being aware of it. Both game design and advertising already do this to some extent, but eye tracking and AR/VR/XR in general provide a much tighter feedback loop.

            Simply prohibiting it isn’t really an option, as this type of individualization can also be extremely useful for the user; for example, a game can notice that a player is stuck because they missed a visual hint and help by providing another. We see something similar with access to the passthrough video stream: Meta never allowed developers to access the camera data, while AFAIR HTC initially did as long as the user actively agreed, and then removed it completely from later versions of their SDK. That kills a whole lot of useful applications, like automatically reconstructing objects in the room, or using fiducial markers on real objects to track them in VR, which Varjo uses for some nifty tricks.

            It is a slippery slope. A possible solution could be transparency from early on, with a clear way for users to see which data is processed only locally, which remotely, what type of data is sent back, how it is used, and how long it will be stored. That’s probably way too complex for most people, so just like with tracking cookies most will go with either “accept all” or “reject all”, never really understanding the risks and/or benefits of eye tracking. And I doubt that Meta is interested in raising users’ awareness that a lot of tracking is already happening, given that all the financial trouble they just got into comes from Apple (and to a lesser extent Google) now blocking their excessive tracking by default.

    • david vincent

      Foveated rendering is just a hack for now. Once it becomes an unavoidable standard (indispensable for tackling higher resolutions and FOV), we can hope that new versions of current 3D engines will be built around it. And later, new non-raster engines (think foveated path tracing) able to make it really shine.

  • Biggest name? More like biggest patent trolls! They’ve been destroying every other player in the market. This hardware should cost less than $5 and be in all headsets. It’s JUST a pair of little cameras! Instead these Tobii creeps have been trickling it out to a few select headsets, usually at an obscene premium.