Better Social Avatars
Most social VR applications today appear to show users with realistic eye movements, including blinking, saccades, and object focus, but all of it is faked using animations and programmed logic. This illusion helps avatars appear less robotic, but the actual nonverbal information that would be conveyed in a real face-to-face conversation is lost.
Accurate eye-tracking data can readily be applied to VR avatars to actually show when a user is blinking and where they’re looking. It can also unlock both conscious and unconscious nonverbal communication like winking, squinting, and pupil dilation, and could even be used to infer some emotions like sadness or surprise, which could be reflected on an avatar’s face.
Meta has been pushing the boundary on social avatars with its Quest Pro headset, which features both eye-tracking and mouth-tracking, bringing much more authentic expression to virtual avatars.
Intent & Analytics
Eye-tracking can also be very useful for passively understanding player intent and focus. Consider a developer who is making a horror game where a player wanders through a haunted house. Traditionally the developer might spend a long time crafting a scripted sequence where a monster pops out of a closet as the player enters a certain area, but if the player isn’t looking directly at the closet then they might miss the scare. Eye-tracking input could be used to trigger the event only at the precise moment that the user is looking in the right direction for the maximum scare. Or it could be used to make a shadowy figure pass perfectly by the player but only in their peripheral vision, and make the figure disappear when the user attempts to look directly at it.
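As a sketch of how such a gaze-conditioned trigger might work (the function name, coordinate convention, and 10° cone threshold here are illustrative assumptions, not any engine's actual API), the check boils down to measuring the angle between the gaze ray and the direction from the eye to the object:

```python
import math

def is_gazing_at(gaze_origin, gaze_dir, target_pos, fov_deg=10.0):
    """Return True if target_pos falls within a cone of fov_deg degrees
    around the user's gaze ray. gaze_dir is assumed to be normalized."""
    to_target = [t - o for t, o in zip(target_pos, gaze_origin)]
    dist = math.sqrt(sum(c * c for c in to_target))
    if dist == 0:
        return True
    # Cosine of the angle between the gaze direction and the target direction
    cos_angle = sum(g * t for g, t in zip(gaze_dir, to_target)) / dist
    return cos_angle >= math.cos(math.radians(fov_deg))

# Player at the origin looking down +Z; the 'closet' positions are made up
print(is_gazing_at((0, 0, 0), (0, 0, 1), (0.1, 0, 5)))  # True: almost dead-on
print(is_gazing_at((0, 0, 0), (0, 0, 1), (5, 0, 1)))    # False: far off to the side
```

A game would poll this each frame once the player enters the trigger area, and fire the scare on the first frame the check passes.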
Switchback VR does something even more creative with eye-tracking and horror thanks to PSVR 2: in certain areas of the game there are haunted mannequins that only move when you blink…
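A blink-gated effect like this could be sketched as follows (the class, step size, and `eyes_closed` flag are hypothetical; in practice the blink state would come from whatever eye-tracking API the headset exposes):

```python
class BlinkMannequin:
    """Advance a mannequin toward the player only while their eyes are
    closed, so it never appears to move while being watched."""
    def __init__(self, distance_to_player, step=0.5):
        self.distance = distance_to_player  # meters
        self.step = step                    # meters moved per blink frame

    def update(self, eyes_closed):
        if eyes_closed:
            self.distance = max(0.0, self.distance - self.step)

# Simulate five frames of eye-tracking data: the mannequin only
# advances on the three frames where the eyes are closed
m = BlinkMannequin(distance_to_player=3.0)
for closed in [False, True, False, True, True]:
    m.update(closed)
print(m.distance)  # 1.5
```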
Beyond just using eye-tracking to maximize scares, such passive input can be used to help players achieve greater precision in their virtual environment. In Horizon Call of the Mountain on PSVR 2, for instance, the user's gaze is used as a sort of 'auto aim' to help make long-distance bow shots more accurate.
Tobii, a maker of eye-tracking hardware and software, shows how the same concept can be used to improve the accuracy of throwing in VR. By inferring where the user intends to throw an object based on their gaze, the system alters the trajectory of the thrown object to produce a perfectly accurate throw. While the clip below shows the actual vs. the corrected trajectory for demonstration purposes, in actual usage the correction is completely invisible to the user and feels very natural.
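One plausible way to implement this kind of gaze-assisted correction (a minimal sketch assuming a flat-gravity ballistic model and a crude flight-time estimate; it does not reflect Tobii's actual implementation) is to blend the measured hand velocity toward a velocity that would land the object exactly at the gaze target:

```python
import math

GRAVITY = 9.81  # m/s^2, with the y-axis pointing up

def corrected_throw_velocity(release_pos, raw_velocity, gaze_target, assist=0.5):
    """Blend the hand-measured release velocity toward a ballistic velocity
    that would land the object at the gaze target.
    assist=0 leaves the throw untouched; assist=1 makes it perfect."""
    speed = math.sqrt(sum(v * v for v in raw_velocity))
    dist = math.sqrt(sum((t - p) ** 2 for t, p in zip(gaze_target, release_pos)))
    # Crude flight-time estimate from distance and raw throw speed
    t = max(dist / max(speed, 1e-6), 1e-3)
    # Velocity that reaches the target in time t under gravity
    ideal = (
        (gaze_target[0] - release_pos[0]) / t,
        (gaze_target[1] - release_pos[1]) / t + 0.5 * GRAVITY * t,
        (gaze_target[2] - release_pos[2]) / t,
    )
    return [r + assist * (i - r) for r, i in zip(raw_velocity, ideal)]
```

Setting `assist` between 0 and 1 keeps the correction partial, so the result still reflects the user's own throwing motion rather than snapping every throw to the target.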
Beyond this sort of real-time intent understanding, eye-tracking can also be very useful for analytics. By collecting data about what users are looking at and when, developers can achieve a much deeper understanding of how their applications are being used. For example, eye-tracking data could indicate whether or not users are discovering an important button or visual cue, if their attention is being caught by some unintended part of the environment, if an interface element is going unused, and much more.
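A minimal sketch of the data-collection side (class and object names are illustrative) might accumulate per-object dwell time from per-frame gaze raycasts, which is enough to answer questions like "did anyone ever look at this button?":

```python
from collections import defaultdict

class GazeAnalytics:
    """Accumulate per-object dwell time from per-frame gaze hits.
    record() is called once per frame with the object the gaze ray hit
    (or None if it hit nothing) and the frame's delta time in seconds."""
    def __init__(self):
        self.dwell = defaultdict(float)

    def record(self, hit_object, dt):
        if hit_object is not None:
            self.dwell[hit_object] += dt

    def report(self):
        # Objects sorted by total gaze time, most-looked-at first
        return sorted(self.dwell.items(), key=lambda kv: -kv[1])

# Five simulated frames of gaze-raycast results at 60 FPS
stats = GazeAnalytics()
for hit in ["menu_button", "menu_button", None, "poster", "menu_button"]:
    stats.record(hit, dt=1 / 60)
print(stats.report())  # menu_button first, with three frames of dwell time
```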
Active Input
Eye-tracking can also be useful for active input, allowing users to consciously take advantage of their gaze to make tasks faster and easier. While many XR applications today allow users to 'force pull' objects at a distance by pointing at them and initiating a grab, eye-tracking could make that quicker and more accurate, allowing users to simply look and grab. Using eye-tracking for this task can actually be much more accurate, because our eyes are far better at pointing at distant objects than a laser pointer projected from our hands, whose natural shakiness is amplified over distance.
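A look-and-grab selector could be sketched in the same spirit (the function name and the 5° threshold are assumptions for illustration): among the grabbable objects, pick the one closest in angle to the gaze ray, and select nothing if the user isn't looking near any of them.

```python
import math

def gaze_grab_target(gaze_origin, gaze_dir, objects, max_angle_deg=5.0):
    """Return the name of the grabbable object closest in angle to the
    gaze ray, or None if nothing lies within max_angle_deg of the gaze.
    objects maps names to 3D positions; gaze_dir is assumed normalized."""
    best_name, best_cos = None, math.cos(math.radians(max_angle_deg))
    for name, pos in objects.items():
        to_obj = [p - o for p, o in zip(pos, gaze_origin)]
        dist = math.sqrt(sum(c * c for c in to_obj))
        if dist == 0:
            continue
        cos_angle = sum(g * t for g, t in zip(gaze_dir, to_obj)) / dist
        if cos_angle > best_cos:  # smaller angle than the best so far
            best_name, best_cos = name, cos_angle
    return best_name

# Looking down +Z: the mug is ~3° off the gaze ray, the book ~27° off
objects = {"mug": (0.2, 0, 4), "book": (2, 0, 4)}
print(gaze_grab_target((0, 0, 0), (0, 0, 1), objects))  # mug
```

When the user initiates a grab, the application would then pull whichever object this returns, rather than requiring a hand-held pointer to be held steadily on the target.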
Similar to grabbing objects, eye-tracking input is likely to be helpful for making XR fast and productive, allowing users to press buttons and do other actions much more quickly than if they had to move their body or hands to achieve the same. You can bet that when it comes to XR as a truly productive general computing platform, eye-tracking input will play a major role.
Healthcare & Research
And then there’s a broad range of use-cases for eye-tracking in healthcare and research. Companies like SyncThink are using headsets equipped with eye-tracking to detect concussions, purportedly increasing the efficacy of on-field diagnosis.
Researchers too can use eye-tracking for data collection and input, like getting a look at what role gaze plays in the performance of a professional pianist, better understanding autism’s influence on social eye contact, or bringing accessibility to more people.
– – — – –
Given the range of potential improvements, it’s clear why eye-tracking will be a game changer for AR and VR. While eye-tracking is today available only in premium headsets, eventually the tech is likely to trickle down to become an industry-standard feature.