Image: SMI eye tracking smart glasses
SensoMotoric Instruments (SMI), based in Germany, recently introduced new 3D glasses capable of real-time eye tracking. They are active-shutter glasses like the ones used with a home 3DTV, but they also track the wearer's eyes and accept attachable markers for head tracking, making them ideal for a Cave Automatic Virtual Environment (CAVE). The glasses could enhance such virtual reality experiences by adding another method of input, providing valuable usability metrics, and enabling more precise calibration of displays.

In addition to presenting the glasses earlier this week at the SD&A trade show in San Francisco, California, the company introduced the new glasses (which don’t seem to have a name other than ‘SMI Eye Tracking Glasses’) in a video:

The glasses themselves do nothing but track the user’s gaze, which is achieved with two small cameras pointed at the user’s eyes from the rim. Add-ons enable the active shutter for stereoscopic 3D, and attachable markers are used for head tracking with an external system.

Oliver Kreylos is a PhD virtual reality researcher at the Institute for Data Analysis and Visualization and the W.M. Keck Center for Active Visualization at the University of California, Davis, and he maintains a great blog on the topic. He tells me that the SMI eye tracking glasses could be used to calibrate CAVEs more accurately from one user to the next:

Currently, we only know the position and orientation of the glasses, and assume the position of the pupils (which we actually need) based on that. I usually try to measure my own eye position when wearing a set of head tracked glasses, and leave that as the default for others. But there are differences, inter-ocular distance is only one of them, and being able to measure them automatically whenever someone puts on the glasses would lead to even more stable holographic displays. Also, tracking the pupils during use would allow us to fine-tune the display’s eye positions depending on how much users have their eyes crossed.
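
To make that idea concrete, here is a rough sketch (not SMI’s or Kreylos’ actual code, and all the numbers are invented) of how per-user pupil offsets measured by the glasses could replace a one-size-fits-all default when computing eye positions for a head-tracked display:

```python
import numpy as np

# Illustrative only: compute world-space pupil positions from the tracked pose
# of the glasses plus per-user pupil offsets, instead of a fixed default.
def eye_positions_world(glasses_pos, glasses_rot, left_offset, right_offset):
    """glasses_pos: 3-vector position of the glasses from the head tracker.
    glasses_rot: 3x3 rotation matrix giving the glasses' orientation.
    left_offset / right_offset: pupil offsets in the glasses' local frame (meters)."""
    left_eye = glasses_pos + glasses_rot @ left_offset
    right_eye = glasses_pos + glasses_rot @ right_offset
    return left_eye, right_eye

# A fixed default assumed for every user (e.g. ~65 mm inter-ocular distance)...
default_left = np.array([-0.0325, -0.02, -0.03])
default_right = np.array([0.0325, -0.02, -0.03])

# ...versus offsets measured automatically when this particular user puts the
# glasses on (values made up for illustration).
measured_left = np.array([-0.0310, -0.018, -0.028])
measured_right = np.array([0.0340, -0.019, -0.029])

head_pos = np.array([0.0, 1.7, 0.5])
head_rot = np.eye(3)
left, right = eye_positions_world(head_pos, head_rot, measured_left, measured_right)
```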

Perhaps even more exciting is the ability to use eye tracking as a means of input.


“You could use gaze direction as a ray-based additional input device, say for object selection. There still needs to be a way to trigger an event, but that could be done via voice commands,” Kreylos said.
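
As a rough illustration of what that could look like, here is a minimal sketch that treats the reported gaze direction as a ray from the eye and picks the nearest object whose bounding sphere it hits; the object names and values are purely illustrative, and the trigger would come from a separate event such as a voice command:

```python
import numpy as np

# Illustrative only: gaze-based object selection via a ray/sphere test.
def pick_by_gaze(eye_pos, gaze_dir, objects):
    """Return the name of the object whose bounding sphere the gaze ray hits first.
    objects: list of (name, center, radius) tuples."""
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    best, best_t = None, np.inf
    for name, center, radius in objects:
        oc = center - eye_pos
        t = np.dot(oc, gaze_dir)          # distance along the ray to closest approach
        if t < 0:
            continue                      # object is behind the viewer
        miss_sq = np.dot(oc, oc) - t * t  # squared distance from ray to sphere center
        if miss_sq <= radius * radius and t < best_t:
            best, best_t = name, t
    return best

scene = [("teapot", np.array([0.2, 1.5, -2.0]), 0.15),
         ("molecule", np.array([-0.5, 1.4, -1.5]), 0.25)]
selected = pick_by_gaze(np.array([0.0, 1.6, 0.0]),
                        np.array([0.1, -0.05, -1.0]), scene)
# ...then wait for a trigger, e.g. a recognized voice command, before acting on it.
```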

Currently, Kreylos has virtual reality CAVE systems in which the user can grab holographic objects using a tool held in either hand. With the SMI eye tracking glasses, it would be possible to naturally select up to three objects at the same time: two with the hand tools and a third just by looking at it.

Furthermore, gaze tracking could be used as an invisible input, something that would be useful for game developers. For instance, in a horror game you wouldn’t want a daemon to jump out until the player is looking exactly where the developer wants. This can be achieved to some degree by tracking the player’s reticle, but there’s no guarantee that the position of the reticle matches the player’s gaze.
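
A minimal sketch of how such a gaze-gated event might work, assuming the game can query the eye tracker each frame (the function names and scene values here are hypothetical, not from any real engine):

```python
import numpy as np

# Illustrative only: fire a scripted scare when the player's gaze ray points
# within a few degrees of the trigger position, rather than trusting the reticle.
def is_looking_at(eye_pos, gaze_dir, target_pos, max_angle_deg=5.0):
    to_target = target_pos - eye_pos
    to_target = to_target / np.linalg.norm(to_target)
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    angle = np.degrees(np.arccos(np.clip(np.dot(gaze_dir, to_target), -1.0, 1.0)))
    return angle <= max_angle_deg

def update_scare(eye_pos, gaze_dir, door_pos, daemon_spawned):
    """Per-frame check; spawns the daemon the first time the player looks at the door."""
    if not daemon_spawned and is_looking_at(eye_pos, gaze_dir, door_pos):
        print("spawn daemon at", door_pos)  # stand-in for the real game event
        return True
    return daemon_spawned

spawned = False
spawned = update_scare(np.array([0.0, 1.6, 0.0]),   # player eye position
                       np.array([0.0, 0.0, -1.0]),  # gaze straight ahead
                       np.array([0.2, 1.5, -4.0]),  # closet door position
                       spawned)
```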

Image: Iron Man suit HUD interface

Like the HUD in Iron Man’s suit, eye tracking could also be used to enable more natural interactions with a user interface. HUD elements in your favorite FPS could stay minimized at the edge of the player’s view until looked at. For instance, you might look down at a mini-map to expand it, or look at an enemy to see contextual information like Batman’s detective mode (excuse the superhero theme):

Image: Batman’s ‘Detective Mode’
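
Here is a small sketch of how a gaze-aware HUD element like that mini-map might behave, assuming gaze is reported in normalized screen coordinates; everything here is illustrative rather than an actual SMI API:

```python
# Illustrative only: a mini-map that stays minimized until the tracked gaze
# point (normalized screen coordinates, 0..1) falls inside its rectangle.
class GazeMiniMap:
    def __init__(self, rect=(0.0, 0.0, 0.25, 0.25)):
        self.rect = rect      # (x, y, width, height) in screen space
        self.scale = 1.0      # 1.0 = minimized, 2.0 = fully expanded

    def gaze_inside(self, gaze_xy):
        x, y, w, h = self.rect
        gx, gy = gaze_xy
        return x <= gx <= x + w and y <= gy <= y + h

    def update(self, gaze_xy, dt):
        target = 2.0 if self.gaze_inside(gaze_xy) else 1.0
        # ease toward the target size so the transition isn't jarring
        self.scale += (target - self.scale) * min(1.0, 5.0 * dt)
        return self.scale

minimap = GazeMiniMap()
minimap.update(gaze_xy=(0.10, 0.12), dt=1 / 60)  # player glances at the corner
```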



