At Engadget Expand 2014 in NYC I got to take a look at Fove, a head-mounted display with integrated eye-tracking. The prototype I tried is an early one, but it demonstrates the core technology and the potential benefits of eye-tracking in an HMD.
Fove is a Japanese company working on a consumer head-mounted display with integrated eye-tracking. The company received some press attention recently thanks to its participation in the Microsoft Ventures Accelerator program.
At Engadget Expand I met with Fove’s CEO, Yuka Kojima, who showed me the company’s first prototype head-mounted display. Right now it consists of a lightweight foam housing, two lenses, and two small cameras, on the left and right of the lenses, pointed at each eye. I wasn’t allowed to snap a photo of the inside of the headset, but it appears they’re using all off-the-shelf components in this prototype. The lens and microdisplay assembly appeared to be pulled from a Sony HMZ headset. The infrared cameras on the left and right had fairly wide-angle lenses and were probably taken from a consumer gesture camera.
Given its roughly 45-degree field of view, I would not call the existing Fove prototype a ‘VR headset’; that designation is reserved for HMDs with a much larger field of view. However, Fove intends for their finished product to fall into that immersive category; they say their latest prototype uses a 2.5K display and has a 100-degree field of view. Fove says they’re working on positional tracking as well.
Kojima had the Fove HMD connected to a laptop at Expand that struggled to run the experiences being demonstrated; at 10-15 FPS, it was clear that the company wasn’t concerned with showing a high-fidelity VR experience, they just wanted to show their eye-tracking tech. Unfortunately, it was unclear whether some lag I experienced with the eye-tracking was due to the company’s tech or merely an insufficient computer. We’re hoping to soon get our hands on the company’s latest prototype with a more powerful computer running the demo.
Kojima ran me through a simple calibration sequence which could be easily gamified. A series of white dots were shown on the display and I was asked to look at each one while she clicked the mouse to save the eye position data for each point. It took about 15 seconds to go through the calibration sequence, and when it was over I was asked to look at a green dot in the center of the screen. If the calibration went well, another dot, representing my gaze, would land on or very close to the center dot. If it was miscalibrated, the dot would be far away. It took about three tries to calibrate properly; when we did get a good data set, the tracking seemed very accurate, with the gaze dot landing right on the center point.
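To give a sense of what a calibration like this might be doing under the hood, here’s a minimal sketch of one common approach (my own illustration, assuming numpy; not Fove’s disclosed method): fit a polynomial mapping from camera-space pupil positions to on-screen positions using the saved calibration samples, then use that mapping to estimate gaze at runtime.

```python
import numpy as np

def fit_gaze_mapping(pupil_xy, target_xy):
    """Fit a 2D polynomial mapping from pupil coordinates (from the
    eye camera) to screen coordinates, using calibration samples.
    pupil_xy, target_xy: (N, 2) arrays, one row per calibration dot
    (needs at least 6 dots for this basis)."""
    px, py = pupil_xy[:, 0], pupil_xy[:, 1]
    # Quadratic feature basis: [1, x, y, xy, x^2, y^2]
    A = np.column_stack([np.ones_like(px), px, py, px * py, px**2, py**2])
    # Solve A @ coeffs ~= target_xy in the least-squares sense,
    # independently for screen x and screen y.
    coeffs, *_ = np.linalg.lstsq(A, target_xy, rcond=None)
    return coeffs

def estimate_gaze(pupil, coeffs):
    """Map a single pupil position to an estimated screen position."""
    x, y = pupil
    features = np.array([1.0, x, y, x * y, x**2, y**2])
    return features @ coeffs
```

The green-dot check at the end of Fove’s sequence is effectively a validation pass on a fit like this: if the estimated gaze lands near the known center point, the mapping is good; if not, recalibrate.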
A quick glimpse at Fove’s view of your eye and the calibration process.
After calibrating, I asked Kojima if I could tighten the headset, but she somewhat frantically told me not to, for fear of throwing off the calibration. This makes me wonder if users will need to re-calibrate whenever they remove the headset for a break or make an adjustment. So long as the calibration process becomes faster and more robust, it won’t be a problem, but if users need to recalibrate after every comfort adjustment, it could hurt usability.
Rotational head tracking was built into the headset, but this prototype lacks positional tracking.
Kojima walked me through a few different experiences demonstrating the eye-tracking capabilities of the Fove HMD. The first put me on a dark city street with futuristic-looking super-soldiers lined up before me. Looking at each one caused me to shoot them; one after another they dropped to the ground, blasted by my eyes.
Another demo was essentially the same thing but this time there were ships flying all around me. Looking at them caused me to lock onto them, then a tap of the mouse shot them out of the sky. While I wouldn’t call this kind of eye-based aiming particularly compelling, I suppose it’s a step up from aiming with a reticle attached to your face—something we see today across a number of Oculus Rift demos.
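For the curious, gaze-based target selection like this is straightforward to sketch. The following is illustrative logic, not Fove’s actual code: pick whichever target lies within a small angular threshold of the gaze ray, favoring the closest angular match.

```python
import numpy as np

def pick_gaze_target(gaze_dir, head_pos, targets, max_angle_deg=3.0):
    """Return the id of the target closest to the gaze ray, or None.
    gaze_dir: unit gaze direction in world space (head orientation
    combined with the tracked eye direction).
    targets: list of (id, position) tuples in world space."""
    best_id, best_angle = None, np.radians(max_angle_deg)
    for tid, pos in targets:
        to_target = np.asarray(pos, dtype=float) - head_pos
        to_target /= np.linalg.norm(to_target)
        # Angle between where the eye points and where the target lies.
        angle = np.arccos(np.clip(np.dot(gaze_dir, to_target), -1.0, 1.0))
        if angle < best_angle:
            best_id, best_angle = tid, angle
    return best_id
```

In the ship demo, a lock-on would correspond to this function returning a target id, with the mouse click then triggering the shot.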
The next demo was closer to the applications where eye-tracking will be most useful: gaze-based focus and character interaction. This time I was standing in front of a virtual character in an open field. When I glanced at the background, the foreground blurred. When I glanced at the character, she smiled and came into focus while the background blurred.
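A rough sketch of how such gaze-driven focus might be implemented (again my own illustration, with made-up smoothing and blur parameters): cast a ray along the gaze each frame to find a focal depth, smooth it over time so focus doesn’t snap when the eyes dart, and blur objects in proportion to their distance from that focal plane.

```python
class GazeFocus:
    """Tracks a smoothed focal distance driven by the gaze ray."""

    def __init__(self, smoothing=0.15):
        self.focal_depth = 1.0
        self.smoothing = smoothing

    def update(self, gaze_hit_depth):
        # Low-pass filter the depth of whatever the gaze ray hit.
        self.focal_depth += self.smoothing * (gaze_hit_depth - self.focal_depth)

    def blur_for(self, depth, max_blur=8.0, scale=0.5):
        # Objects far from the focal plane get a larger blur radius.
        return min(max_blur, abs(depth - self.focal_depth) * scale)
```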
The same mechanics could be used for foveated rendering, where the scene is rendered at full fidelity only where the eye is pointed, while regions further from the gaze point are rendered at lower quality (we only see sharply in a very small area), reducing the computing power needed to render the scene. This could be key to pushing virtual reality framerates to the 90 FPS or more that Oculus believes is necessary.
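A toy sketch of the idea (the thresholds here are illustrative; a real system would tune them to the eye’s acuity falloff, since sharp foveal vision spans only a few degrees): pick a render-resolution tier for each region of the screen based on its angular distance from the tracked gaze point.

```python
import math

def eccentricity_deg(px, py, gx, gy, fov_deg=100.0):
    """Approximate angular distance (degrees) between a pixel and the
    gaze point, both in normalized [-1, 1] screen coordinates, for a
    display with the stated horizontal field of view."""
    return math.hypot(px - gx, py - gy) * (fov_deg / 2.0)

def shading_tier(ecc_deg):
    """Pick a render-resolution tier by angular eccentricity."""
    if ecc_deg < 5.0:
        return 1.0    # full resolution at the gaze point
    if ecc_deg < 15.0:
        return 0.5    # half resolution in the near periphery
    return 0.25       # quarter resolution beyond that
```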
If Fove is going to enable foveated rendering, their eye-tracking tech needs to be extremely fast, lest you glance at another part of the scene and see it blurred for a moment until the system catches up with your new gaze direction, a sure immersion breaker. As I mentioned, I experienced noticeable lag in the eye-tracking, though it wasn’t clear whether this was inherent to the system or due to an inadequate computer.
All in all, Fove’s tech works well as a proof of concept. Their next steps will be important: showing that they can succeed in the other key aspects of a VR headset, like comfort, latency, low persistence, and positional tracking, and convincing the world that eye-tracking is necessary for a VR headset (I believe that it is). If Fove can pull it off, their innovation will likely be more about pricing than heretofore unseen tech; like VR headsets of yore, eye-tracking isn’t new, it simply hasn’t been available at a consumer price point. If and when it hits, it could be a game changer.