Qualcomm has debuted an updated version of its VR Headset Reference Design, now with Leap Motion’s new 180-degree hand tracking to bring gesture control to mobile VR headsets. The new headset and Leap Motion tracking module were shown off during last week’s GDC 2017.
Qualcomm’s VR Headset Reference Design has been upgraded to the company’s new Snapdragon 835 mobile platform. The purpose of the headset, which the company calls the VRDK (Virtual Reality Development Kit), is to act as a foundation for Qualcomm’s device partners to make their own VR headsets based on Qualcomm’s mobile computing hardware.
The VRDK is an impressive mobile headset in its own right, featuring a 2560×1440 AMOLED display and built-in inside-out positional tracking driven by internal sensors and a pair of 1280×800 front-facing cameras. We tried out the positional tracking earlier this year and found it to perform quite well.
But when it comes to mobile VR, where the goal is a single, self-contained unit that doesn’t rely on external tracking sensors or beacons, Leap Motion may have found a perfect fit; hand tracking is more immersive than the limited rotation-only controllers we see with Daydream and others (like the newly announced Gear VR controller). Having the tracking entirely on board also means one less piece of equipment to tote around, helping to keep mobile VR portable and easy to use.
Leap Motion identified this sweet spot a while back and has been teasing a new mobile solution to address the field-of-view limitation that came from strapping the company’s original, pre-VR device onto VR headsets. The company formally announced the made-for-VR mobile module in late 2016, and now we’re seeing the first glimpses of its integration into Qualcomm’s newest VRDK, which I got to try out at GDC 2017 last week.
The increased tracking field of view is bolstered by smart tweaks to the hand-tracking software. If I was holding an object and then turned my head, causing my hand to truly leave the tracking module’s field of view, the software would remember that I was holding that object (and in which hand) when it came back into view, and it often re-identified my hand holding the object before it even re-entered the headset’s own field of view. That’s a big improvement over the compelling-but-frustrating experience of the original desktop module.
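To illustrate the general idea (not Leap Motion’s actual implementation), here’s a minimal sketch of how an application might persist a “held object” state across a brief tracking loss. Every class, method, and constant here is a hypothetical assumption for illustration only, not part of any Leap Motion or Qualcomm API.

```python
import time

# Hypothetical sketch: persist a grab while a hand is briefly outside the
# tracking module's field of view. Names and the grace-period value are
# assumptions, not taken from any real SDK.

GRACE_PERIOD_S = 2.0  # how long a lost hand may keep its remembered grab


class HandState:
    def __init__(self, hand_id):
        self.hand_id = hand_id            # e.g. "left" or "right"
        self.held_object = None           # id of the object currently grabbed
        self.last_seen = time.monotonic() # last time this hand was tracked


class GrabPersistenceTracker:
    """Remembers which object each hand was holding while that hand is
    temporarily untracked, and restores the grab when it reappears."""

    def __init__(self):
        self.hands = {}  # hand_id -> HandState

    def on_frame(self, visible_hands):
        """visible_hands: dict of hand_id -> grabbed object id (or None),
        as reported by the tracking system for the current frame."""
        now = time.monotonic()

        # Update hands that are currently tracked.
        for hand_id, grabbed in visible_hands.items():
            state = self.hands.setdefault(hand_id, HandState(hand_id))
            if (grabbed is None and state.held_object is not None
                    and now - state.last_seen <= GRACE_PERIOD_S):
                # Hand re-entered view reported as empty, but it was holding
                # something moments ago: restore the remembered grab.
                grabbed = state.held_object
            state.held_object = grabbed
            state.last_seen = now

        # Forget grabs for hands that have been lost longer than the grace period.
        for state in self.hands.values():
            if (state.hand_id not in visible_hands
                    and now - state.last_seen > GRACE_PERIOD_S):
                state.held_object = None
```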