Oculus finally revealed that Quest will get controllerless hand-tracking input. We got to test the feature at Oculus Connect 6 last week and came away impressed. Hand-tracking will open the door to a range of casual use-cases that will broaden the headset's appeal, but core gaming experiences will still rely on the precision and reliability of controllers.

Hand-tracking is coming to Quest early next year, Facebook announced last week at Oculus Connect 6. The appeal is clear: using your hands instead of controllers makes the headset that much easier to use; instead of learning the placement and functions of buttons and sticks, users will be able to throw on the headset and simply point and touch their way around virtual worlds.

https://www.youtube.com/watch?v=2VkO-Kc3vks

At Connect, Oculus showed off hand-tracking with a demo called Elixir, made by VR studio Magnopus. The setting was a fantastical witch's workshop where I got to poke, prod, and play with spells and magic.

Quest Hand-tracking Demo

It was clear from the outset that the entire demo was designed and built specifically for hand-tracking. The demo revolved around touching, poking, and pinching interactions, but, notably, no 'holding' or direct manipulation of objects except in the case of a pen (a very purposeful choice that I'll discuss more later).

https://gfycat.com/partialkindemperorpenguin

Elixir is a sandbox-ish experience designed to show some of the ways that hand-tracking could be used for gaming interactions. In the experience I was asked to touch my hands to various objects, which would turn them from human hands into magical hands with unique powers. For instance, I was prompted to touch my hand to something that looked like a hot-plate crossed with a grill, which turned my hands into fire-infused hands. Then, when I waved my fire hands over some candles, they were ignited.
Another object I touched turned my hands into the hands of a creature with Wolverine-like claws which extended when I made a fist.

https://gfycat.com/oilyoffensiveamericanshorthair

The coolest and most inventive of these hand transformations came when I dunked my hands into a cauldron and they turned into octopus tentacles! The tentacles were longer than my actual fingers and they were physics-enabled, so when I wiggled my fingers the tentacles wobbled about in a gross but oddly satisfying way.

Designing Around Haptics Limitations

Aside from the hand transformations (and the 'make a fist' gestures which powered them), a 'pinch' gesture was used at several points throughout the experience which, from a design standpoint, served as a sort of virtual button press. For instance, hanging above the cauldron was a liquid dispenser with an eye-dropper top which squirted liquid into the cauldron when squeezed. Elsewhere, a miniature bellows could be squeezed with a pinch to brighten a flame.

https://gfycat.com/embellishedsmoothbluefish

Because the pinching gesture naturally involves touching your own fingers, it provides a sort of self-haptic sensation which feels more natural than pressing a virtual button with no feedback at all. This is the same reason why the magic pen—which was offered to me at the end of the experience to sign my name on a big scroll—was the only object in the experience which could actually be directly held and manipulated. It uses effectively the same pinching gesture as before (mostly replicating a real pen grip), which offers a haptic sensation since you're actually touching your own fingers.
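To see why a pinch makes such a reliable virtual button, it helps to sketch how one is typically detected: measure the distance between the thumb and index fingertips and apply hysteresis so tracking jitter near the threshold doesn't rapidly toggle the state. This is a minimal illustration of the general technique, not Oculus' actual implementation; the threshold values are made up for the example.

```python
import math

# Hysteresis thresholds in meters; illustrative values, not Oculus' tuning.
PINCH_START = 0.02  # fingertips closer than 2 cm -> pinch begins
PINCH_END = 0.04    # fingertips farther than 4 cm -> pinch ends

def distance(a, b):
    """Euclidean distance between two 3D fingertip positions."""
    return math.sqrt(sum((a[i] - b[i]) ** 2 for i in range(3)))

class PinchDetector:
    """Treats a thumb-index pinch as a virtual button press.

    The gap between the start and end thresholds (hysteresis) keeps
    small per-frame tracking noise from flickering the 'button' state.
    """

    def __init__(self):
        self.pinching = False

    def update(self, thumb_tip, index_tip):
        d = distance(thumb_tip, index_tip)
        if not self.pinching and d < PINCH_START:
            self.pinching = True   # "button down"
        elif self.pinching and d > PINCH_END:
            self.pinching = False  # "button up"
        return self.pinching
```

Note that once a pinch begins, the fingertips must separate well past the start threshold before it releases, which mirrors how a physical button has distinct press and release travel.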
https://gfycat.com/slightwarmhatchetfish

A good general purpose 'grab' gesture for arbitrarily sized and shaped objects—as has been attempted with gestures like making a fist or a 'C' shape with your hand—has remained quite elusive, as the lack of feedback just doesn't feel quite right (not to mention issues with consistently detecting the pose to prevent items from being dropped). That said, even a gesture-based 'grab' that doesn't feel immersive could be perfectly useful for non-entertainment applications where immersion isn't a high priority.

Performance & Limitations

I was impressed by the stability of the hand-tracking, which almost never showed the immersion-breaking hand spasms that can occur when computer-vision based hand-tracking systems aren't quite sure what your hands are doing. I was also very impressed with the system's ability to read some of the more challenging hand poses, like when the back of my hand obscured my fingers with my arm held out in front of me. Even small movements of my knuckles peering over the top of my hand were reflected pretty well in virtual reality.

That said, the system is clearly tuned to aggressively hide any mistaken tracking, and does so by making your hands completely disappear when it's uncertain where they are or what pose they're making. I saw this happen often when I was testing the limits of the hand-tracking system; it especially didn't like when my hands were touching or overlapping at all, and would quickly hide them.

In terms of tracking coverage, hands felt well tracked within Quest's visible field of view, but I wasn't able to get a sense of how much further it did or didn't extend from there.
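The hide-when-uncertain behavior described above can be sketched as a simple confidence gate. This is a hypothetical interface and hypothetical threshold values, assuming only that the tracker reports some per-frame confidence score; the real runtime's tuning isn't public.

```python
# Confidence-gated hand rendering: hide the hand entirely rather than
# render a low-confidence (possibly "spasming") pose guess.
SHOW_THRESHOLD = 0.8  # confidence needed to (re)show a hand; illustrative
HIDE_THRESHOLD = 0.5  # below this, hide the hand; illustrative

class HandVisibility:
    def __init__(self):
        self.visible = False

    def update(self, confidence):
        """Given a 0.0-1.0 tracking confidence for this frame,
        decide whether the virtual hand should be drawn."""
        if self.visible and confidence < HIDE_THRESHOLD:
            self.visible = False  # e.g. hands touching or overlapping
        elif not self.visible and confidence >= SHOW_THRESHOLD:
            self.visible = True
        return self.visible
```

Requiring higher confidence to reappear than to disappear matches the aggressive tuning described above: a briefly-wrong hand is more immersion-breaking than a briefly-missing one.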
When it comes to performance and technical capability, hand-tracking on Oculus Quest felt immersive and entirely up to the task of a casual built-for-hand-tracking experience like Elixir, and surely up to operating basic functions of the headset like navigating menus and launching content. Oculus will likely improve hand-tracking over time, and will hopefully be able to dial back the frequency of hand disappearances.

Less Friction Means a Wider Appeal for Quest

For those who aren't used to gamepads, understanding how to use a VR controller continues to be a major point of friction in using VR effectively. The ability to use the headset simply by pointing and touching with your hands could significantly broaden Quest's appeal and be a major boon for use-cases where the precision and reliability of controllers isn't necessary and where immersion isn't the top priority, like this training experience which Oculus also demoed at Connect:

https://gfycat.com/idolizedgravegavial

Casual users who want to watch media, browse the web, or join a virtual chat room with friends will be well served by hand-tracking without bothering with controllers. Similarly, commercial and enterprise uses of VR will make good use of the reduced friction of hand-tracking in areas like training, marketing experiences, design review, remote meetings, and much more.

Core Games Will Still Rely on Controllers

Image courtesy Oculus

That said, controllers are very likely to remain the dominant form of input for core gaming experiences because of their precision, reliability, and wealth of inputs. Hand-tracking will work well for casual experiences where users are poking, touching, or pinching, but for games where players hold and interact with complex objects (like guns, bows, swords, grenades, levers, etc.), controllers will continue to offer much more depth.
One reason is that, with hand-tracking, the headset only knows your intended input when your hands are clearly visible. This makes it difficult for the system to know whether you're choosing to continue holding an object when your hand moves outside of tracking range. A controller, however, can continue to transmit that the 'grab' button is held whether or not the headset's cameras can see it.

Another reason is that moving around inside VR environments often makes use of sticks, buttons, or triggers, and in core gaming scenarios, players expect this to be consistent and precise. Without a highly reliable input, moving around quickly and effectively would be challenging. Similarly, many games have a high frequency of inputs (like pulling the trigger of a gun), and these must be 99.9% consistent to prevent frustration. In a game where you might trigger some input (be it buttons, sticks, or triggers) 500 times during a session, even 90% reliability would mean 50 failed inputs—put any controller in a core gamer's hands with buttons that only work 90% of the time and you have a recipe for a broken TV screen! Hand-tracking gestures must be very deliberate in order to be consistently recognized, and it has been a persistent challenge for any computer-vision based hand-tracking technology so far to achieve 99.9% consistency for the sorts of binary 'yes/no', 'on/off' inputs which are used liberally in core gaming experiences.

These are just a few reasons why content built for hand-tracking necessitates different design choices than content built for controllers, and why hand-tracking is well suited to casual input scenarios while controllers will continue to see significant use for gaming. Indeed, Oculus itself is not positioning hand-tracking as a next-gen input paradigm designed to replace controllers on Quest.
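The failure-rate arithmetic above scales straightforwardly, and a quick sketch makes the gap between 90% and 99.9% reliability concrete (the numbers are the article's illustrative example, not measured data):

```python
def expected_failures(inputs_per_session, reliability):
    """Expected number of failed inputs, given a per-input success rate."""
    return inputs_per_session * (1.0 - reliability)

# The article's example: roughly 500 inputs in a session.
print(round(expected_failures(500, 0.90)))      # 90% reliable -> ~50 failures
print(round(expected_failures(500, 0.999), 1))  # 99.9% reliable -> ~0.5 failures
```

At 99.9% reliability a player sees a failed input maybe once every couple of sessions; at 90%, dozens per session, which is why the consistency bar for core gaming input is so high.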
At Connect, a member of the team working on hand-tracking told me that they see it as an option for developers, but not a replacement for controllers. Quest is due to get its hand-tracking update sometime in early 2020, and while there haven't been any official announcements yet, the company told us it's considering bringing the feature to Rift S as well.