Independent ‘Orion’ Video Shows Massive Improvements in Leap Motion Hand Tracking


This week Leap Motion released ‘Orion’, a brand new made-for-VR hand tracking engine that works with their existing devices. Time and again we’ve seen promo videos from the company showing great tracking performance, only to find it less than ideal across the broad range of real-world conditions. Now an independent video showing the Orion engine in action demonstrates Leap Motion’s impressive mastery of computer vision.

Leap Motion has always been neat, and it’s always worked fairly well. Unfortunately, in the world of input, “fairly well” doesn’t cut it. Even at 99% consistency, input becomes frustrating (imagine your mouse failing to click 1 out of every 100 times). That’s been a major challenge for Leap Motion, which has taken on the admittedly difficult computer vision task of extracting a useful 3D model of your hands from a series of images.
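To put that 99% figure in perspective, here’s a quick back-of-the-envelope calculation. The per-action reliability and session length below are illustrative assumptions, not measurements from Leap Motion:

```python
# Back-of-the-envelope: how often does "99% reliable" input fail in practice?
# Both numbers are illustrative assumptions, not Leap Motion measurements.
per_action_reliability = 0.99   # 1 failure per 100 actions
actions_per_session = 500       # e.g. grabs/pinches over a long VR session

expected_failures = actions_per_session * (1 - per_action_reliability)
p_flawless_session = per_action_reliability ** actions_per_session

print(f"Expected failures per session: {expected_failures:.0f}")             # ~5
print(f"Chance of a completely flawless session: {p_flawless_session:.1%}")  # ~0.7%
```

Five misfires in a single session is more than enough to break the sense that your virtual hands are really yours.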

[Image: Leap Motion’s Orion ‘Blocks’ demo]
See Also: Leap Motion Launches Overhauled Hand Tracking Engine That’s Made for VR

The company has been working for nearly a year on a total revamp of its hand tracking engine, dubbed ‘Orion’, which it released this week in beta form for developers. The new engine works on existing Leap Motion controllers.

A video from YouTube user haru1688 shows Orion running on the Leap Motion controller mounted to an Oculus Rift DK2. Importantly, this video was not staged for ideal conditions (as official videos surely are). Even so, we see impressively robust performance.

[gfycat data_id="MindlessOrderlyAssassinbug" data_autoplay=true data_controls=false]

Hands are detected with incredible speed, and tracking seems immune to rapid movement. We see almost no ‘popping’ of fingers, where the system misunderstands what it’s seeing and shows the 3D representation of the fingers in impossible positions. There’s also significantly less loss of tracking in general, even when fingers are occluded or the hands are seen edge-on.
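Whatever residual glitches remain are typically masked on the application side by gating on the per-hand confidence value the SDK reports. Here’s a minimal sketch, assuming the v2-era Python bindings; the `stable_hands` helper and the threshold value are hypothetical, chosen only for illustration:

```python
# Sketch: ignore low-confidence hand data to hide residual 'popping'.
# Assumes the v2-era Leap Motion Python bindings, where each Hand reports a
# confidence value from 0.0 (poor model fit) to 1.0 (high-quality fit).
import Leap

controller = Leap.Controller()
CONFIDENCE_THRESHOLD = 0.4  # hypothetical value; tune per application

def stable_hands():
    """Return only the hands the tracker is reasonably sure about."""
    frame = controller.frame()
    return [hand for hand in frame.hands
            if hand.is_valid and hand.confidence >= CONFIDENCE_THRESHOLD]
```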

Clapping still seems to be one area where the system gets confused, but Leap Motion is aware of this and presumably continuing to refine the interaction before moving Orion out of beta.

Compare the above video to footage from the same YouTube user captured with a much older version of Leap’s hand tracking software (v2.1.1). There we see lots of popping and frequent loss of the hands entirely.

[gfycat data_id="GiantTepidDutchsmoushond" data_autoplay=true data_controls=false]

The original Leap Motion hand tracking engine was designed to look up at a user’s hands from below, with the camera sitting on a desk. But as the company realized the opportunity for its tech in the VR and AR sectors, it became fully committed to a new tracking paradigm in which the camera is mounted on the user’s head.
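On the software side, the head-mounted paradigm is something applications opt into: the SDK exposes a policy flag telling the tracking service the device is facing outward from an HMD rather than sitting face-up on a desk. A minimal sketch, assuming the v2-era Python bindings:

```python
# Sketch: tell the Leap Motion service the controller is mounted on an HMD
# rather than lying face-up on a desk, so tracking is optimized for that view.
# Assumes the v2-era Python bindings and their POLICY_OPTIMIZE_HMD flag.
import Leap

controller = Leap.Controller()
controller.set_policy(Leap.Controller.POLICY_OPTIMIZE_HMD)
```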

Because the tracking engine wasn’t made specifically for this use-case, there are “four or five major failure cases attributed to [our original tracking] not being built from the ground up for VR,” Leap CEO Michael Buckwald told me.

“For Orion we had a huge focus on general tracking and robustness improvements. We want to let people do those sorts of complicated but fundamental actions like grabbing which are at the core of how we interact with the physical world around us but are hard to do in virtual reality,” he said.
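In code, that ‘grabbing’ action surfaces as a per-hand grab strength that applications threshold themselves. A minimal sketch, assuming the v2-era Python bindings; the `poll_grab` helper and the 0.8 threshold are hypothetical:

```python
# Sketch: detect a grab by thresholding the per-hand grab strength.
# Assumes the v2-era Leap Motion Python bindings; grab_strength runs from
# 0.0 (fully open hand) to 1.0 (closed fist).
import Leap

controller = Leap.Controller()

def poll_grab(threshold=0.8):
    """Print any hand currently making a fist-like grab."""
    frame = controller.frame()
    for hand in frame.hands:
        if hand.grab_strength > threshold:
            side = "left" if hand.is_left else "right"
            print(f"Grab detected: {side} hand at {hand.palm_position}")
```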

  • user

    Well, the main problem I see with these things is: you can only act with your hands if you can see them. You can’t actually throw something the way you’re used to.

    But this might change if you have a tracking system that’s separate from the headset, or a second one that could assist.

    • Kai2591

      Yeah, exactly. Also, ‘grabbing’ something from behind (say, a virtual ‘gun’), either over the shoulder or from behind the waist, etc., would not be possible.

      But hey, tech continues to get better. I’m sure there will come a perfect solution :)

      • yag

        Oculus or Valve will probably find a better solution before that…

        • Chris Wren

          Keep in mind this is hardware from 2012. The fact that we can still see massive improvements from just the software update is amazing. If you’ve got a DK2 and Leap, check out: https://www.wearvr.com/apps/rainbow-jellies