Engine Modifications Made in Unreal

This is where it’s about to get a bit more technical… buckle up!

We’re currently using Unreal Engine 4.27.2 for Myst. Meta provides some base code for hand tracking in Unreal, but not enough for a game like ours, which requires better gesture and confidence detection. Much of Meta’s gesture detection and physicality libraries for hand tracking are only available in Unity at this time, so we needed to do some groundwork on that front to get simple gestures like our ‘thumbs-up’ and ‘finger pointing’ gestures recognized in Unreal.

Additionally, there are some other elements folks will need to implement themselves for a shippable hand tracking project, which I’ll detail below.

System Gesture Spam Fix

The Oculus left-hand system gesture (the pinch that opens the menu) will trigger as soon as you begin a pinch, rather than waiting to confirm the pinch has been held in the same state for a period of time. We fixed this by changing the event in the Oculus input library to wait for the pinch to complete (that is, for the system gesture to fill in its confirmation circle) before firing off a notify event, instead of firing while the pinch is still in progress.
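For illustration, the fix amounts to a debounce on the pinch state. Here’s a minimal sketch of the shape of that change, assuming a per-frame update with delta time; the confirmation duration and the OnConfirmed callback are placeholders, not the plugin’s actual symbols:

```cpp
// A minimal debounce sketch, not the plugin's actual code: only fire the
// system gesture notify once the pinch has been held continuously for the
// confirmation period (roughly the time the gesture's circle takes to fill).
// 'PinchConfirmSeconds' and 'OnConfirmed' are illustrative placeholders.

#include "Templates/Function.h"

struct FSystemGestureDebounce
{
    float HeldSeconds = 0.0f;
    float PinchConfirmSeconds = 0.6f;   // assumed fill time; tune to match the UI
    bool  bNotified = false;
    TFunction<void()> OnConfirmed;      // stands in for the plugin's notify event

    void Update(bool bPinchInProgress, float DeltaTime)
    {
        if (!bPinchInProgress)
        {
            HeldSeconds = 0.0f;         // any break in the pinch resets the timer
            bNotified = false;
            return;
        }

        HeldSeconds += DeltaTime;
        if (!bNotified && HeldSeconds >= PinchConfirmSeconds)
        {
            bNotified = true;           // fire once per confirmed pinch
            if (OnConfirmed)
            {
                OnConfirmed();
            }
        }
    }
};
```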

Multi-Platform Build Stability

Serializing the Oculus Hand Component into any Blueprint will cause builds for other platforms (such as Xbox) to break during the nativization step. This is because the Oculus Hand Component is part of the OculusVR plugin, which is only enabled for Windows and Android, so none of its components can be referenced in Blueprints when other platforms are built.

Nativization isn’t officially supported in Unreal Engine 5, but for folks on Unreal Engine 4 it may still be beneficial to keep it enabled, depending on your project’s needs. With nativization on, then, it isn’t feasible to include a hand component at the Blueprint level in a game that’s packaged for all platforms.


Our solution is to spawn and destroy the Oculus Hand Component in C++ on our Touch controllers whenever hand tracking is detected as enabled or disabled, and to enable this functionality only in Android builds made for Quest. The Hand Component source and all of our hand tracking code are excluded from every other platform.
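In sketch form, that looks something like the following. UOculusHandComponent and UOculusInputFunctionLibrary::IsHandTrackingEnabled() come from the 4.27 OculusVR plugin, but the actor class and member names here are hypothetical, and this is only a sketch of the pattern, not our shipped code:

```cpp
// Illustrative sketch, not our shipped code. 'HandComp' is assumed to be
// declared in the header as a plain 'UPROPERTY() UActorComponent* HandComp;'
// so that non-Android builds never reference the plugin type.

#if PLATFORM_ANDROID
#include "OculusHandComponent.h"          // OculusVR plugin (Windows/Android only)
#include "OculusInputFunctionLibrary.h"
#endif

void AMyTouchController::UpdateHandComponent()
{
#if PLATFORM_ANDROID
    const bool bHandsEnabled = UOculusInputFunctionLibrary::IsHandTrackingEnabled();

    if (bHandsEnabled && HandComp == nullptr)
    {
        // The runtime switched to hands: spawn and attach the hand component.
        // (A real implementation would also set which hand this represents.)
        UOculusHandComponent* NewHand =
            NewObject<UOculusHandComponent>(this, TEXT("OculusHand"));
        NewHand->RegisterComponent();
        NewHand->AttachToComponent(GetRootComponent(),
            FAttachmentTransformRules::SnapToTargetNotIncludingScale);
        HandComp = NewHand;
    }
    else if (!bHandsEnabled && HandComp != nullptr)
    {
        // Touch controllers are back: tear the hand component down again.
        HandComp->DestroyComponent();
        HandComp = nullptr;
    }
#endif
}
```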

Unfortunately, this means that if you have a Blueprints-only project that targets multiple platforms and uses nativization in Unreal Engine 4, and you’re considering implementing hand tracking for Quest, you may have to convert your project to a source project to avoid nativization issues when building for platforms other than Quest.

Custom Whole-Hand Gesture Recognition

There’s no meaningful whole-hand gesture recognition built into the Oculus input library in Unreal (other than the finger pinches used for system gestures). This means that if you make a thumbs-up gesture, or a finger-pointing gesture that requires all the other fingers to be tucked in, nothing built into Unreal will notify you that the gesture is happening.

Our solution for this was to implement our own bone rotation detection in the Oculus Hand Component, with adjustable tolerances, to infer when:

  • A finger point (with the index finger) is occurring
  • A grab is occurring
  • A thumbs-up gesture is occurring

All of them get fired off as input events that we can bind to in C++, which is where we house most of our base player controller, character, and Touch controller code.
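As a simplified sketch of what that classification step can look like, assume a normalized 0-to-1 ‘curl’ value per finger has already been derived from the bone rotations; the struct names and threshold values below are illustrative, not our shipped code:

```cpp
// Simplified gesture classification sketch. Curl values are assumed to be
// derived elsewhere from the hand skeleton's bone rotations.

struct FHandCurls
{
    // 0 = fully extended, 1 = fully curled.
    float Thumb = 0.f;
    float Index = 0.f;
    float Middle = 0.f;
    float Ring = 0.f;
    float Pinky = 0.f;
};

struct FGestureTolerances
{
    float CurledAbove = 0.7f;    // a finger counts as "tucked in" above this
    float StraightBelow = 0.3f;  // a finger counts as "extended" below this
};

static bool IsPointing(const FHandCurls& C, const FGestureTolerances& T)
{
    // Index finger extended, all other fingers tucked in.
    return C.Index < T.StraightBelow
        && C.Middle > T.CurledAbove
        && C.Ring > T.CurledAbove
        && C.Pinky > T.CurledAbove;
}

static bool IsThumbsUp(const FHandCurls& C, const FGestureTolerances& T)
{
    // Thumb extended, all four fingers tucked in.
    return C.Thumb < T.StraightBelow
        && C.Index > T.CurledAbove
        && C.Middle > T.CurledAbove
        && C.Ring > T.CurledAbove
        && C.Pinky > T.CurledAbove;
}

static bool IsGrabbing(const FHandCurls& C, const FGestureTolerances& T)
{
    // All four fingers curled past the grab threshold.
    return C.Index > T.CurledAbove
        && C.Middle > T.CurledAbove
        && C.Ring > T.CurledAbove
        && C.Pinky > T.CurledAbove;
}
```

Keeping the tolerances in a small struct like this also makes them easy to tune per finger and per gesture, which becomes relevant in the next section.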

Gesture & Tracking Stability Adjustments

When implementing and testing hand tracking support for Myst in Unreal, we noticed some quirks in tracking stability for certain fingers when they’re occluded by the rest of the hand. For example, if you’re:

  • Grabbing something with all of your fingers facing away from you
  • Pointing your index finger directly away from you

In the case of grabbing something with all of your fingers facing away from you, we noted that hand tracking may occasionally think your pinky finger isn’t enclosed in your fist, as if it had been relaxed slightly. In fact, tracking accuracy for all fingers in a closed fist, with the fingers occluded by the back of the hand, isn’t particularly high, even when the system doesn’t consider tracking confidence to be low. This is an issue when we expect the player to grab onto things like valves or levers and then turn or move them without letting go too quickly: the hand tracking system might decide you’re no longer grabbing the object because it thinks you relaxed your fingers.


In the case of pointing your index finger directly away from you, the hand tracking system would sometimes consider your middle finger to be more relaxed than it actually was, or even consider it to be pointing along with your index finger. This was an issue for our navigation and movement systems. If the system no longer thinks you’re pointing with just your index finger, it might instead decide you’re trying to grab something and stop you from moving, or unexpectedly execute a teleport you weren’t ready to initiate.

Our solution for both of these scenarios was to add individual finger thresholds for how relaxed the problematic fingers are allowed to get before we consider a hand ‘not grabbing’ or ‘not pointing’. More often than not, the tracking system thought fingers were more relaxed than they actually were, rather than the other way around. We built these thresholds right into the place where we decide to notify the user of the gestures the hand is making: the Oculus Hand Component.
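Continuing the illustrative sketch from the gesture section (reusing the hypothetical FHandCurls struct), a release check with per-finger slack might look like this; all values are made up for illustration:

```cpp
// Per-finger release thresholds: occlusion-prone fingers get looser limits
// before we drop the gesture. Values are illustrative, not shipped tuning.

struct FReleaseThresholds
{
    float DefaultMinCurl = 0.6f;             // below this, a finger counts as released
    float PinkyMinCurl = 0.35f;              // extra slack for the occluded pinky
    float MiddleMinCurlWhilePointing = 0.4f; // same idea for the point gesture
};

static bool IsStillGrabbing(const FHandCurls& C, const FReleaseThresholds& T)
{
    // Only drop the grab once fingers relax past their individual thresholds;
    // the pinky's looser threshold absorbs its occluded-tracking jitter.
    return C.Index > T.DefaultMinCurl
        && C.Middle > T.DefaultMinCurl
        && C.Ring > T.DefaultMinCurl
        && C.Pinky > T.PinkyMinCurl;
}
```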

Other Handy Utilities for Oculus Hand Component

We made plenty of modifications to the Oculus Hand Component source for our own custom gesture recognition, but we also added some utility functions to it. One of them returns the closest point on the hand’s collision to an arbitrary point in world space, along with the name of the closest bone. We used this function for a variety of input verifications across different interactions.
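Here’s a rough sketch of that utility’s shape. It leans on the engine’s UPrimitiveComponent::GetClosestPointOnCollision to query each candidate bone’s physics body; the function name, the candidate-bone list, and the exact return-value handling are assumptions rather than the shipped code:

```cpp
// Sketch: find the closest point on the hand's collision to a world-space
// point, and report which bone it belongs to. Assumes the hand mesh has
// per-bone physics bodies and that 'CandidateBones' lists them.

#include "Components/SkinnedMeshComponent.h"
#include "Math/NumericLimits.h"

static FVector GetClosestPointOnHand(
    USkinnedMeshComponent* HandMesh,
    const FVector& WorldPoint,
    const TArray<FName>& CandidateBones,
    FName& OutClosestBone)
{
    FVector BestPoint = WorldPoint;
    float BestDistSq = TNumericLimits<float>::Max();
    OutClosestBone = NAME_None;

    for (const FName Bone : CandidateBones)
    {
        FVector PointOnBody;
        // A negative return means no valid body for this bone
        // (verify the exact semantics against your engine version).
        const float Dist =
            HandMesh->GetClosestPointOnCollision(WorldPoint, PointOnBody, Bone);
        if (Dist >= 0.f)
        {
            const float DistSq = FVector::DistSquared(WorldPoint, PointOnBody);
            if (DistSq < BestDistSq)
            {
                BestDistSq = DistSq;
                BestPoint = PointOnBody;
                OutClosestBone = Bone;
            }
        }
    }
    return BestPoint;
}
```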


For what it’s worth, we found that tracking of the wrist bone was the most consistent regardless of hand depth, so we ran tests against that bone location more often than the others.

Closing Thoughts

Hand tracking can be a really powerful, accessible addition to your game. For Myst, it took some work, but we worked ‘smart’ by tying it into our existing input systems so we wouldn’t need to make too many overarching changes to the game as a whole. Our goal was to create a user experience that was intuitive and comfortable, without building an entirely separate input system on the back end.

Meta’s branch of Unreal Engine comes with hand tracking support out of the box and can definitely be used by a team capable of making engine changes. That said, it needs some modifications before it recognizes useful whole-hand gestures. We’re really looking forward to seeing Meta’s hand tracking support in Unreal reach parity with what they offer in Unity. In the meantime, teams who are comfortable working with source-based projects in Unreal should find they have enough flexibility to make hand tracking fit their project.

– – — – –

We’re open to hearing what folks think about our process—and learning if there’s any interest in us providing a system like this in Unreal for folks to use and build upon. You can contact us at info@cyan.com for more information, or contact me (Hannah) on Twitter @hannahgamiel. Thanks for reading!


