Gearing up with 3-Space


YEI was kind enough to let me try out their 17-sensor 3-Space system, which they’ve been using in their Kickstarter and other demonstrations. They’ve produced a number of VR demos that work with the 3-Space setup, spanning both Unity and UDK, two popular game engines for Oculus Rift development.

Putting on all 17 sensors did take some time. It’s easy to see how the suit-style PrioVR layout is going to be much faster to equip.


I don’t know about all this talk from folks on the business end of gaming who say things like ‘gamers don’t want to wear stuff in order to play games.’ Pshh… after gearing up with the 3-Space system and the Oculus Rift, I felt like a badass from the future (though I’m sure I looked like a nerd).

Cube Demo in Unity

I got to step into YEI’s unnamed cube demo, built in Unity over the last few days, where randomly arranged red and green cubes float at you. The goal is to dodge the red cubes and touch the green cubes. It was fun, but also harder than it looks!

“I think I made it a little too hard,” Dan Morrison, R&D developer, told me as he explained that he wanted to make sure there was good reason for the player to move around.

Prior to trying the game for myself, I watched Morrison step around the virtual cubes.

“It’s a much more personal experience when you’re in there,” Yost told me as I watched.

And he was right. One objective is to not let the red cubes touch you. Not ‘you’ as in your player on the screen, as with most games today, but you as in you. That’s my body in there, and I really didn’t want it to touch those red cubes. I was avoiding them at all costs — they were in my personal space.


The sense of avatar embodiment greatly enhanced the level of immersion compared to what I’m used to with just the Oculus Rift and Razer Hydra.

I mentioned that it might be cool to turn the experience into a Matrix-like game where the cubes are bullets in slow motion and you have to dodge them. Morrison seemed smitten with the idea while Yost started thinking about “non-harmful ways to deliver pain” if the bullets were to strike the player — a statement that had the three of us laughing when we realized how sinister it sounded… I think we might see more development with this cube game in the future.

Positional tracking on the Oculus Rift, afforded by the 3-Space sensors, worked very well in the time that I spent in the demo. I was able to lean my whole body around and it felt very convincing, with no perceptible lag. Positional tracking is going to add a lot to the Rift. And props to Oculus for a great fit on their development kit: even while I was ducking, dodging, and weaving through boxes, the unit stayed snug to my face. The same cannot be said of the Sony HMZ-T1 that I used in the UDK demo (below), which required me to hold it securely on my face even if I needed to look down just a few degrees.

Oculus has said in the past that an IMU probably wouldn’t cut it for positional tracking on the Rift due to drift. However, given an entire skeletal model of sensors to work with, it seems an IMU-based system could work very well.
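To make the idea concrete, here’s a minimal sketch (my own illustration, not YEI’s or Oculus’ code) of how head position could be derived from body-worn IMUs: chain each segment’s reported orientation with an assumed segment length and walk the skeleton from the pelvis up. The segment names and lengths here are hypothetical.

```python
# Minimal sketch (not YEI's actual code): estimating head position from a chain
# of IMU orientations plus assumed segment lengths, via forward kinematics.
import numpy as np
from scipy.spatial.transform import Rotation as R

# Hypothetical skeletal chain from pelvis to head: (segment name, length in
# meters, rest-pose direction of the bone).
CHAIN = [
    ("pelvis_to_spine", 0.20, np.array([0.0, 1.0, 0.0])),
    ("spine_to_chest",  0.25, np.array([0.0, 1.0, 0.0])),
    ("chest_to_neck",   0.20, np.array([0.0, 1.0, 0.0])),
    ("neck_to_head",    0.15, np.array([0.0, 1.0, 0.0])),
]

def head_position(pelvis_pos, imu_orientations):
    """Accumulate each segment's world-space offset using the orientation
    reported by the IMU attached to that segment."""
    pos = np.array(pelvis_pos, dtype=float)
    for name, length, bone_dir in CHAIN:
        q = imu_orientations[name]          # world-frame quaternion (x, y, z, w)
        pos = pos + R.from_quat(q).apply(bone_dir) * length
    return pos

# Example: identity orientations put the head 0.8 m directly above the pelvis.
identity = {name: np.array([0.0, 0.0, 0.0, 1.0]) for name, _, _ in CHAIN}
print(head_position([0.0, 1.0, 0.0], identity))   # -> [0. 1.8 0.]
```

With orientations alone, any error accumulates along the chain, which is presumably why a dense sensor set helps keep the estimate stable.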

With regard to tracking performance, the 3-Space system was very usable, but there’s some tuning left to be done to really nail the experience. I did feel a bit disconnected at times, like my steps weren’t carrying me exactly where I thought they were; I suspect it was a combination of my avatar’s size not matching my own and walking kinematics that still need some work.


Walking in the game uses a kinematic model to actually plant your virtual foot on the virtual ground, then it tracks how far you step with your next foot, and so on. Morrison told me he was still coming to grips with Unity’s physics. Based on the experience I had in the UDK demo, it seems like physics tuning, not insufficient tracking, was the culprit. Yost told me that the PrioVR SDK, once complete, will handle most of that grunt work for developers, but they’ll also make raw sensor data available to those who might want it.
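For readers curious what that might look like in code, here’s a minimal sketch of foot-planting locomotion under my own assumptions (this is not the PrioVR SDK or the demo’s actual implementation): pin the planted foot in world space and shift the avatar root each frame so the skeleton moves relative to that anchor. Class and parameter names are hypothetical.

```python
# Minimal sketch (assumption, not the PrioVR SDK): foot-planting locomotion.
# The planted foot is pinned in world space; the avatar root is shifted each
# frame so the skeleton's motion is expressed relative to that anchor.
import numpy as np

class FootPlantLocomotion:
    def __init__(self):
        self.anchor_world = None      # world position of the planted foot
        self.planted = "left"         # which foot is currently planted

    def update(self, left_foot_local, right_foot_local, root_world):
        """Foot positions are relative to the avatar root (from the kinematic
        skeleton, y-up); returns the corrected world position of the root."""
        feet = {"left": np.asarray(left_foot_local, float),
                "right": np.asarray(right_foot_local, float)}

        # Plant whichever foot is lower; a real system would also check velocity.
        new_planted = min(feet, key=lambda f: feet[f][1])
        if new_planted != self.planted or self.anchor_world is None:
            self.planted = new_planted
            self.anchor_world = root_world + feet[new_planted]

        # Move the root so the planted foot stays at its anchor point.
        return self.anchor_world - feet[self.planted]
```

A real implementation would also gate the foot switch on velocity and ground contact, which is presumably part of the physics tuning Morrison mentioned.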

At one point during the demo, my feet got all out of whack. Morrison guessed that some interference in the floor (quite likely, given the studio space we were in) was the culprit. This caused issues with the aforementioned kinematic walking and made me drift toward the wall. With the press of a button, Morrison told the software to track only the 11 essential sensors rather than the full 17, and suddenly my legs were back and fully functional.

Yost tells me that a more robust calibration sequence will improve the experience. At the moment they’re using a simple T-pose calibration, where you stand with your arms out parallel to the ground. Calibrating with a few other poses would make things even more accurate. In the video you can see that touching my hands together in real life sometimes caused my virtual hands to cross. Without knowing how long each player’s limbs are, there’s no way to nail that. However, you can pose with your hands touching and calibrate, essentially telling the computer ‘this is what it looks like when my hands are touching’, and from then on it will know.
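As a rough sketch of how that kind of multi-pose calibration could work (my own guess at the approach, not YEI’s implementation), a T-pose yields a per-sensor orientation offset, and a hands-together pose provides enough information to correct the avatar’s arm length. Function and parameter names are hypothetical.

```python
# Minimal sketch (an assumption, not YEI's implementation): T-pose gives a
# per-sensor orientation offset; a "hands together" pose corrects arm length
# so the virtual hands meet when the real ones do.
import numpy as np
from scipy.spatial.transform import Rotation as R

def calibrate_offsets(tpose_readings, tpose_reference):
    """offset = reference * inverse(raw), so applying the offset to later raw
    readings yields the bone's true orientation."""
    return {bone: tpose_reference[bone] * tpose_readings[bone].inv()
            for bone in tpose_readings}

def calibrate_arm_scale(left_hand_pos, right_hand_pos, lateral_axis, arm_length):
    """With real hands touching, the signed lateral gap between the virtual
    hands reveals the arm-length error: a positive gap means the arms are
    modelled too short, crossed hands (negative gap) mean too long."""
    gap = np.dot(np.asarray(right_hand_pos) - np.asarray(left_hand_pos),
                 np.asarray(lateral_axis))
    # Each arm needs to grow (or shrink) by half the gap to close it.
    return (arm_length + gap / 2.0) / arm_length
```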


Even with a single pose for calibration, I had no issue turning, ducking, and crawling around on the ground inside the virtual space. As far as I could tell, the direction of my body never drifted either — I remained facing in the same direction in both the virtual and real worlds.

UDK Demo


The UDK demo I tried was also using the 3-Space sensor system, along with the Sony HMZ-T1 (they haven’t yet rigged it up with the Rift); it’s the same one you see in the PrioVR Kickstarter video where they’re kicking over boxes.

It felt even better than the Unity Cube demo — the avatar embodiment felt really cool because my avatar was wearing armor. My limbs felt fully like my own. Mark my words: once full body VR tracking systems like this are widespread, avatar customization is going to be a big deal.

There were boxes on my left and right to be knocked over, which was somehow lots of fun (probably because actually using your body to do something in the game is a significantly more engaging experience than the ‘press X to knock over boxes’ or ‘pull trigger to swing sword’ interactions we’re all used to). There wasn’t much else to the demo, but it got me very excited to see a full implementation of PrioVR in a proper game.




  • Andrés

    What are their post-Kickstarter plans? It looks like the current campaign won’t reach its goal.

    Any news about their plans for buttons/controllers?

    Thanks

    • Druss

The whole reason I hate the Sixense STEM and love the PrioVR (besides the far superior tracking capabilities) is that it doesn’t have buttons; buttons have no place in VR. That’s so obvious to me: you don’t press a button to drink a glass of milk, you just pick up the glass. Something like the Peregrine glove is much better suited for VR, but a better solution is certainly needed.

      • EdZ

Until we have fully haptic finger-tracking gloves, buttons are necessary. Vaguely waving your hands at an object is not a pleasant way to interact with it, and even non-haptic gloves are an exercise in frustration. Without tactile feedback it is VERY hard to manipulate objects with your fingers (try wearing a pair of leather gloves and tying your shoelaces, and that’s only reduced tactility).

        Personally, I wish I could afford to back both, but I cannot. At this point in time, the STEM is a more flexible solution for more tasks than the PrioVR. I won’t have full-body avatar immersion, but I will be able to track completely arbitrary points and objects & repeatably position things in world-coordinates rather than relative coordinates (allows for things like putting down a tracked gun prop in order to interact with an in-game object, then picking it up again).

  • druidsbane

    How’s the lag? I noticed in some videos there was choppiness, especially with the skeleton video at the end. Is this something that was due to it being a prototype? The Unity demo seemed much smoother overall. Great article as always.