Oculus Touch is out tomorrow and we finally get to test just how well the optical ‘Constellation’ tracking system handles itself, all the way from rug scale to beyond roomscale.
The term “roomscale” has only been in popular use in the VR industry for the last 18 months to 2 years, following the unveiling and subsequent demonstration of the HTC Vive, powered by Valve’s SteamVR architecture, its laser-based ‘Lighthouse’ tracking system, and tracked motion controllers. Up until then, although acknowledging they were working on tracked controllers of some sort, Oculus had publicly adopted a firm policy of ‘seated VR’ as the target for their first consumer hardware, initially to be driven by a standard gamepad. Post-Vive, however, it was clear the goalposts had moved as far as expectations for high-end VR experiences were concerned, and ever since we’ve been waiting for Oculus’ response to both hand presence and roomscale VR.
That response comes in the form of Oculus Touch, the company’s dedicated tracked motion controllers, powered by the same outside-in, IR LED-based ‘Constellation’ tracking system as the Rift VR headset. Unlike Valve’s Lighthouse, Oculus’ tracking solution is based on computer vision, with one or more camera sensors watching the play space and picking up the arrays of IR LEDs mounted on supported devices. This approach has the advantage of simplicity in a seated VR scenario: a single camera on the desk is all that’s needed to capture headset motion. But how does the same technology scale to enable a more free-roaming, room-sized VR experience?
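For readers curious about the general principle behind this style of outside-in optical tracking, the sketch below shows how a device pose could be recovered from the 2D positions of identified LED blobs in a camera image using a standard perspective-n-point solve (here via OpenCV). All LED positions, pixel coordinates, and camera parameters are invented for illustration; this is not Oculus’ actual pipeline, which among other things also fuses high-rate IMU data with the optical solution.

```python
import numpy as np
import cv2  # OpenCV

# Hypothetical 3D positions (metres) of a few IR LEDs on a tracked device,
# expressed in the device's own coordinate frame (the "constellation" model).
led_model_points = np.array([
    [ 0.03,  0.02, 0.00],
    [-0.03,  0.02, 0.00],
    [ 0.03, -0.02, 0.01],
    [-0.03, -0.02, 0.01],
    [ 0.00,  0.00, 0.02],
], dtype=np.float64)

# 2D pixel centroids of the same LEDs as detected (and identified) in one
# camera frame -- placeholder values for illustration only.
detected_blob_points = np.array([
    [652.1, 348.7],
    [601.4, 349.2],
    [653.0, 381.5],
    [600.8, 382.0],
    [626.5, 365.3],
], dtype=np.float64)

# Assumed pinhole intrinsics for the tracking camera (focal length and
# principal point in pixels); a real system would use calibrated values.
camera_matrix = np.array([
    [800.0,   0.0, 640.0],
    [  0.0, 800.0, 360.0],
    [  0.0,   0.0,   1.0],
])
dist_coeffs = np.zeros(5)  # assume negligible lens distortion

# Solve the Perspective-n-Point problem: recover the device's rotation and
# translation relative to the camera from the 2D-3D correspondences.
ok, rvec, tvec = cv2.solvePnP(
    led_model_points, detected_blob_points, camera_matrix, dist_coeffs,
    flags=cv2.SOLVEPNP_ITERATIVE,
)
if ok:
    rotation_matrix, _ = cv2.Rodrigues(rvec)
    print("Device position relative to camera (m):", tvec.ravel())
```

Once per-frame poses like this are available from each sensor, they can be combined and filtered over time; the practical question the video explores is how well that works as the tracked devices move further from the cameras and around a full room.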
Road to VR’s intrepid Executive Editor set out to find the limits of Oculus’ camera sensors and found some expected weaknesses and some perhaps unexpected strengths across different scenarios. Starting from the play space sizes Oculus recommends for 2 and 3 camera sensor configurations, Ben tests how far beyond those recommendations the system can reach and explores edge-case tracking.
Check out the full video at the top of this page and Ben’s full Touch review here, but key takeaways are:
- Front-facing (i.e. 180-degree only) room-scale experiences are handled surprisingly well by just 2 camera sensors in a ‘room-scale’ configuration (one sensor fewer than Oculus recommends)
- Sensor accuracy is hard to fool, even with high-speed human movement
- Some design compromises may have reduced tracking effectiveness – e.g. more surface area on each of Touch’s tracking ‘rings’ could yield much better tracking coverage and accuracy
- Adding extra cameras increases redundancy to combat sensor occlusion, but does not seem to greatly increase the potential size of the play space
Oculus Touch launches officially tomorrow worldwide.