See Also: Hands-on – Oculus Touch 2016 Prototype Brings Refinements to an Already Elegant Design

It was nearly a year ago, back at E3 2015, that Oculus founder Palmer Luckey told us that the company had plans to open up an API for third parties to tap into the Constellation tracking system. He was clearly enthusiastic about the ecosystem of peripherals that would be enabled by such a move.

“[Oculus Touch is] never going to be better than truly optimized VR input for every game. For example, racing games: it’s always going to be a steering wheel. For a sword fighting game, you’re going to have some type of sword controller,” Luckey said. “…I think you’re going to see people making peripherals that are specifically made for particular types of games, like whether they’re steering wheels, flight sticks, or swords, or gun controllers in VR.”

Valve too was bullish about their Lighthouse tracking system; more than a year ago, when the company had just revealed the HTC Vive, Valve head Gabe Newell said that they wanted to give the tracking tech away for all to use.

“So we’re gonna just give [Lighthouse tech] away. What we want is for that to be like USB. It’s not some special secret sauce,” Newell told Engadget. “It’s like everybody in the PC community will benefit if there’s this useful technology out there. So if you want to build it into your mice, or build it into your monitors, or your TVs, anybody can do it.”

The latest Lighthouse sensors
See Also: Valve Shows off Miniscule Lighthouse Sensors

At one point, Valve engineers told us they envisioned one day releasing a ‘puck’—a small standalone tracker possibly about the size of a casino chip—which could be affixed to track any arbitrary object with Lighthouse. In the meantime, eager folks are not content to wait and have instead been taping the HTC Vive controllers to all manner of objects to emulate direct integration of the tracking technology.


[gfycat data_id=”RedUnfitCorydorascatfish” data_autoplay=true data_controls=false]

Developer Stress Level Zero attached one to a dog (who was virtually represented as an alien) with quite hilarious results. The same studio attached the controller to a huge sword prop for a more realistic way to play Ninja Trainer (now called Zen Blade).


Yet another developer taped a controller to a pair of sandals to see what it would be like to have feet tracked in VR for a soccer game.

Valve’s Chet Faliszek, who has been closely involved with the company’s VR projects, recently tweeted a photo of a Vive controller which had been haphazardly hacked into a gun-shaped peripheral and teased about “lighthouse dev kits.”

It’s not entirely clear what Faliszek means by the tweet, but given the context it seems he could be talking about a development kit for a standalone Lighthouse tracking module.

Given that it could mean increased developer interest, I’m quite surprised that neither company has been vying to be first and best when it comes to opening up their tracking systems to third parties. Granted, both Oculus and Valve/HTC have been busy dealing with some not-so-smooth initial launches, but it’s clear that there’s demand from developers and companies to track more than just heads and hands in VR.




  • Sam Illingworth

    I think the foot tracking thing can’t come soon enough. It will almost give us full body tracking, letting us play games where we have to avoid incoming objects by jumping and ducking and weaving, etc.

    I wonder which system will be easier to extend. On the one hand, Lighthouse involves building advanced sensors into your equipment and doing the processing in the equipment as well, whereas Constellation just means sticking some little lights on the device; but then you need to update the user’s Constellation system itself to understand how to track the new item, which could also add to the work the computer has to do.

    • Graham J ⭐️

      I’ve been imagining leg bands too. With that and the controllers, I think IK could fill in the rest well enough. The thing is, you don’t actually have legs in most VR apps today, but in the future, when seeing others’ avatars is more common, it will become more important.

    • BlackMage

      Unfortunately Constellation isn’t as simple as sticking some lights on. There’s an article somewhere explaining how it all works, but the devices need to chatter with the camera to pulse the lights at a specific frequency so that each device being tracked by Constellation can be uniquely identified. I think both require the same effort from a 3rd-party perspective: buy the things to stick to your device from Oculus/Valve, wire them together so that they can send and receive data to and from the PC, plug it into the API, and you get positional data.

      From the Oculus/Valve perspective lighthouse is overall easier as there isn’t really an increase of computational complexity per additional tracked device. There is no theoretical ceiling of simultaneous devices as each one only cares where it is in space and reports that data. Whereas each additional tracked device for Constellation becomes computationally harder and harder for one or two cameras to track on their own. They’ve already had to add an additional camera just to handle 3 devices reliably after all. Constellation also requires gyros to fall back on for rotational data whereas lighthouse is currently using no fallback mechanism at all.

      • psuedonymous

        “From the Oculus/Valve perspective lighthouse is overall easier as there isn’t really an increase of computational complexity per additional tracked device. There is no theoretical ceiling of simultaneous devices as each one only cares where it is in space and reports that data. Whereas each additional tracked device for Constellation becomes computationally harder and harder for one or two cameras to track on their own.”

        This is incorrect. The major difference between Lighthouse and Constellation is how the array of marker coordinates is populated. The model-fit process and sensor fusion (the actually computationally complex parts) are identical, and both will scale up in required CPU power with more tracked objects. Constellation will scale in computational power required with the number of cameras, but the CPU power required to do blob tracking on a camera is utterly minuscule.

        As for implementation complexity: Lighthouse is pretty complex. Every device needs a precision timer and all devices need to be genlocked, every sensor needs an analog amplification frontend, and every sensor needs an ADC. Constellation also requires genlocking, but the timer granularity is a LOT looser (frame capture start, could even be implemented as a dumb radio pulse), and all the onboard logic required is to step through a set sequence once per sync pulse. You could literally use a shift register to implement it! (A rough sketch of that sequencing idea follows this thread.)

        • BlackMage

          I meant software complexity. All the complicated bits of the sensors and onboard parts both companies have already figured out, and they’ll probably just start selling them like that little chip in the article.

          Adding additional devices that an API can track seems pretty much plug and play with lighthouse: the device communicates enough data to work out its position by itself, and the data isn’t polluted by other devices. Valve don’t really have to do much to support 3rd parties from a software perspective in this case; just give an API that reports the positional data in the same coordinates as every other device in the room.

          Oculus need to keep their cameras’ recognition software up to snuff to support every third-party device, and I don’t think it’ll fly if you suddenly need three cameras to track one or two more objects. Any change in the way the camera works to improve support for their own devices would suddenly need to be validated by third parties.
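
To make the device-identification point in the thread above a bit more concrete, here is a minimal Python sketch of the kind of scheme being described: each LED steps through a fixed on/off pattern, one step per camera sync pulse, and the host matches the observed blink history of each blob back to a device. Everything here (pattern lengths, IDs, matching logic) is invented for illustration; it is not Oculus’ actual protocol.

```python
# Illustrative sketch only -- not Oculus' actual protocol. It shows the general
# idea discussed above: each LED steps through a fixed on/off pattern, one bit
# per camera sync pulse, so the host can match observed blink sequences in the
# video frames back to individual LED/device IDs.

# Hypothetical 10-bit ID patterns; a real system would choose codes with good
# separation (e.g. still distinguishable after a missed frame or two).
LED_PATTERNS = {
    "headset_led_00": [1, 0, 1, 1, 0, 1, 0, 0, 1, 1],
    "headset_led_01": [1, 1, 0, 0, 1, 0, 1, 1, 0, 1],
    "third_party_led": [0, 1, 1, 0, 1, 1, 0, 1, 0, 0],
}

class BlinkingLed:
    """Device-side logic: step through the pattern once per sync pulse
    (the part simple enough to implement with a shift register)."""
    def __init__(self, pattern):
        self.pattern = pattern
        self.index = 0

    def on_sync_pulse(self):
        bit = self.pattern[self.index]
        self.index = (self.index + 1) % len(self.pattern)
        return bit  # 1 = LED bright this exposure, 0 = dim/off

def identify(observed_bits):
    """Host-side logic: match a blob's observed on/off history against the
    known patterns, tolerating any rotation of the sequence."""
    n = len(observed_bits)
    for led_id, pattern in LED_PATTERNS.items():
        doubled = pattern + pattern
        if any(doubled[i:i + n] == observed_bits for i in range(len(pattern))):
            return led_id
    return None

# Simulate a few frames of one LED and identify it from the camera's view.
led = BlinkingLed(LED_PATTERNS["third_party_led"])
seen = [led.on_sync_pulse() for _ in range(10)]
print(identify(seen))  # -> "third_party_led"
```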

    • While it won’t help with kicking or doing your kung-fu moves, it is worth mentioning that jumping, ducking, crawling, and other motions can all be faked with inverse kinetics. After all, your body is a mechanism that needs to support itself against gravity. If you know where the head is, you can approximate where the rest of the body needs to be to support it. Add in the location of the hands and, unless you take off and drop your gear, a body position isn’t too hard to figure out. (A rough sketch of this idea follows this thread.)

      • Sam Illingworth

        Yeah, but you can pull your legs up when you jump; it won’t know how well you’ve done that without tracking them.

      • bennymann

        Inverse kinetics?
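
For what it’s worth, here is a toy Python sketch of the “fake the body from the head and hands” idea discussed in the thread above. It is a crude heuristic rather than a real inverse kinematics solver, and the tracked head pose is assumed to come from whatever tracking API is available; it mainly illustrates why jumping and leg-lifting are hard to get right without actual leg tracking.

```python
# Toy sketch of the "fake the body from head + hands" idea discussed above.
# It is not a real inverse-kinematics solver; it just shows how far a few
# simple heuristics can get you when all you have is the tracked head pose.

from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float  # up axis, metres
    z: float

def estimate_body(head: Vec3, standing_head_height: float):
    """Guess a pelvis position and a coarse pose label from the head alone."""
    ratio = head.y / standing_head_height
    if ratio > 1.05:
        pose = "jumping"       # head above calibrated standing height
    elif ratio < 0.8:
        pose = "crouching"     # head well below standing height
    else:
        pose = "standing"
    # Place the pelvis a fixed fraction of the current head height below the
    # head, directly underneath it -- fine while the user is roughly upright,
    # wrong the moment they lean, kick, or pull their legs up (hence the
    # desire for real leg/foot trackers).
    pelvis = Vec3(head.x, head.y * 0.58, head.z)
    return pose, pelvis

# Example: a 1.65 m standing head height, head currently at 1.2 m.
print(estimate_body(Vec3(0.0, 1.2, 0.0), standing_head_height=1.65))
# -> ('crouching', pelvis placed at roughly 0.7 m)
```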

  • Graham J ⭐️

    I don’t see why Lighthouse can’t be used already, since how it works is understood, the hardware is relatively simple to create (IR photodiodes), the processing necessary to calculate position is fairly straightforward (a rough sketch of that calculation follows this thread), and no interaction with HTC hardware is necessary.

    Rift tracking will be more difficult because Oculus will need to provide access to the software driving the cameras, and you’ll probably need a device registration process so it can tell devices apart. And they’ll likely lock down that whole process (like their app store and signing system), making it harder for indies to get into.

    • crim3

      Lighthouse peripherals must communicate their position and orientation to the headset via Bluetooth.
      Edit: Now that I think about it, a custom peripheral could talk directly to the computer, so you are right.
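
As a rough illustration of why the Lighthouse position maths is considered tractable, the Python sketch below converts a “sweep hit this long after the sync flash” timestamp into a bearing angle and intersects bearings from two base stations in 2D. The 60 Hz rotor rate, the single horizontal sweep, and the perfect timestamps are simplifying assumptions; the real pipeline uses multiple sensors per device, vertical sweeps, calibration, and IMU fusion.

```python
# Rough sketch of the timing maths behind Lighthouse-style tracking, under
# simplified assumptions (a single horizontal sweep per base station, 2D
# geometry, perfect timestamps). This is just the core
# "time since sync flash -> angle -> position" idea, not Valve's implementation.

import math

SWEEP_PERIOD_S = 1.0 / 60.0   # assumed 60 Hz rotor: one sweep every ~16.7 ms

def sweep_time_to_angle(t_sync: float, t_hit: float) -> float:
    """Convert 'laser hit the photodiode this long after the sync flash'
    into a bearing angle, in radians, measured from the rotor's zero mark."""
    return 2.0 * math.pi * ((t_hit - t_sync) / SWEEP_PERIOD_S)

def intersect_bearings(p1, a1, p2, a2):
    """Intersect two bearing rays (from base stations at p1 and p2, pointing
    at angles a1 and a2) to get the sensor position in the 2D plane."""
    d1 = (math.cos(a1), math.sin(a1))
    d2 = (math.cos(a2), math.sin(a2))
    # Solve p1 + t*d1 == p2 + s*d2 for t using a 2x2 determinant.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Example with made-up numbers: two stations 4 m apart, each reporting the
# sweep-hit time for the same photodiode.
a1 = sweep_time_to_angle(t_sync=0.0, t_hit=0.00208)   # ~45 degrees
a2 = sweep_time_to_angle(t_sync=0.0, t_hit=0.00625)   # ~135 degrees
print(intersect_bearings((0.0, 0.0), a1, (4.0, 0.0), a2))
# -> roughly (2.0, 2.0): the sensor sits ~2 m out, midway between the stations
```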

  • NeoTokyo_N0r1

    Great article, nicely written. I hope it applies some pressure on them to announce something sooner :) I am quite sure that Valve is prepping the Lighthouse dev kit; it’s just that we want it in our hands like yesterday, besides them staying true to their vision of making it widely available. “Like USB” I believe was the phrase.

  • veritas

    At the HTC Ecosystem Summit in Beijing, HTC CEO Cher Wang recently sort of implied that they will ship at least 1 million units of the Vive (in 2016?) and come out with a “killer product”. This killer product could easily be a Lighthouse tracking module or collar – that would be my guess.

    • I would bet on positional, maybe even hand, tracking for smartphone VR. While mobile VR lacks the vast system resources of a $1,500 gaming machine, VR through the GearVR is *PRETTY* good; it’s just the lack of positional tracking that tarnishes it. If their device weren’t married to a particular cellphone brand, like their own, they could even take Galaxy phones away from GearVR. Bet that would tick Samsung off!

      • JoeD

        Why would HTC care about positional or hand tracking for phones? They are focused on the high-end VR system.

        • CoD511

          The only reason we have displays suitable for VR at all is phones. Samsung has already demonstrated a 4K panel at 5.5″, and if that’s not high end… (yes, it’s more pixels than the HTC Vive: roughly 2.6 million for the Vive versus over 8 million for the Samsung 4K panel.)

  • Brett Wagner

    Chet Faliszek at Valve just tweeted this the other day:

    https://twitter.com/chetfaliszek/status/726526658925064192

    So it’s in process; meanwhile, mum’s the word from Palmer & Co.

  • Smokey_the_Bear

    I’m looking forward to VR accessories; I’ll take anything that makes it even more immersive! :)

  • RavnosCC

    I would love to wrap a bunch of Lighthouse sensors around my office chair, so I can take a rest every now and again and stay immersed. Or alternatively use it as a cockpit seat in flight sims/driving games, all without having to leave VR.

  • James Friedman

    I just don’t understand why Oculus hasn’t released the Touch controllers yet. Are they broken? Are they trying to make them even better? It’s just very fishy to me, seeing as there are plenty of games that would support them.

  • Theo M

    To be honest, it would be much better for Oculus to release an API because it would be much easier for developers to use. For example, to make a peripheral for the Lighthouse system, a developer would have to put a lot of research into figuring out how to use the sensors, whereas with the Constellation system developers could just strap IR LEDs to something and do the rest in code. Also, this would be a much cheaper way to have room-scale VR, because instead of paying $100 more for a Vive you could just get a Rift and use an IR wristband or something.
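
Since the comment above imagines what a third-party tracking API might let developers do, here is a purely hypothetical Python sketch of the shape such an interface could take: register a device by describing its marker layout, then poll its pose in the same room coordinates as the headset and controllers. Neither Oculus nor Valve had published such an API at the time, so every name below is invented.

```python
# Purely hypothetical sketch -- neither Oculus nor Valve had published a
# third-party tracking API at the time of this article, so every name here is
# invented. It only illustrates the shape such an API might take: register a
# device by describing its marker layout, then poll its pose in the same
# room-level coordinate frame the headset and controllers already use.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Pose:
    position: Tuple[float, float, float]             # metres, room coordinates
    orientation: Tuple[float, float, float, float]   # unit quaternion (x, y, z, w)

class HypotheticalTrackingApi:
    """Stand-in for a vendor SDK; the method names are made up."""

    def register_device(self, name: str,
                        marker_layout: List[Tuple[float, float, float]]) -> int:
        """Tell the runtime where the LEDs/photodiodes sit on the rigid body
        (in the device's own frame) and get back a handle for polling."""
        raise NotImplementedError("a vendor SDK would implement this")

    def get_pose(self, device_handle: int) -> Pose:
        """Latest pose of the registered device in room coordinates."""
        raise NotImplementedError("a vendor SDK would implement this")

# Usage a peripheral maker might hope for: describe a gun stock with four
# markers, then read its pose every frame alongside the standard controllers.
# api = HypotheticalTrackingApi()
# handle = api.register_device("gun_stock", [(0.0, 0.0, 0.0), (0.10, 0.0, 0.0),
#                                            (0.0, 0.05, 0.0), (0.0, 0.0, 0.20)])
# pose = api.get_pose(handle)
```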