Tracking is the foundation of great virtual reality. Knowing the position and orientation of the user's head is essential to moving the virtual world convincingly around them as they move through it. Oculus and Valve/HTC have the leading tracking systems (dubbed 'Constellation' and 'Lighthouse', respectively), but soon a new entrant could join the VR tracking arena.

Into the Workshop

It's an unusually rainy day in the hills of Livermore, CA, as I stroll down a row of small connected office buildings. I pass door after door with company logos prominently featured in the windows, but it isn't until I find the door identified by only a small oval sticker that I stop. Collectors hoarding VR memorabilia 50 years from now might recognize the sticker—adorned with the letters 'VR' in the center and tiny 'Oculus' text at the bottom—as one given out by the company in its earliest days. Fittingly, behind the door is the personal workshop of one of the company's earliest employees.

Jack McCauley served as Oculus' VP of Engineering up through the Rift DK2. He was among four employees featured in the company's 2012 Kickstarter video (not counting John Carmack and Michael Abrash, who would both join years later). Much of the footage from the Kickstarter video was actually shot in McCauley's workshop, where I am now standing. By some definitions (including his own), McCauley is a co-founder of Oculus due to his contributions and status as one of the earliest hires, though as far as Oculus itself is concerned, Palmer Luckey is the company's one and only "Founder" (with a capital F). As a seasoned design engineer who played an instrumental role in the creation of the Guitar Hero peripherals, McCauley brought hardware design experience to the young company, which was about to embark on designing, manufacturing, and delivering thousands of Rift DK1 headsets to eager backers of the crowdfunding campaign.

McCauley takes me through a small lobby space, adorned with Guitar Hero and Oculus-related placards of recognition, into what he calls 'McCauley Labs.' "Lab," I suppose, because what goes on inside is not just crafting but much experimentation, and "McCauley Labs" (in the form of a business title) perhaps because he employs several staff members who have complementary expertise and explore projects he doesn't quite have time for.

[gallery type="rectangular" ids="43802,43811"]

The workshop is a veritable wonderland for the hacker/maker type, littered with industrial-grade equipment that McCauley uses for work and hobbies alike. As I make my way to the back of the space, I see that the workshop's walls are covered with graffiti-style art featuring characters and logos from video games, including a prominent depiction of a Lola T70 sports car.

[gallery type="square" ids="43803,43801"]

McCauley positions himself at the edge of a large open space near a garage door in the back of the workshop and asks me to step aside. As I clear the area, a car elevator—the kind you would find in a parking garage—begins to lower from overhead. McCauley had the lift installed so that he could store his vehicles on the second floor of his office. As the lift reaches the ground, I see the project that's currently occupying much of his time: a half-finished Lola T70, the same one from the wall. McCauley is building his own, mostly from the ground up, and he plans to race it when he's done. He points out that the chassis has the same geometry as the original, but this one is TIG welded instead of riveted.
It's clear at this point that McCauley is very hardware-oriented, but a tinge of tech begins to shine through as he explains his plan to build the steering wheel to accommodate an Android tablet, which will talk to the car's engine via Bluetooth and display all its vitals in one central location. When I ask if he knows yet which app he'll use for the purpose, he tells me he plans to write his own.

What I Came to See

At this point McCauley returns the lift to the second floor and we head back to the middle of the workshop. Amidst huge toolboxes and complex machinery sits a small 3D-printed white enclosure, attached to a wheeled work tray by a GoPro mount. This is what I came to see. The face of the enclosure has about the surface area of a square box of tissues, but is only about a quarter as deep. The back is seemingly missing a rear plate, giving a clear view of an array of circuit boards and wires.

Like Valve's Lighthouse system, this is a laser-based tracking system. But unlike Lighthouse—which sweeps the room with lasers regardless of what's in the area—this one actively seeks its intended target.

The MEMS Tracking System (or MTS, as I'll call it for ease of reference) steers its laser in a fundamentally different way than Lighthouse. Lighthouse uses lenses to stretch the laser's landing area from a point into a line, then sweeps those lines around a space by mounting the lenses on precisely spinning motors. MTS, on the other hand, uses a tiny mirror (which tilts but doesn't spin) to point the laser in any single direction.

When initialized, MTS begins scanning a cone-shaped area in front of it in a grid-like pattern, searching for its target. The target it's looking for is a point of high reflectivity, measured by the amount of laser light returning to the origin point. Of course, the laser could stop on any slightly reflective surface, so the target needs to be much more reflective than anything else in the environment, with a minimum detection threshold set so that the laser continues scanning until a sufficiently bright reflection is found. A retroreflective marker, placed on a headset or motion controllers, serves as that sufficiently bright object, ensuring that much of the laser's energy is reflected straight back to the receiver.

[caption id="attachment_43807" align="aligncenter" width="681"] The MTS laser fixated on a static marker attached to a Gear VR headset.[/caption]

For a proof of concept, MTS is impressive. When McCauley first demonstrated the system, a small spherical marker attached to a Gear VR headset sat about 5 feet away. He turned on the MTS basestation and I watched as the laser slowly indexed the scene looking for its target. It started, quite logically, at the top left and ran horizontally until reaching its right-most limit, then returned to the left, dropped down slightly, and continued from there, line by line. When it reached the marker on the headset, it stopped. The initial scan took 4 or 5 seconds.

At this point I figured MTS was a neat concept, but that much work remained to reduce that 5-second cycle to mere milliseconds so the tracking would be fast enough for practical use. It wasn't until McCauley picked up the headset and started moving it around, the laser staying continuously fixed on the marker, that I realized the initial scan was probably slowed down for human benefit, and that MTS was already capable of high-speed tracking.
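To make the acquire-then-track behavior concrete, here's a minimal sketch of the logic described above, written in Python. To be clear, this is my own illustration and not McCauley's code: the function names, scan resolution, and reflectivity threshold are all assumptions, and the photodetector read is a placeholder for the analog hardware.

```python
# Hypothetical sketch of the MTS acquisition scan: raster the mirror
# through a grid of (x, y) tilt angles until the photodetector reading
# crosses a reflectivity threshold, then hand off to closed-loop tracking.
# All names, ranges, and thresholds here are illustrative assumptions.

import numpy as np

FOV_DEG = 30.0            # half-angle of the scan cone (assumed)
GRID_STEPS = 200          # scan resolution per axis (assumed)
REFLECT_THRESHOLD = 0.8   # minimum normalized return to count as the marker

def read_photodetector(x_deg, y_deg):
    """Placeholder for the return-intensity measurement at a given mirror
    tilt. A real system would drive the mirror's DACs and sample an ADC."""
    raise NotImplementedError

def acquisition_scan():
    """Raster-scan the cone the way the demo did: start at the top left,
    sweep each row left to right, drop down a line, and repeat."""
    for y in np.linspace(FOV_DEG, -FOV_DEG, GRID_STEPS):       # top to bottom
        for x in np.linspace(-FOV_DEG, FOV_DEG, GRID_STEPS):   # left to right
            if read_photodetector(x, y) >= REFLECT_THRESHOLD:
                return (x, y)   # marker found; switch to tracking mode
    return None                 # nothing bright enough in view; scan again

def track_step(x, y, step=0.05):
    """One closed-loop update: nudge the mirror toward whichever neighboring
    direction returns the brightest reflection, keeping the laser centered
    on the marker as it moves."""
    candidates = [(x + dx, y + dy) for dx in (-step, 0, step)
                                   for dy in (-step, 0, step)]
    return max(candidates, key=lambda p: read_photodetector(*p))
```

The real control loop is presumably analog and far faster, but the structure (a coarse search, then greedy re-centering on peak reflectivity) matches what the demo appeared to be doing.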
https://www.youtube.com/watch?v=cR8y4wG8Ipg

Once the marker is found, the mirror continuously aims the laser at the object as it moves, constantly seeking the point of most intense reflectivity. Assuming this can be achieved robustly, tracking the object's position on an XY coordinate plane is as easy as reading the angles of the mirror that's aiming the laser. With one additional marker on the tracked object, or an additional MTS basestation, you have all the angles needed to triangulate the object's XYZ position.
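Since the basestation knows exactly which way its mirror is pointing, the position math itself is simple. Below is a hedged sketch of the two-basestation case: each station contributes only the pair of mirror angles at which it's locked on, and the marker's XYZ position falls out of where the two rays (nearly) intersect. The station placement, coordinate frames, and names are my own illustrative assumptions, and a one-time calibration of the stations' relative pose is taken for granted.

```python
# Hedged sketch of triangulation from two MTS-style basestations. Each
# station reports only its locked-on mirror angles; with known station
# poses, two angle pairs determine the marker's position in 3D.

import numpy as np

def ray_direction(azimuth_rad, elevation_rad):
    """Unit vector for a laser pointed at the given mirror angles
    (station-local frame: +Z forward, +X right, +Y up)."""
    return np.array([
        np.cos(elevation_rad) * np.sin(azimuth_rad),
        np.sin(elevation_rad),
        np.cos(elevation_rad) * np.cos(azimuth_rad),
    ])

def triangulate(p1, d1, p2, d2):
    """Closest point between rays p1 + t1*d1 and p2 + t2*d2. With perfect
    angles the rays intersect at the marker; with noise, take the midpoint
    of the shortest segment between them."""
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b          # ~0 means the rays are nearly parallel
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    return ((p1 + t1 * d1) + (p2 + t2 * d2)) / 2

# Example: stations 2 m apart on the X axis, both facing +Z (made-up angles).
p1, p2 = np.array([0.0, 0.0, 0.0]), np.array([2.0, 0.0, 0.0])
marker_xyz = triangulate(p1, ray_direction(0.35, 0.10),
                         p2, ray_direction(-0.32, 0.10))
```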
McCauley says MTS is a 'vector' system because control of the laser is done by analog means. The laser isn't stuck to a quantized grid of possible positions; it can be pointed essentially anywhere within its field of view, limited only by how precisely you can apply voltage (which corresponds to the tilt of the mirror). This means it can be highly precise, even at great distance.

[gfycat data_id="EverlastingOptimalBighornedsheep" data_autoplay=true data_controls=false]

A look at a tiny MEMS mirror in motion, less than 1mm across.

McCauley showed me the system locking onto a tracking point and maintaining it some 20 feet from the basestation. He told me the laser can "easily go 60 feet and track" with the gain turned up (though he did mention it might not have been eye-safe at that amplitude!).

He talks up the strengths of the system compared to a camera-based approach. "I can't emphasize enough the computational simplicity and low cost of my system. No frame buffers, large memory arrays, USB 10.0 cables or anything like that," says McCauley. "Just some control theory, a simple processor and some electronics. It's also much, much faster than a camera will ever be. It is clearly simpler with much greater range and bandwidth."

[caption id="attachment_43810" align="aligncenter" width="681"] McCauley gives a refresher on the principles of Lighthouse.[/caption]

McCauley is careful to point out that much of what comprises MTS is off-the-shelf parts and algorithms pioneered for other purposes. The MEMS mirror is a complex manufacturing task which he says requires a chip foundry to create, though such mirrors are already in production for a range of applications and could be cheap at scale. "That little mirror can be flicked into position very quickly. It has little mass and it has a powerful actuator. The entire thing is machined onto a single piece of silicon."

He calls the tracking algorithms, responsible for identifying and following the reflective markers, "well known in the art" and says they were pioneered by professor Kris Pister of UC Berkeley. "Why nobody [applied this tech] for VR/MOCAP, I do not know. Perhaps nobody thought of it recently but it's 16 years old and very established. They use these mirrors for optical switches in fiber optic bundles," McCauley told me. "Sometimes people get stuck in a mode of thinking about a problem. If you learned only to work with cameras then you will use a camera. I'm not a camera guy so I would not consider using a camera so I have to find some other way of doing it."

There's more that can be done with MTS beyond what was demonstrated. For one, the laser that's tracking the object can be modulated to transmit data to the headset. That means MTS could work (like Lighthouse does now) as a self-contained 'dumb' system, which doesn't need to be connected to a host PC or to a headset. It could simply broadcast the tracking data through the laser, to be received by anything on the other end, whether a VR headset tethered to a PC or a mobile headset. In this scenario, the MTS basestation doesn't need any knowledge of the object it's tracking; it just needs to aim well and shoot the right data at it.

McCauley also supposes the system could achieve full 6DOF tracking from a single marker when combined with data from an IMU in the tracked object. He admits he isn't 100% sure on the math for this and says he has someone working on it. My gut tells me it might be technically possible, but would probably lack sufficient correction for IMU drift to be useful. It's a seemingly moot point anyway, as additional markers are simple to add.

Vector Not Raster

Before the MTS demonstration, McCauley explained why he thinks this cameraless approach is important. Primarily, it's about range and cost. McCauley said that while he was at Oculus and the company was working on its first camera-based system for the DK2, he quickly picked up on the range problem. The company had talked for a long time about room-scale capability, and McCauley didn't see the camera approach as scalable to those distances.

He explained that the range of a camera-based approach is limited by the image sensor, which is raster-based. The Rift DK2 camera has a resolution of 752×480. The headset of a user sitting just a few feet away can only be seen by a small portion of those pixels (the headset takes up only a fraction of the total pixels that comprise the scene). As the user gets further away, the headset is represented by fewer and fewer pixels, which means the computer has much less data to work with, McCauley says.

You can think of it like this: if at 8 feet from the camera the headset takes up only 94×60 of the 752×480 sensor, it's essentially like trying to track the headset up close with a 94×60 pixel camera, with the headset filling its entire field of view. The further away you move the headset, the lower resolution your camera becomes (in a sense); there's no effective means of zooming the camera in when the headset is at range so that it can use more of its image sensor.

Several tricks have been devised to counter this reduction in available pixels at range, like dynamically boosting LED brightness to create a larger light source for the camera to spot, using the flashing of the LEDs to glean additional information about the tracked object, and dynamically adjusting the camera's exposure. At a certain point, however, the camera's resolution becomes the fundamental range-limiting factor.

[caption id="attachment_10497" align="aligncenter" width="680"] The Oculus Rift DK2 and Positional Tracking Camera[/caption]

The obvious fix is to increase the resolution of the image sensor, but that racks up cost quickly, and USB bandwidth becomes a bottleneck, McCauley says. So he opted for a vector-based approach, one which isn't stuck with a set resolution, meaning that, in theory, it could track with equal precision at 5 feet or 50 feet.

McCauley says that Kris Pister, the professor who pioneered the tracking algorithms used in MTS, has used a similar system to track a drone in the air more than a mile away (though I would guess that at that range we're far removed from the realm of 'lasers you can legally point at a person').
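To put rough numbers on the raster-vs-vector argument, here's a quick back-of-the-envelope calculation using the DK2 figures quoted above. The frame rate, bit depth, and MTS angle-update rate are my own assumed values for illustration, not measurements.

```python
# Back-of-the-envelope math behind the range argument, using the
# 752x480 sensor and the 94x60 headset footprint at 8 feet quoted above.

SENSOR_W, SENSOR_H = 752, 480          # DK2 tracking camera resolution
HEADSET_W_8FT, HEADSET_H_8FT = 94, 60  # headset footprint on sensor at 8 ft

def headset_pixels(distance_ft):
    """The headset's apparent size shrinks roughly linearly with distance
    per axis, so its usable pixel count falls off with distance squared."""
    scale = 8.0 / distance_ft
    return HEADSET_W_8FT * scale, HEADSET_H_8FT * scale

for d in (8, 16, 32):
    w, h = headset_pixels(d)
    print(f"{d:2d} ft: ~{w:.0f} x {h:.0f} px of headset on the sensor")
# -> 8 ft: ~94 x 60, 16 ft: ~47 x 30, 32 ft: ~24 x 15

# Rough bandwidth comparison (assumed: 8-bit mono frames at 60 Hz vs.
# two 16-bit mirror angles streamed at 1 kHz):
camera_bps = SENSOR_W * SENSOR_H * 8 * 60
mts_bps = 2 * 16 * 1000
print(f"camera: ~{camera_bps / 1e6:.0f} Mbps, MTS angles: ~{mts_bps / 1e3:.0f} kbps")
```

Under those assumptions, the camera stream runs to roughly 173 Mbps while the angle stream is on the order of tens of kilobits, and the mirror's pointing resolution doesn't degrade with distance the way the headset's pixel footprint does.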
Because MTS only has to stream the angles of the laser, it's a very low-bandwidth solution compared to sending and processing a high resolution image at 60Hz or more, says McCauley.

Beyond Proof of Concept

The system isn't perfect. There were plenty of times when I saw it lose tracking, and it isn't integrated with any apps at this point, so I wasn't able to actually look into a headset and see how precise the tracking was. But McCauley's goal is only to demonstrate the concept, and it appears he's well on the way. There's still plenty of room for optimization to get the system working in tip-top shape.

Ultimately though, he doesn't intend to be the purveyor of MTS.

"I'm gonna let someone else [commercialize it]. What I'm gonna do is put the system together to let someone else try to get this to work. I can get the components... the companies on board to provide the hardware to build the thing and get it debugging in some rudimentary form," says McCauley. "But to get it actually integrated with an application? I don't ever intend to do that. I'm just going to make this thing to prove it can be done. That's the only interest I have."

When I press him on this, he says he has no interest in spinning up a company around the technology. He seems happy to be taking a break after Oculus, and has plenty of work left to do on his Lola T70. But it doesn't sound like he's treating this as an academic endeavor either, where he'll simply publish his findings for anyone to use. Instead, McCauley is considering looking within his network to find the right partners to make MTS a reality.

"I have access to all the foundries and stuff and the silicon which is high value. And I have enough friends that if I say 'that thing is gonna go' or 'we're gonna do this', they'll be on board," he tells me. "If you're at a small startup somewhere—even a medium sized startup—you'll have a tough time getting people to [take the risk on you to get this built]. All the engineering that goes into making this is an enormous expense, but it's already kind of done [referring to the foundries that craft the MEMS devices]... to get those kinds of resources is very hard to do for a small company but I'm pretty well connected."