Tony Bowren recently shared on YouTube his rather ingenious solution to the Oculus Rift Development Kit's lack of positional tracking: the Razer Hydra. He talks to us about the project, a little about himself, and walks us through the code changes needed to achieve Oculus Rift positional tracking with the Razer Hydra yourself. If you'd like to skip to the code walkthrough, head straight to page 2.

The Oculus Rift Dev Kit and Its Greatest Shortcoming

What's missing from the Oculus Rift developer kit besides a high resolution display? Positional tracking, or the ability to detect the position of a user in physical space (we went into more detail a little while back if you're interested).

Tony Bowren caused a stir online when he recently posted a video demonstrating his clever solution to this issue. That solution: the Razer Hydra. Cunningly strapped to the back of the HMD, one of the Hydra's controller units acts as a positional sensor, allowing Tony to feed that data into a customised version of the Oculus Rift Tuscany demo, part of the Oculus SDK. One of the stand-out moments of the original video (see above) is when Tony's son demonstrates leaning out of one of the windows and looking around, something not currently possible with the Rift dev kit alone.

We reached out to Tony, eager to learn more about his approach to this issue and to find out a little more about him. Happily, he not only agreed to an interview but also to share his annotated code with us (see the code walkthrough on the second page of this article).

Road to VR: Can you tell us a bit about yourself?

Tony: My education is a Mechanical Engineering (Robotics Controls) degree from UCI, but I quickly went into gaming instead of engineering. My first graphics job was to make a 3D intro for Interplay Productions back in 1994-ish. I worked at Interplay and Acclaim before going into commercial and film work. I was an FX artist on Warner Bros' Osmosis Jones before coming back to games again. October will mark 10 years with NCsoft, 2 of those working on Guild Wars cinematics and the other 8 making WildStar.

I am most interested in working on human/computer interaction, especially as it relates to art creation. My goals in getting more involved with VR programming are to reduce the barriers for people to create in VR and to build tools that make it fun and intuitive. I am interested in developing approaches that really disrupt the way people think about content creation. In order to do this, work has to be done on better input methods, specifically better tracking of head and hands.

Road to VR: Would you describe yourself as a VR enthusiast? When did you become aware of the Oculus Rift?

Tony: Growing up in the 80's I was always interested in the idea of VR, but there was never any way for me to really be "enthusiastic" about something I could never experience much. I heard about the Rift from some friends who saw John Carmack's new goggles at E3 last year. I was immediately on Google, finding MTBS3D.com and educating myself on all the details. I woke up every morning and checked Kickstarter for the Rift, and finally on August 1st I was able to be the 22nd backer.

Road to VR: What interests you about the Oculus Rift and where do you think it might lead the games industry? Is there one game in particular you're interested in seeing 'in' the Rift?

Tony: What interests me about the Rift is the ability to put the user and the computer in the same "space".
When I watch Iron Man and see Tony Stark physically interacting with all his data and models, I get very excited about being able to work like that. I have always loved the Kinect and [PlayStation] Move, but to use them effectively I have to be 6 to 8 feet from the screen. Bringing that screen up to my face eliminates all that, and suddenly all that technology becomes incredibly more compelling.

Minecraft, Skyrim, and any good flight and driving simulators would be fantastic. I would love to see an MMO in VR, but it would have to be designed with much less emphasis on UI and keyboard interaction than most currently are. I have thought a lot about these challenges and doing it well really does require significant design time.

Road to VR: Do you think there are significant implications for non-gaming interfaces and applications? Is there anything in particular you'd be interested in seeing?

Tony: In terms of non-gaming interfaces, back in January I started playing with the Hydra in an attempt to track my head and create a virtual window into my computer. You can see the video here:

Road to VR: The Razer Hydra has been out for quite some time (and some would say ahead of its time). Do you feel it might be about to enter a renaissance with the reinvigoration of virtual reality?

Tony: Tracking with the Hydra was a pretty effective test. I had tried using OpenCV to face track, but the processing was too slow. The Hydra seemed to have no computational overhead, but one thing I did notice was that the aluminum frame of the Macbook Pro definitely warped the magnetic field. Different areas of the house or desk also affected it. This is the primary weakness of the Hydra, but because typical head motions are limited in range and speed, and because there are not typically a lot of metal objects near your head, I feel it has potential.

Road to VR: What interests you about the Razer Hydra?

Tony: The Hydra interests me because it is a true 3-dimensional input device and it has buttons. I have played with many types of gestural input schemes that read hands, but they don't give reliable results. My mouse button ALWAYS clicks, it always drags, and it always stops moving when I remove my hand. Without physical buttons, I have never been able to make any other gestural scheme work as effectively. If it's frustrating and inconsistent then it will not replace the mouse.

Road to VR: What next for you and the Oculus Rift? Any special projects you'd like to share with us?

Tony: My next project is going to be with the Kinect and the Rift. I have all the code now to merge the two devices, so once I get some time that is what I plan to play with.

Head over to page 2, where Tony walks us through the code changes to add Hydra positional tracking to the Oculus SDK's Tuscany demo.

How to Add Positional Tracking to the Oculus Tuscany Demo with the Razer Hydra

Before you begin you'll need: Visual Studio 2010 (Express) and the DirectX SDK (June 2010).

1. Get the Oculus SDK: here. Get the Sixense SDK: here.

2. Compile and test the samples. In this case we are using OculusWorldDemo as our starting point.

3. Load the OculusSDK\Samples\LibOVR_Samples_Msvc2010.sln file and make sure OculusWorldDemo is the Startup Project.

4. Add the include directories and libraries for the Razer Hydra:
OculusWorldDemo > Properties:
C/C++ > General > Additional Include Directories: (add) C:\dev\SixenseSDK_062612\include (or wherever you installed the SDK)
Linker > General > Additional Library Directories: (add) C:\dev\SixenseSDK_062612\lib\win32\release_dll (or debug_dll if you are in debug mode)
Linker > Input > Additional Dependencies: (add) sixense.lib sixense_utils.lib

5. Add the Sixense header files to the top of OculusWorldDemo.cpp:

[csharp]
// Sixense SDK headers (the exact include list was lost in formatting;
// this set follows the Sixense SDK samples, so adjust to your SDK layout)
#include <sixense.h>
#include <sixense_math.hpp>
#include <sixense_utils/derivatives.hpp>
#include <sixense_utils/button_states.hpp>
#include <sixense_utils/controller_manager/controller_manager.hpp>
[/csharp]

6. Add some protected variables to the OculusWorldDemoApp class:

[csharp]
sixenseAllControllerData acd;
float hand_lf_x, hand_lf_y, hand_lf_z, hand_rt_x, hand_rt_y, hand_rt_z;
float hand_rt_offset_x, hand_rt_offset_y, hand_rt_offset_z;
float hand_lf_offset_x, hand_lf_offset_y, hand_lf_offset_z;
[/csharp]

7. Find...

[csharp]OculusWorldDemoApp::OnStartup[/csharp]

...and add the Hydra setup to the end:

[csharp]
// Init sixense
sixenseInit();

// Init the controller manager. This makes sure the controllers are present,
// assigned to left and right hands, and that the hemisphere calibration is complete.
sixenseUtils::getTheControllerManager()->setGameType( sixenseUtils::ControllerManager::ONE_PLAYER_TWO_CONTROLLER );
[/csharp]

8. Find...

[csharp]void OculusWorldDemoApp::OnIdle()[/csharp]

...and add this to the end of the [csharp]if (pSensor)[/csharp] block:

[csharp]
// poll the Hydra for the newest controller data
// (the original listing assumes acd is filled somewhere in OnIdle)
sixenseGetAllNewestData( &acd );

float scale = .0015f;

// read the left controller and save the position values
hand_lf_x = scale * acd.controllers[1].pos[0];
hand_lf_y = scale * acd.controllers[1].pos[1];
hand_lf_z = scale * acd.controllers[1].pos[2];

hand_lf_offset_y += scale * acd.controllers[1].joystick_y;
// square the stick value for finer control, restoring its sign afterwards
hand_lf_offset_x += acd.controllers[0].joystick_x * acd.controllers[0].joystick_x * (acd.controllers[0].joystick_x > 0 ? .1f : -.1f);
hand_lf_offset_z += acd.controllers[0].joystick_y * acd.controllers[0].joystick_y * (acd.controllers[0].joystick_y > 0 ? .1f : -.1f);
[/csharp]

9. Farther down in:

[csharp]OculusWorldDemoApp::OnIdle()[/csharp]

...right before the line:

[csharp]View = Matrix4f::LookAtRH(shiftedEyePos, shiftedEyePos + forward, up);[/csharp]

...add the Hydra's offset values to the eye position:

[csharp]
// calculate the player's body rotation
Matrix4f yawRotate = Matrix4f::RotationY(Player.BodyYaw);

// get the hydra's offset from the last reset position
Vector3f eye_hydra( hand_lf_offset_x - hand_lf_x,
                    hand_lf_y - hand_lf_offset_y,
                    hand_lf_offset_z - hand_lf_z );

// adjust for body rotation so looking forward is always forward
Vector3f eye_orientationVector = yawRotate.Transform(eye_hydra);

// add the new eye offset to the base eye position
shiftedEyePos += eye_orientationVector;
[/csharp]

One problem with the early tests was that the Hydra would not track correctly if you rotated your head sideways and then looked in that direction. This is because the Hydra records its offset in world space and does not know that your entire body has been rotated in the scene. Keeping track of BodyYaw, and then rotating the Hydra translations by that rotation value, keeps everything lined up properly. Pressing the trigger on the right Hydra controller toggles between strafe mode and rotate mode.

The head "modeling" in the original demo works by offsetting the viewer's eyes in front of the rotation point of the head. This is very much like how the Hydra BodyYaw calculations are done. However, with the Hydra being on the back of the head, opposite the eyes, it effectively undoes the head modeling. To compensate for this, and get our nice head/eye modeling back, we have to increase the length of headBaseToEyeProtrusion. You will see my adjustment variable hydra_head_offset = 2.5 that I use to scale this value. Play with this to get different amounts of head/eye modeling effect.
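For readers hunting for where that hooks in, here is a minimal sketch of what the scaling might look like, assuming the stock demo's eyeCenterInHeadFrame construction and constant names (a reconstruction, not Tony's verbatim change):

[csharp]
// Hypothetical reconstruction, not Tony's exact code: scale the demo's
// eye-protrusion constant so the head/eye modeling survives the Hydra
// sitting on the back of the head. headBaseToEyeHeight and
// headBaseToEyeProtrusion are the stock Tuscany demo's own constants.
const float hydra_head_offset = 2.5f;
Vector3f eyeCenterInHeadFrame( 0.0f, headBaseToEyeHeight,
                               -hydra_head_offset * headBaseToEyeProtrusion );
[/csharp]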
You can download the compiled binary and the source below:

[button color="orange" url="https://mega.co.nz/#!q4wUBSaQ!QiQRWzIdpJiPtZtMihAIoCKgfkUh-ijum5L-9_MXdI8"]Download HydraHeadTracker.zip Here[/button]

Disclaimer: Files are downloaded at the user's own risk; Road to VR accepts no liability for any damage done as a result of use or misuse of the file(s) provided.

NOTES for the download: The right Hydra stick moves the player; holding down the trigger toggles between rotate mode and strafe mode. Pressing the left bumper resets the head tracker, so if you need more forward range, just lean back and press the bumper again.

A huge "thank you!" to Tony for his hard work and for allowing us to share it. You can discuss this article further over on our Developer Forums. You can catch Tony over at his YouTube channel here.
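A footnote for anyone rebuilding the project rather than grabbing the binary: the walkthrough never shows the left-bumper reset described in the notes above, but given the offset variables from step 6 it is likely little more than snapshotting the current tracked position. A minimal sketch, assuming the Sixense SDK's button flags and the controller index used in step 8 (our guess at the logic, not Tony's code):

[csharp]
// Hypothetical sketch of the left-bumper reset (inside OnIdle, after the
// Hydra has been polled): snapshot the current tracked position so the
// current pose becomes the new neutral. SIXENSE_BUTTON_BUMPER is the SDK's
// bumper flag; controllers[1] matches the head-mounted unit in step 8.
if ( acd.controllers[1].buttons & SIXENSE_BUTTON_BUMPER )
{
    hand_lf_offset_x = hand_lf_x;
    hand_lf_offset_y = hand_lf_y;
    hand_lf_offset_z = hand_lf_z;
}
[/csharp]

With the offsets equal to the current position, the eye_hydra vector from step 9 collapses to zero at the moment of the press, which matches the "lean back and press the bumper again" behaviour the notes describe.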