Oculus VR’s ‘Best Practices Guide’ collects the latest research on comfortable VR game design into one place to give developers a head start. As the first Oculus Rift DK2 units are shipping out, the company has updated the guide with a new section focusing on positional tracking.
Make no mistake, the Oculus Rift DK2 is a development kit. Although incredible progress has been made since the Oculus Rift’s phenomenally successful Kickstarter in 2012, VR is still akin to the Wild West, and developers are the early pioneers on a quest to settle the newfound territory—there is still much to be discovered.
To ensure developers don’t stumble over the same mistakes, Oculus VR’s ‘Best Practices Guide’ serves as a jumping-off point for designing VR games free from sim sickness (nausea). The document was originally released in January 2014, but the company stresses that it is a living document, as there is always more to learn. On the eve of the Oculus Rift DK2 shipping out to developers, the latest version of the document has just been released. User Gadren from Reddit’s Oculus community points out that it includes a new section all about positional tracking.
Positional tracking, a feature new to the Oculus Rift DK2, tracks the position of the user’s head as they lean in any direction; the Oculus Rift DK1 could track only rotational movement. Positional tracking significantly improves the experience by allowing users to naturally lean in close to inspect things or lean around corners and over cover, and it generally helps reduce sim sickness by more closely matching what the brain expects to see.
See Also: An Introduction to Positional Tracking and Degrees of Freedom (DOF)
The 51-page document begins with an executive summary that we recommend all VR developers read. Here’s the new section on positional tracking for those interested:
- The rendered image must correspond directly with the user’s physical movements; do not manipulate the gain of the virtual camera’s movements. A single global scale on the entire head model is fine (e.g. to convert feet to meters, or to shrink or grow the player), but do not scale head motion independent of inter-pupillary distance (IPD).
- With positional tracking, users can now move their viewpoint to look in places you might not have expected them to, such as under objects, over ledges, and around corners. Consider your approach to culling and backface rendering, etc.
- Under certain circumstances, users might be able to use positional tracking to clip through the virtual environment (e.g., put their head through a wall or inside objects). Our observation is that users tend to avoid putting their heads through objects once they realize it is possible, unless they see an opportunity to exploit the game design by doing so. Regardless, developers should plan for how to handle the camera clipping through geometry. One approach to the problem is to trigger a message telling them they have left the camera’s tracking volume (though they may technically still be within the camera frustum).
- Provide the user with warnings as they approach (but well before they reach) the edges of the positional tracking camera’s tracking volume, as well as feedback on how they can reposition themselves to avoid losing tracking.
- We recommend you do not leave the virtual environment displayed on the Rift screen if the user leaves the camera’s tracking volume, at which point positional tracking is disabled. It is far less discomforting to have the scene fade to black or otherwise attenuate the image (such as dropping brightness and/or contrast) before tracking is lost. Be sure to provide the user with feedback that indicates what has happened and how to fix it.
- Augmenting or disabling position tracking is discomforting. Avoid doing so whenever possible, and darken the screen or at least retain orientation tracking using the SDK head model when position tracking is lost.
The document goes into more detail on positional tracking in Appendix F, starting at the bottom of page 23.
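For developers wondering how the fade-out guidance above might look in practice, here’s a minimal sketch against the DK2-era LibOVR 0.4.x C API. The `ovrHmd_GetTrackingState` call and the `ovrStatus_*` flags come from the SDK; `FadeSceneToBlack`, `ShowTrackingWarning`, and `RestoreSceneBrightness` are hypothetical application-side helpers standing in for whatever your engine provides.

```cpp
#include <OVR_CAPI.h>

// Hypothetical application-side helpers (not part of the Oculus SDK).
void FadeSceneToBlack();
void ShowTrackingWarning();
void RestoreSceneBrightness();

// Call once per frame to react to the positional tracking camera losing
// sight of the headset (e.g. the user leaned outside the tracking volume).
void UpdateTrackingFeedback(ovrHmd hmd)
{
    ovrTrackingState ts = ovrHmd_GetTrackingState(hmd, ovr_GetTimeInSeconds());

    bool cameraConnected = (ts.StatusFlags & ovrStatus_PositionConnected) != 0;
    bool positionTracked = (ts.StatusFlags & ovrStatus_PositionTracked) != 0;

    if (cameraConnected && !positionTracked)
    {
        // Positional tracking has dropped out: attenuate the image rather
        // than leaving the full scene on screen, and tell the user how to recover.
        FadeSceneToBlack();
        ShowTrackingWarning();
    }
    else
    {
        RestoreSceneBrightness();
    }
}
```

Warning the user before tracking is actually lost, as the guide recommends, takes a little more work, since it means comparing the tracked head position against the camera’s frustum rather than waiting for the status flag to clear.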