Oculus VR Inc has released a draft Best Practices guide for VR developers that, in their words, seeks to maximize: 1) oculomotor comfort, 2) bodily comfort, and 3) a positive user experience. The document can be found on the Oculus VR web site. Alongside its recommendations for developers, it also lowers the minimum recommended age for using the Oculus Rift from 10 to 7.
My wife played the early Rift Coaster demo for around 15 seconds before ripping the Rift off her head and swearing the whole thing off. My 4-year-old son, meanwhile, seems unaffected by the dreaded “VR sickness.” Oculus appears to recognize that while no two people experience virtual reality the same way, there are steps that can be taken to minimize (or even eliminate) negative feedback from the brain, resulting in a more positive and pleasurable VR experience.
Some of the major points in the document are well known to the Road to VR crowd, and are reflected in the Rift roadmap (based on recent interviews). Oculus says that “code should run at a minimum 60fps” and that developers should “target 20ms or less motion-to-photon latency.” Beyond that, the document is full of advice that may seem obvious but needs reinforcement.
It’s imperative that head tracking work at all times, even in cut scenes and pause screens. It’s also important that the laws of “real life” are observed as closely as possible: “The camera should rotate and move in a manner consistent with head movements,” for example. Also avoid rapid acceleration and deceleration, which “creates a mismatch between your visual and vestibular senses.” It’s unnatural to go from 0 to 60 miles per hour in 2.5 seconds and not feel the inertia of that experience; the discord between your eyes and your other senses can cause confusion.
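To make that 1:1 rule concrete, here’s a minimal sketch in C++ (the types and names are illustrative stand-ins, not the Oculus SDK’s actual API): the per-frame camera update takes the tracked head orientation verbatim, with nothing in between.

```cpp
// Hypothetical minimal quaternion; any math library's equivalent works.
struct Quat { float w = 1.0f, x = 0.0f, y = 0.0f, z = 0.0f; };

struct Camera { Quat orientation; };

// Per-frame update: the camera takes the tracked head orientation
// verbatim. No smoothing, damping, or sensitivity multiplier -- any of
// those breaks the 1:1 mapping between real head movement and the view.
void UpdateCamera(Camera& cam, const Quat& trackedHeadOrientation)
{
    cam.orientation = trackedHeadOrientation;
}
```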
The other items seem fairly common-sense once you understand the experience you’re trying to deliver. Don’t put scoreboards, HUDs, and other dialogs too close or too far away; they should sit at least 50cm from the user. If you attach a user’s view to a camera, don’t execute fast zooms or unexpected pans or tilts. Try to map the user’s view of the game world as closely as possible to the data coming from the Rift. If the brain detects a disconnect between action and reaction (either consciously or subconsciously), you’ve already lost and are creating disharmony.
While the document is still considered a draft, with a footer reading “Pending legal and medical expert review,” Oculus has reduced their minimum age recommendation from 10 years (according to the Oculus Rift DK1 manual) to 7 years. More Oculus Rift developer resources (including the SDK) can be found at the company’s developer portal: developer.oculusvr.com.
I’d highly recommend reading over the Best Practices document, even if you’re not a developer. These are the key items that will likely appear in all of the high-quality VR experiences over the next few years. Oculus has done much of the heavy lifting for developers by creating this detailed document, though they seem to realize that 39 pages might be a bit dense; as such, they’ve also included an Executive Summary of their findings and recommendations, which developers will find useful and enthusiasts may find interesting:
Oculus VR Inc’s Executive Summary of Best Practices for Virtual Reality Development
Rendering
- Use the Oculus VR distortion shaders. Approximating your own distortion solution, even when it “looks about right,” can still be discomforting for users.
- Get the projection matrix exactly right. Any deviation from the optical flow that accompanies real world head movement creates oculomotor and bodily discomfort. (See the sketch after this list.)
- Maintain VR immersion from start to finish – don’t make the user look at static images fixed in place relative to their eyes.
- Avoid displaying completely different content to each eye. If you have rendering effects, make sure they resolve across both eyes and do not contain random differences, as the brain will not fuse the image properly.
- Consider supersampling and anti-aliasing to remedy low apparent resolution, which is at its worst at the center of each eye.
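The projection-matrix bullet above deserves an illustration. The sketch below is the standard off-axis (asymmetric-frustum) construction from generic graphics math, not Oculus’s own code; in practice the per-eye half-FOV tangents come from the SDK rather than being hand-picked.

```cpp
#include <array>

using Mat4 = std::array<std::array<float, 4>, 4>; // row-major

// Off-axis perspective projection from per-eye half-FOV tangents
// (OpenGL clip-space convention). In a stereo HMD each eye's frustum
// is not centered on its view axis, so the four tangents generally
// differ; a symmetric "looks about right" matrix will deviate from
// the optical flow of real head movement.
Mat4 OffAxisProjection(float tanLeft, float tanRight,
                       float tanUp, float tanDown,
                       float zNear, float zFar)
{
    Mat4 p{}; // all elements start at zero
    p[0][0] = 2.0f / (tanRight + tanLeft);
    p[0][2] = (tanRight - tanLeft) / (tanRight + tanLeft);
    p[1][1] = 2.0f / (tanUp + tanDown);
    p[1][2] = (tanUp - tanDown) / (tanUp + tanDown);
    p[2][2] = -(zFar + zNear) / (zFar - zNear);
    p[2][3] = -2.0f * zFar * zNear / (zFar - zNear);
    p[3][2] = -1.0f;
    return p;
}
```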
Minimizing Latency
- Your code should run at a minimum 60fps, v-synced and unbuffered. Lag and dropped frames are discomforting in VR.
- Ideally, target 20ms or less motion-to-photon latency. Organize your code to minimize the time from sensor fusion (reading the Rift sensors) to rendering.
- Use the SDK’s predictive tracking, making sure you feed in an accurate time parameter into the function call.
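The SDK’s predictive tracking does this work for you; the sketch below only illustrates the underlying idea, assuming a sensor that reports orientation plus angular velocity. The helper types are illustrative, not the SDK’s.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };
struct Quat { float w, x, y, z; };

// Hamilton product of two quaternions.
Quat Multiply(const Quat& a, const Quat& b)
{
    return { a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
             a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
             a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
             a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w };
}

// Extrapolate the head orientation forward by predictionSeconds using
// the sensor's angular velocity (radians/sec, world frame). The time
// parameter should be the genuine sensors-to-photons interval:
// overstating it causes overshoot, understating it reintroduces lag.
Quat PredictOrientation(const Quat& current, const Vec3& angularVelocity,
                        float predictionSeconds)
{
    float speed = std::sqrt(angularVelocity.x * angularVelocity.x +
                            angularVelocity.y * angularVelocity.y +
                            angularVelocity.z * angularVelocity.z);
    if (speed < 1e-6f)
        return current; // head is effectively still

    float angle = speed * predictionSeconds;  // radians to rotate ahead
    float s = std::sin(angle * 0.5f) / speed; // scales axis into quat
    Quat delta = { std::cos(angle * 0.5f),
                   angularVelocity.x * s,
                   angularVelocity.y * s,
                   angularVelocity.z * s };
    return Multiply(delta, current); // apply world-frame delta rotation
}
```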
Optimization
- Decrease render buffer resolution to save video memory and increase frame-rate.
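A rough sketch of what that option looks like in practice; the pixel-density knob and its useful range are my illustration, not figures from the guide.

```cpp
#include <cmath>

struct RenderTargetSize { int width, height; };

// Render the per-eye buffers at a fraction of the recommended size and
// let the distortion pass scale them to the display. 1.0 is full
// quality; values below it trade sharpness for frame rate, which in VR
// is usually the right trade when you can't hold v-synced 60fps.
RenderTargetSize ScaledEyeBuffer(int recommendedW, int recommendedH,
                                 float pixelDensity)
{
    return { static_cast<int>(std::lround(recommendedW * pixelDensity)),
             static_cast<int>(std::lround(recommendedH * pixelDensity)) };
}
```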
Head-tracking and Viewpoint
- Avoid features that upset the user’s sense of stability in their environment. Rotating or moving the horizon line or other large components of the user’s environment can be discomforting to the user.
- The display should respond to head-tracking and viewpoint changes at all times, without exception. Even when the game is paused or displaying a cutscene, users should be able to look around.
- The camera should rotate and move in a manner consistent with head movements; discrepancies are discomforting.
Accelerations
- Acceleration creates a mismatch between your visual and vestibular senses; minimize the duration and frequency of such conflicts. Make accelerations as short (preferably instantaneous) and infrequent as you can. (See the sketch after this list.)
- Remember that “acceleration” does not just mean speeding up while going forward; it refers to any change in the motion of the user. Slowing down or stopping, turning while moving or standing still, and stepping or getting pushed sideways are all forms of acceleration.
- Have accelerations initiated and controlled by the user whenever possible. Shaking, jerking, or bobbing the camera will be uncomfortable for the player.
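A minimal sketch of the “short, preferably instantaneous” rule from the list above (the names are illustrative): snap to the new velocity in one frame instead of easing toward it.

```cpp
struct Vec3 { float x, y, z; };

// Player-initiated speed change. The comfortable choice in VR is the
// counterintuitive one: jump straight to the target velocity so the
// visual-vestibular conflict lasts a single frame.
void ApplyMoveInput(Vec3& velocity, const Vec3& wishDirection,
                    float targetSpeed)
{
    velocity = { wishDirection.x * targetSpeed,
                 wishDirection.y * targetSpeed,
                 wishDirection.z * targetSpeed };
    // Avoid the monitor-friendly version, e.g.
    //   velocity = Lerp(velocity, wish * targetSpeed, 0.1f);
    // -- a smooth ramp is a sustained acceleration.
}
```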
Speed
- Users are most comfortable moving through virtual environments at real-world speeds; a walking rate of 1.4 m/s works best (see the sketch after this list).
- Movement in one direction while looking in another direction can be disorienting. Minimize the necessity for the user to look away from the direction of travel, particularly when moving faster than a walking pace.
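In code, the speed guidance can be as simple as a clamp. The comfort multiplier below is a hypothetical option of my own, not something the guide prescribes.

```cpp
#include <algorithm>

// ~1.4 m/s is a real-world walking pace, the most comfortable speed
// for virtual travel according to the guide.
constexpr float kWalkSpeedMetersPerSec = 1.4f;

// Users who have acclimatized can raise the (hypothetical) comfort
// multiplier above 1.0; it should default to 1.0, the gentlest option.
float ClampTravelSpeed(float requestedSpeed, float comfortMultiplier)
{
    return std::min(requestedSpeed,
                    kWalkSpeedMetersPerSec * comfortMultiplier);
}
```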
Cameras
- Subjecting the user to any change in perspective can induce uncomfortable feelings of movement and acceleration. Even a seemingly trivial movement—such as shifting the camera slightly to center it over a gun’s sight when entering an aiming mode—can be disorienting and uncomfortable.
- Zooming in or out with the camera can induce or exacerbate simulator sickness, particularly if it causes camera movement rates to differ from head movements.
- For third-person content, be aware that camera accelerations and movements can induce nausea regardless of what your avatar is doing. Furthermore, users must always have the freedom to look around the environment, which can add new requirements to the design of your content.
- Apply the Oculus head model to create an accurate and comfortable visual frame of reference (see the sketch after this list).
- Avoid using Euler angles whenever possible; quaternions are preferable. Try looking straight up and straight down to test your camera; it should always be stable and consistent with your head orientation.
- Do not use “head bobbing”; it creates a series of small but uncomfortable vertical accelerations.
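The head-model and quaternion bullets above are easiest to see in code. The sketch below is not the SDK’s implementation, and the neck-to-eye offsets are illustrative approximations; the point is that pure head rotation also translates the eye point, exactly as a real neck does.

```cpp
struct Vec3 { float x, y, z; };
struct Quat { float w, x, y, z; };

// Rotate a vector by a unit quaternion:
// v' = v + 2w*(q x v) + 2*(q x (q x v)).
Vec3 Rotate(const Quat& q, const Vec3& v)
{
    Vec3 t = { 2.0f * (q.y * v.z - q.z * v.y),
               2.0f * (q.z * v.x - q.x * v.z),
               2.0f * (q.x * v.y - q.y * v.x) };
    return { v.x + q.w * t.x + (q.y * t.z - q.z * t.y),
             v.y + q.w * t.y + (q.z * t.x - q.x * t.z),
             v.z + q.w * t.z + (q.x * t.y - q.y * t.x) };
}

// Simple neck-pivot head model: the eyes sit above and in front of the
// neck joint (metres; y up, -z forward; values illustrative), so head
// rotation produces the small eye translation a real neck produces.
Vec3 EyePosition(const Vec3& neckPivot, const Quat& headOrientation)
{
    const Vec3 neckToEye = { 0.0f, 0.15f, -0.09f };
    Vec3 r = Rotate(headOrientation, neckToEye);
    return { neckPivot.x + r.x, neckPivot.y + r.y, neckPivot.z + r.z };
}
```

Working in quaternions throughout, as here, also sidesteps the gimbal problems that make Euler-angle cameras misbehave when the user looks straight up or straight down.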
Managing and Testing Simulator Sickness
- Test your content with a variety of unbiased users to ensure it is comfortable to a broader audience. As a developer, you are the worst test subject: repeated exposure to and familiarity with the Rift and your content makes you much less susceptible to simulator sickness or content distaste than a new user.
- People’s responses and tolerance to sickness vary, and visually induced motion sickness occurs more readily in virtual reality headsets than with computer or TV screens. Your audience will not “muscle through” an overly intense experience.
- As part of your user-configurable options, offer a “monoscopic display” mode that sets inter-camera distance to zero (i.e., presents the same image to both eyes). This can reduce simulator sickness and eyestrain for sensitive users (see the sketch after this list).
- Consider implementing mechanisms that allow users to adjust the intensity of the visual experience. This will be content-specific, but adjustments might include movement speed, the size of accelerations, or the breadth of the displayed FOV. Any such settings should default to the lowest-intensity experience.
- For all user-adjustable settings related to simulator sickness management, users may want to change them on-the-fly (for example, as they become accustomed to VR or become fatigued). Whenever possible, allow users to change these settings in-game without restarting.
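The monoscopic option mentioned above reduces to very little code. A minimal sketch, assuming your renderer positions each eye camera by a lateral offset:

```cpp
struct Vec3 { float x, y, z; };

// Horizontal offset of each eye camera from the head centre. With
// monoscopic mode enabled the separation collapses to zero: both eyes
// get an identical image, head tracking and field of view are
// unchanged, and stereo disparity is removed for sensitive users.
Vec3 EyeOffset(bool monoscopicMode, float userIpdMeters, int eyeIndex)
{
    float separation = monoscopicMode ? 0.0f : userIpdMeters;
    float side = (eyeIndex == 0) ? -1.0f : 1.0f; // 0 = left, 1 = right
    return { side * separation * 0.5f, 0.0f, 0.0f };
}
```

The same function shows why the profile IPD, and only the profile IPD, should set the separation in the stereoscopic case: any other value rescales the perceived world.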
In-game Impacts and Collisions
- Do not move the camera without the user’s control (even brief shaking or jerking) during impacts and collisions. These are unexpected, uncontrolled changes to acceleration, orientation, or rotation, and they create discomfort.
- Consider settings for user-adjustable camera behavior; lower settings would not allow impacts and collisions to affect the camera, whereas higher settings would.
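A sketch of such a setting; the variable name and the 0-to-1 range are my own illustration. Whatever shake the impact system produces is scaled by a user-controlled factor that defaults to zero, per the guide’s lowest-intensity-by-default advice.

```cpp
#include <algorithm>

struct Vec3 { float x, y, z; };

// User-facing comfort option: 0 = impacts never move the camera,
// 1 = full monitor-style impact shake. Defaults to the gentlest value.
float g_impactCameraIntensity = 0.0f;

Vec3 ImpactCameraOffset(const Vec3& rawShake)
{
    float k = std::clamp(g_impactCameraIntensity, 0.0f, 1.0f);
    return { rawShake.x * k, rawShake.y * k, rawShake.z * k };
}
```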
Degree of Stereoscopic Depth (“3D-ness”)
- For individualized realism and a correctly scaled world, set left and right eye camera separation to the IPD from the user’s profile. Note that realism has its downside: Beyond a fairly close range, you will perceive little stereoscopic 3D. Resist the temptation to increase the inter-camera distance to enhance the stereoscopic depth effect.
- Avoid placing static visual elements that persist in the user’s view (such as a HUD) closer than 50 cm from the user in the virtual world. Converging the eyes on such close objects can cause eye strain and even make clearly rendered objects appear blurry. Users might still choose to position themselves closer to environmental objects, but you should avoid forcing them into such situations when possible.
User Interface
- UIs should be a 3D part of the virtual world and ideally sit at least 50 cm away from the viewer, even if that means simply drawing them onto a flat polygon, cylinder, or sphere that floats in front of the user.
- Don’t require the user to swivel their eyes in their sockets to see the UI. Ideally, your UI should fit entirely inside the middle third of the screen; if it can’t, users should be able to examine it with head movements.
- Use caution for UI elements that move or scale with head movements (e.g., a long menu that scrolls or moves as you move your head to read it). Ensure they respond accurately to the user’s movements and are easily readable without creating distracting motion or discomfort.
- Consider making interface elements intuitive and immersive parts of the 3D world; for example, ammo count might appear on the user’s weapon rather than in a floating HUD.
- Draw any crosshair, reticle, or cursor at the same depth as the object it is targeting; otherwise, it can appear as a blurry and/or doubled image when it is not at the plane of depth on which the eyes are focused. (See the sketch after this list.)
- In general, avoid requiring the user’s eyes to make rapid and frequent adjustments in distance, such as switching focus between a distant object and nearby HUD element.
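The reticle bullet above is simple to implement with a scene query. A sketch, with a stub standing in for your engine’s actual raycast:

```cpp
struct Vec3 { float x, y, z; };

// Stub for your engine's scene query: distance along the gaze ray to
// the first hit, or a far default when nothing is hit.
float RaycastScene(const Vec3& /*origin*/, const Vec3& /*direction*/)
{
    return 100.0f;
}

// Place the crosshair at the depth of whatever it targets, so the eyes
// converge on reticle and target together. A reticle fixed at any
// other depth doubles or blurs whenever the user focuses on the scene
// behind (or in front of) it.
Vec3 ReticleWorldPosition(const Vec3& eyePos, const Vec3& gazeDir)
{
    float hit = RaycastScene(eyePos, gazeDir);
    return { eyePos.x + gazeDir.x * hit,
             eyePos.y + gazeDir.y * hit,
             eyePos.z + gazeDir.z * hit };
}
```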
Controlling the Avatar
- User input devices can’t be seen while wearing the Rift. Allow the use of familiar controllers as the default input method. If a keyboard is absolutely required, keep in mind that users will have to rely on tactile feedback (or trying keys) to find controls.
- Consider using head orientation itself as a direct control or as a way of introducing context sensitivity into your control scheme.
Sound
- When designing audio, keep in mind that the output source follows the user’s head movements when they wear headphones, but not when they use speakers. Allow users to choose their output device in game settings, and make sure in-game sounds appear to emanate from the correct locations by accounting for head position relative to the output device.
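Concretely, headphone output means re-expressing each world-fixed sound source in head-local coordinates every frame; otherwise sources wrongly rotate along with the listener. A minimal sketch, reusing the quaternion rotation from the head-model sketch above:

```cpp
struct Vec3 { float x, y, z; };
struct Quat { float w, x, y, z; };

Quat Conjugate(const Quat& q) { return { q.w, -q.x, -q.y, -q.z }; }

// Rotate a vector by a unit quaternion (same helper as earlier).
Vec3 Rotate(const Quat& q, const Vec3& v)
{
    Vec3 t = { 2.0f * (q.y * v.z - q.z * v.y),
               2.0f * (q.z * v.x - q.x * v.z),
               2.0f * (q.x * v.y - q.y * v.x) };
    return { v.x + q.w * t.x + (q.y * t.z - q.z * t.y),
             v.y + q.w * t.y + (q.z * t.x - q.x * t.z),
             v.z + q.w * t.z + (q.x * t.y - q.y * t.x) };
}

// World -> head frame: feed the result to the (headphone) audio
// renderer so sounds stay pinned to their world locations as the
// user turns their head.
Vec3 HeadRelativeSoundPosition(const Vec3& soundWorldPos,
                               const Vec3& headWorldPos,
                               const Quat& headOrientation)
{
    Vec3 d = { soundWorldPos.x - headWorldPos.x,
               soundWorldPos.y - headWorldPos.y,
               soundWorldPos.z - headWorldPos.z };
    return Rotate(Conjugate(headOrientation), d);
}
```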
Content
- Give the user a body. Looking down and having no body is disconcerting, whereas anything from a vague, ghostlike presence to a full character avatar can do a lot to ground the user in the virtual environment.
- Consider the size and texture of your artwork as you would with any system where visual resolution is an issue (e.g. avoid very thin objects).
- Unexpected vertical accelerations outside the user’s real world movements, like those from walking over uneven or undulating terrain, can create discomfort. Consider flattening your walking surfaces or steadying the user’s viewpoint when traversing such terrain (see the sketch after this list).
- Be aware that your user has an unprecedented level of immersion, and frightening or shocking content can have a profound effect on users (particularly sensitive ones) in a way past media could not. Make sure players receive warning of such content so they can decide whether or not they will be able to handle it.
- In VR, simply looking at interesting shapes and textures can be a fascinating experience. Content that is simply window-dressing when played on a monitor can itself be a major focus of VR.
- Don’t rely entirely on the stereoscopic 3D effect to provide depth to your content; lighting, texture, parallax (the way objects appear to move in relation to each other when the user moves), and other visual features are equally (if not more) important to conveying depth and space to the user.
- Design environments and interactions to minimize the need for strafing, back-stepping, or spinning.
- Steady, forward movement is the most comfortable for users in a virtual environment.
- People will typically move their heads/bodies if they have to shift their gaze to a point farther than 15-20° of visual angle away from where they are currently looking. Avoid forcing the user to make such large shifts to prevent muscle fatigue and strain.
- Don’t forget that the user is likely to look in any direction at any time; make sure they will not see anything that breaks their sense of immersion (such as technical cheats in rendering the environment).
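For the uneven-terrain bullet above, one common approach (my illustration, not a prescription from the guide) is to low-pass filter only the vertical component of the viewpoint while x/z stay locked to the avatar:

```cpp
#include <cmath>

// Frame-rate-independent exponential smoothing of eye height, so bumps
// and undulations in the walking surface don't become sharp vertical
// accelerations of the view. ratePerSecond is a tuning value: higher
// tracks the terrain more tightly, lower smooths more aggressively.
float SmoothedEyeHeight(float currentEyeHeight, float targetEyeHeight,
                        float dtSeconds, float ratePerSecond = 4.0f)
{
    float t = 1.0f - std::exp(-ratePerSecond * dtSeconds);
    return currentEyeHeight + (targetEyeHeight - currentEyeHeight) * t;
}
```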
Health and Safety
- Carefully read and implement the warnings that accompany the Rift (Appendix L) to ensure the health and safety of both you, the developer, and your users.