
Creating Games and Software with Cross-VR Headset Compatibility

With multiple consumer VR headsets now on the horizon, many users will soon be choosing one over the others. Will it be difficult for developers to support multiple VR headsets from a single application, rather than creating custom distributables for each individual headset?

This guest article is written by Yuval Boger, CEO of Sensics, a Maryland-based company specializing in professional-level head mounted displays since 2003. Boger also blogs about VR on his personal blog, VRguy.net.


Let’s assume that you wrote a game that runs on the Oculus Rift. You are starting to hear about other VR headsets, such as Sony’s Project Morpheus, and ask yourself: how difficult would it be to modify my game to support these other HMDs? The good news, for developers and users alike, is that it shouldn’t be very hard.

Porting a game to work in VR for the first time probably took some work: creating two separate, non-overlapping view frustums, understanding the optical distortion function and how to implement it, positioning the camera correctly, remapping the location of menu items to a visible spot, changing the interaction model to suit an immersive display, and more. Adding support for another VR headset on a similar computing platform (e.g. PC) would likely include the following:

  • Change the optical parameters: address a somewhat different field of view and likely a different optical distortion correction function (or no correction required at all)
  • Change the display parameters: potentially different resolution and screen orientation
  • Accept tracking data in a different format: while the API may differ, head trackers ultimately return quaternion or yaw/pitch/roll data, so the application needs to handle these representations regardless of the particular format (a minimal conversion sketch follows this list). Hand position trackers (such as the PlayStation Move) fundamentally provide XYZ coordinates, and other position trackers would do the same.
  • If the VR headset has built-in speakers: modify the model for directional audio to take into account that the position of the headphones moves with the head, as opposed to static speakers on a desktop.
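To make the tracking point concrete: whatever format a vendor’s API reports, the game can convert it once into a single internal representation and use that everywhere else. The sketch below is a minimal illustration (the struct and function names are my own, not tied to any specific SDK) that converts a unit quaternion into yaw/pitch/roll angles using the standard Tait-Bryan formulas:

```cpp
#include <cmath>

// A vendor-neutral orientation sample; field names are illustrative.
struct Orientation {
    double w, x, y, z;  // unit quaternion as reported by some tracker API
};

struct EulerAngles {
    double yaw, pitch, roll;  // radians
};

// Convert a unit quaternion to Tait-Bryan yaw/pitch/roll (Z-Y-X order).
EulerAngles ToEuler(const Orientation& q) {
    EulerAngles e;
    e.roll  = std::atan2(2.0 * (q.w * q.x + q.y * q.z),
                         1.0 - 2.0 * (q.x * q.x + q.y * q.y));
    double s = 2.0 * (q.w * q.y - q.z * q.x);
    s = std::fmax(-1.0, std::fmin(1.0, s));   // clamp against rounding error
    e.pitch = std::asin(s);
    e.yaw   = std::atan2(2.0 * (q.w * q.z + q.x * q.y),
                         1.0 - 2.0 * (q.y * q.y + q.z * q.z));
    return e;
}
```

With a normalization step like this at the boundary, the rest of the game never needs to know which headset produced the data.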

Application developers would be well served to create or use a hardware abstraction layer that separates the what (e.g. positional data) from the how (the specific API by which that data is obtained). I’ve previously written about the need for such an abstraction layer, similar to VRPN for motion tracking. It would also be great to have some kind of open-format descriptor for each VR headset, to be read by the application, the game engine, or the operating system. Valve’s VR SDK, for instance, is an excellent step in the direction of supporting multiple VR platforms.
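As a rough illustration of that separation, the game can code against a small abstract interface (the “what”) while vendor-specific adapters hide each SDK behind it (the “how”). Everything below is a hypothetical sketch; the class names and the commented vendor calls are placeholders, not real SDK functions:

```cpp
// Vendor-neutral orientation sample, as in the earlier sketch.
struct Orientation {
    double w, x, y, z;  // unit quaternion
};

// The "what": the data the game needs, independent of any vendor API.
class IHeadTracker {
public:
    virtual ~IHeadTracker() = default;
    virtual Orientation GetOrientation() const = 0;  // latest head pose
};

// The "how": one adapter per headset SDK.
class RiftHeadTracker : public IHeadTracker {
public:
    Orientation GetOrientation() const override {
        // ...query the Oculus SDK and convert its sensor state here...
        return Orientation{1.0, 0.0, 0.0, 0.0};
    }
};

class MorpheusHeadTracker : public IHeadTracker {
public:
    Orientation GetOrientation() const override {
        // ...query the Sony SDK and convert its sensor state here...
        return Orientation{1.0, 0.0, 0.0, 0.0};
    }
};

// Game code depends only on the interface; supporting a new headset
// means adding an adapter, not touching the render loop.
void UpdateCamera(const IHeadTracker& tracker) {
    Orientation head = tracker.GetOrientation();
    // ...apply 'head' to the view matrix...
    (void)head;
}
```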

How should we prepare for the forthcoming wave of VR headsets?

  • If you are a game developer, prepare your code to support multiple VR headsets and, if applicable, publish the descriptor that would allow this to happen (an illustrative descriptor follows this list). By doing so, you are likely enlarging the market for your product.
  • If you are a gamer, make your opinion known so that engine and game developers will work towards a multi-platform experience.
  • If you make VR headsets, publish their characteristics so that those who write descriptors for them have all the data that they need.
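To make the descriptor idea concrete, one can imagine each headset shipping with a small machine-readable file listing its optical and display characteristics, which the abstraction layer loads at startup. The structure and values below are purely illustrative; they do not describe any real headset or published format:

```cpp
#include <string>
#include <vector>

// Hypothetical per-headset descriptor; in practice this might be
// serialized as JSON or XML and shipped alongside the device driver.
struct HmdDescriptor {
    std::string model;
    int displayWidthPx;          // full panel resolution
    int displayHeightPx;
    double horizontalFovDeg;     // per-eye field of view
    double verticalFovDeg;
    double interLensDistanceMm;  // lens separation
    // Radial distortion polynomial coefficients; an empty list means
    // the optics need no software correction at all.
    std::vector<double> distortionCoefficients;
};

// Example values for an imaginary device, for illustration only.
const HmdDescriptor kExampleHmd{
    "ExampleHMD-1",
    1920, 1080,
    90.0, 100.0,
    63.5,
    {1.0, 0.22, 0.24},  // k0, k1, k2
};
```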

After all, why limit the market potential to a single type or vendor?
