Software
Road to VR: Describe the process of rendering a 210-degree stereoscopic image for use with the InfinitEye.
InfinitEye: At the moment we use four renders, two per eye. The left eye's renders are centered on the left eye's position: the first is rotated 90° to the left and the second looks straight ahead, forming two faces of a cube. The right eye's renders are centered on its position: the first is rotated 90° to the right and the second looks straight ahead, two faces of another cube. We then process those four renders in a final pass that builds the distorted image.
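To make that four-render setup concrete, here is a minimal sketch using GLM; the kIpd value, the drawScene helper and the sign conventions are illustrative assumptions, not InfinitEye's actual code.

```cpp
// A minimal sketch (not InfinitEye's actual code) of the four-render
// setup described above, using GLM.
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

void drawScene(const glm::mat4& view, const glm::mat4& proj); // assumed helper

const float kIpd = 0.064f; // assumed interpupillary distance in meters

// Each render is one "cube face": a square 90-degree perspective frustum.
glm::mat4 faceProjection()
{
    return glm::perspective(glm::radians(90.0f), 1.0f, 0.01f, 1000.0f);
}

// View matrix for one face: offset the head pose to the eye, then yaw it.
// yawDeg = +90 looks left, 0 looks straight ahead, -90 looks right.
glm::mat4 faceView(const glm::mat4& head, float eyeOffsetX, float yawDeg)
{
    glm::mat4 eye = glm::translate(head, glm::vec3(eyeOffsetX, 0.0f, 0.0f));
    eye = glm::rotate(eye, glm::radians(yawDeg), glm::vec3(0.0f, 1.0f, 0.0f));
    return glm::inverse(eye); // camera transform -> view matrix
}

void renderFrame(const glm::mat4& head)
{
    // Left eye: one face yawed 90 degrees left, one looking straight ahead.
    drawScene(faceView(head, -kIpd / 2.0f,  90.0f), faceProjection());
    drawScene(faceView(head, -kIpd / 2.0f,   0.0f), faceProjection());
    // Right eye: one face straight ahead, one yawed 90 degrees right.
    drawScene(faceView(head, +kIpd / 2.0f,   0.0f), faceProjection());
    drawScene(faceView(head, +kIpd / 2.0f, -90.0f), faceProjection());
    // A final pass then samples these four renders to build the distorted
    // 2560 x 800 output image.
}
```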
Road to VR: What demos / software currently work with the InfinitEye and what engines / frameworks do they use?
InfinitEye: For now we have three demos, all running on our custom engine. We plan to build a low-level SDK in the near future and then add support for popular engines on top of it.
Road to VR: Would the SDK for the InfinitEye be an open source offering?
InfinitEye: …we plan to open source it, but only after our crowdfunding campaign, if we make it to that point.
Road to VR: Are you targeting any engines or middleware specifically? How about APIs such as DirectX, OpenGL or AMD's Mantle?
InfinitEye: Ideally we would like to target every API and middleware capable of rendering. That's why we plan to build a low-level SDK that engines and middleware can integrate easily. DirectX and OpenGL are the lowest level of accelerated rasterization, so our SDK will target those, providing the functions to correctly set up the cameras and then perform the final pass, merging the renders and applying the distortions. Mantle is not available yet; we'll consider it when we get more information about it.
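As a thought experiment, a low-level SDK of the kind described might expose little more than camera setup and a composite call. Every name in the sketch below is invented for illustration, not a published InfinitEye API.

```cpp
// Hypothetical sketch of what such a low-level SDK surface could look like.
#include <cstdint>

struct IeyeCameraSetup {
    float viewMatrix[16]; // column-major view matrix for one render
    float projMatrix[16]; // square 90-degree projection
};

extern "C" {
    // Fill in the four per-face camera setups from a tracked head pose.
    void ieyeGetCameraSetups(const float headPose[16],
                             IeyeCameraSetup outSetups[4]);

    // Final pass: merge the four renders (passed as API-level texture
    // handles) into the distorted 2560 x 800 back buffer.
    void ieyeComposite(const uint64_t faceTextures[4]);
}
```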
Road to VR: How simple do you think it might be for engines such as Unity to include support for the InfinitEye? Could plugins be used by developers to get started?
InfinitEye: We’re currently looking into this. We have little knowledge of these engines' internals, but in the end the process of rendering frames is known and well defined. We need to find the right entry points from which to call our future SDK in order to output content for the InfinitEye.
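One plausible route for an engine like Unity is a native plugin: a shared library exporting C functions that the engine calls each frame. A rough, hypothetical sketch, reusing the invented ieyeComposite call from above:

```cpp
// Hypothetical native-plugin entry point; InfinitEyeEndFrame and
// ieyeComposite are invented names, not a real Unity or InfinitEye API.
#include <cstdint>

extern "C" void ieyeComposite(const uint64_t faceTextures[4]); // assumed SDK call

#if defined(_WIN32)
  #define EXPORT_API __declspec(dllexport)
#else
  #define EXPORT_API
#endif

// Called from managed code once the engine has rendered the four faces;
// hands them to the SDK compositor for the final distortion pass.
extern "C" EXPORT_API void InfinitEyeEndFrame(const uint64_t faceTextures[4])
{
    ieyeComposite(faceTextures);
}
```

On the managed side, a Unity script would bind such a function with a [DllImport] declaration.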
Road to VR: What are the overheads like for rendering such a wide Field of View?
InfinitEye: The overhead falls primarily on the graphics card. A wider FOV means more objects are visible at the same time; most of the games we play display only a 60° FOV by default. More objects to display means more vertices to process and more pixels to compute. The performance drop is difficult to predict, but we certainly don't expect a linear decrease tied to the FOV ratio.
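A back-of-the-envelope calculation makes the scaling concrete. Under the toy assumption of objects scattered uniformly around the viewer (which real scenes are not), the candidate-object count grows with the horizontal FOV:

```cpp
// Toy model of the claim above: with objects scattered uniformly around
// the viewer, a wider horizontal FOV takes in proportionally more of them.
#include <cstdio>

// Fraction of uniformly scattered objects inside a horizontal FOV,
// ignoring vertical culling (an assumption for illustration).
double visibleFraction(double fovDeg) { return fovDeg / 360.0; }

int main()
{
    double narrow = visibleFraction(60.0);  // typical game default
    double wide   = visibleFraction(210.0); // InfinitEye
    std::printf("60 deg : %.3f of objects\n", narrow);              // 0.167
    std::printf("210 deg: %.3f of objects\n", wide);                // 0.583
    std::printf("ratio  : %.1fx more candidates\n", wide / narrow); // 3.5x
    // The real frame cost also depends on fill rate, LOD and scene layout,
    // so the slowdown is not simply 3.5x.
}
```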
Road to VR: Do you foresee game design challenges rendering in a 210-degree FOV?
InfinitEye: Not really; there should be no specific challenges related to the 210° FOV. On the contrary, a panoramic field of view offers more possibilities to VR game creators: for example, they could add effects in the peripheral vision to improve the experience. A horror game with ghosts appearing in your peripheral vision could make your heart skip a beat!
Road to VR: How do you intend to generate interest in your product and in particular how do you think you can get developers on board?
InfinitEye: As with any other piece of hardware: with quality content running on it. That's why we have to make a development kit and get it into as many developers' hands as we can. A crowdfunding campaign might be the way to go.
Road to VR: What software correction routines do you have currently and which ones are planned? Are there particular challenges rendering for Fresnel lenses?
InfinitEye: For now we're only addressing geometric distortion: we have tried different distortion algorithms, and in our demos we're currently using an experimental fisheye distortion. It works quite well, but we have plans to improve it further. Chromatic aberration is the other problem we have to address; it's not a showstopper, and we definitely plan to tackle it.
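InfinitEye hasn't published its distortion code, so the following is only a generic sketch of one common fisheye-style remap (the equidistant model); the mapping choice and the kStrength constant are assumptions.

```cpp
// Generic equidistant-fisheye remap, not InfinitEye's actual algorithm.
// For each output pixel we compute where to sample the planar render; in
// practice this runs per pixel in a fragment shader during the final pass.
#include <cmath>

struct Vec2 { float x, y; };

Vec2 fisheyeToSource(Vec2 p) // p: output pixel in centered [-1, 1] coords
{
    float r = std::sqrt(p.x * p.x + p.y * p.y);
    if (r < 1e-6f) return p;              // the center maps to itself
    const float kStrength = 1.0f;         // assumed tuning constant (radians)
    float theta = r * kStrength;          // equidistant: angle grows with radius
    float rSrc = std::tan(theta) / std::tan(kStrength); // back to planar render
    float scale = rSrc / r;
    // Chromatic aberration could later be addressed by using slightly
    // different scales per color channel.
    return { p.x * scale, p.y * scale };
}
```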
Road to VR: Concerns have been raised over the syncing of frames between your two displays, in particular on Windows-based systems. Have you seen any such artifacts? Is there anything in your rendering routine that combats this?
InfinitEye: We have never experienced this issue. From a software point of view, we're only drawing to a single 2560 x 800 frame buffer; display synchronization is up to the graphics driver.
Road to VR: How did you correct the image for rendering: with a distortion shader like in the Oculus SDK, or did you try to model a ray-traced solution taking into account the surface equation and refractive index of the Fresnel lenses?
InfinitEye: We use a distortion shader, as Oculus does. Since our main content source is rasterization (OpenGL, DirectX), this is the way to go. When we have ray-traced content, then yes, we'll perform ray perturbations to account for lens aberrations, but we don't have any at the moment.