Limitless, a company developing content and tools for creating cinematic VR experiences, has joined Lytro to build out tools for combining light-fields and real-time rendering directly in game engines.
As we noted recently, Lytro is positioning its light-field tech as VR's master capture format. With the company already building pipelines for capturing live-action light-fields (Immerge) and for generating entirely synthetic ones (Volume Tracer), the natural next step is to allow easy combinations of the two, as well as with real-time computer graphics.
In taking that next step, Lytro has welcomed the team from Limitless into the company. Limitless was behind narrative VR content like Reaping Rewards, which was built on a toolset designed to make it possible to animate on the fly, directly inside of VR. Now part of Lytro, the Limitless team is helping to build out the company’s game engine toolset, which Lytro says will allow users to seamlessly blend light-fields with real-time rendered content.
With integrations in the works for both Unity and Unreal Engine, Lytro's goal is to make it easy for its customers to leverage the quality advantages of light-fields without giving up the key advantage of real-time rendered content: interactivity.
Light-fields can capture high-quality volumetric video of live-action scenes, or store pre-rendered CGI visuals that go far beyond what can be rendered in real-time. The downside is that, because light-fields are pre-captured or pre-rendered, they can't change in response to input, which means they can't support the level of interactivity that real-time rendered content can, like throwing a ball in an arbitrary direction and having it bounce off the floor, or rendering a character that can react to the user's actions. That is to say: light-fields can work great for some things, but not everything.
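To make that tradeoff concrete, here's a minimal C++ sketch. It's purely illustrative, with hypothetical types that don't come from any Lytro SDK: a pre-rendered clip can only be sampled, while a real-time object carries mutable state that responds to user input every frame.

```cpp
#include <cstdio>
#include <vector>

// Hypothetical stand-in for a pre-rendered light-field clip: playback is a
// pure lookup into baked data, so nothing in it can react to user input.
struct LightFieldClip {
    std::vector<int> bakedFrames; // stands in for captured/rendered radiance data
    int FrameAt(double seconds, double fps) const {
        size_t idx = static_cast<size_t>(seconds * fps) % bakedFrames.size();
        return bakedFrames[idx]; // the only question answerable: "what was recorded?"
    }
};

// A real-time object, by contrast, owns mutable state and integrates physics
// every tick, so it can respond to a throw in any direction the user chooses.
struct Ball {
    double y = 1.5, vy = 0.0; // height and vertical velocity (m, m/s)
    void Throw(double velocity) { vy = velocity; } // driven by user input
    void Tick(double dt) {
        vy -= 9.81 * dt;  // gravity
        y += vy * dt;
        if (y < 0.0) {    // bounce off the floor, losing some energy
            y = 0.0;
            vy = -vy * 0.8;
        }
    }
};

int main() {
    LightFieldClip clip{{10, 20, 30}}; // three baked "frames"
    Ball ball;
    ball.Throw(2.0); // user input changes the simulation...
    for (int step = 0; step < 5; ++step) {
        ball.Tick(1.0 / 60.0);
        // ...but the clip can only ever replay what was captured.
        std::printf("t=%.3fs clip frame=%d ball height=%.3fm\n",
                    step / 60.0, clip.FrameAt(step / 60.0, 60.0), ball.y);
    }
}
```

The limitation is structural: no amount of quality in the baked frames lets the clip answer a question that wasn't captured.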
Lytro wants to eliminate the need to choose between the quality of light-fields and the interactivity of real-time rendering by letting developers use the two interchangeably in a single project. In a recent meeting with Lytro, I got to see this in action—the company pulled their Hallelujah light-field into Unity as a point-cloud, and proceeded to modify the look of the scene using controls directly inside of Unity.
Beyond just playing with the color and lighting of the light-field scene, they showed how real-time elements could interact directly with it by throwing a bunch of beach balls around and adding real-time fog. They also showed simple but practical uses of working with a light-field in a game engine, like easily compositing text directly into the environment, masking out portions of the light-field scene, editing the scene's playback, and combining multiple light-field scenes together.
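Lytro hasn't detailed how its integration blends the two layers, but a standard way to let real-time objects pass convincingly in front of and behind pre-rendered content is per-pixel depth compositing, something a light-field can support because, unlike flat video, it carries depth. The sketch below is a generic, hypothetical illustration of that idea, not Lytro's actual pipeline:

```cpp
#include <cstdio>
#include <vector>

// One pixel of a renderable layer: a color plus the depth it was rendered at.
struct Sample { float depth; unsigned color; };

// Generic depth composite: for each pixel, keep whichever layer is nearer.
// Because a light-field scene carries depth, real-time elements like beach
// balls can correctly occlude, or be occluded by, the captured scene.
std::vector<Sample> Composite(const std::vector<Sample>& lightField,
                              const std::vector<Sample>& realTime) {
    std::vector<Sample> out(lightField.size());
    for (size_t i = 0; i < lightField.size(); ++i)
        out[i] = (realTime[i].depth < lightField[i].depth) ? realTime[i]
                                                           : lightField[i];
    return out;
}

int main() {
    // Two pixels: in the first, a real-time ball sits in front of the scene;
    // in the second, the scene geometry is nearer and occludes the ball.
    std::vector<Sample> scene = {{3.0f, 0xAAAAAA}, {1.0f, 0xAAAAAA}};
    std::vector<Sample> ball  = {{2.0f, 0xFF0000}, {2.0f, 0xFF0000}};
    for (const Sample& s : Composite(scene, ball))
        std::printf("winning color: %06X at depth %.1f\n", s.color, s.depth);
}
```

Masking out portions of a scene, as demonstrated, could be a variation of the same per-pixel decision, discarding light-field samples inside a mask rather than comparing depths.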
While this is certainly a boon for VR storytellers already used to building experiences in game engines, these new integrations seem sure to pique the curiosity of VR game developers too, who could find novel ways to combine the best of light-fields and real-time rendering to create experiences with both ultra-high fidelity and immersive interactivity.
Lytro is still only making its tools available to those collaborating with the company directly, but it encourages creators with interesting project ideas to get in touch.