Lytro, the once consumer-facing light field camera company that has since pivoted to high-end production tools, has announced light field rendering software for VR that aims to free developers from the current limitations of real-time rendering. The company calls it ‘Volume Tracer’.

Light field cameras are often hailed as the technology that will bridge the gap between real-time rendered experiences and 360 video—two VR content types that currently act as bookends on the spectrum from immersive to less-than-immersive virtual media. Professional-level light field cameras, like Lytro’s Immerge prototype, aren’t yet in common use, but light fields don’t have to come from large, expensive physical cameras in the first place.

The company’s newest software-only solution, Volume Tracer, places multiple virtual cameras within a view volume of an existing 3D scene that might otherwise be rendered in real time. Developers of real-time rendered VR experiences constantly fight to hit the 90 fps required for comfortable play, and have to do so within a tight envelope—both Oculus and HTC recommend a GPU on the order of an NVIDIA GTX 1060 or AMD Radeon RX 480—so the appeal of light fields that shift that rendering work offline is clear.
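
Lytro hasn’t published the internals of Volume Tracer, but the basic idea of pre-rendering a scene from many virtual camera positions spread through a view volume can be sketched in a few lines. The snippet below is purely illustrative; the box-shaped volume, grid spacing, and function name are assumptions rather than anything Lytro has described:

```python
import numpy as np

def sample_view_volume(center, size, samples_per_axis):
    """Return a grid of virtual camera positions filling a box-shaped view volume.

    center: (x, y, z) middle of the volume the viewer's head can move within
    size:   (w, h, d) extent of that volume in meters
    samples_per_axis: how densely to place render cameras along each axis
    """
    center = np.asarray(center, dtype=float)
    half = np.asarray(size, dtype=float) / 2.0
    axes = [np.linspace(c - h, c + h, samples_per_axis)
            for c, h in zip(center, half)]
    # Every grid point becomes one offline render pass in the DCC tool / renderer.
    return np.stack(np.meshgrid(*axes, indexing="ij"), axis=-1).reshape(-1, 3)

# e.g. a seated-scale volume about 1 m across, sampled 5 times per axis -> 125 renders
positions = sample_view_volume(center=(0, 1.2, 0), size=(1.0, 1.0, 1.0), samples_per_axis=5)
print(len(positions))  # 125
```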

Lytro breaks it down on their website, saying “each individual pixel in these 2D renders provide sample information for tracing the light rays in the scene, enabling a complete Light Field volume for high fidelity, immersive playback.”
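
Lytro hasn’t detailed how those per-pixel samples are stored or queried, but the quote suggests every rendered pixel becomes a (ray origin, direction, color) record that new viewpoints can look up. A heavily simplified, nearest-neighbor sketch of that lookup follows; the class and the cost weighting are invented for illustration, and a production system would interpolate between many samples and compress them aggressively:

```python
import numpy as np

class RaySamples:
    """Toy light field store: one (origin, direction, rgb) record per rendered pixel."""
    def __init__(self, origins, directions, colors):
        self.origins = np.asarray(origins, dtype=float)        # (N, 3) camera positions
        self.directions = np.asarray(directions, dtype=float)  # (N, 3) unit view directions
        self.colors = np.asarray(colors, dtype=float)          # (N, 3) rgb

    def shade(self, eye, direction, w_dir=4.0):
        """Color seen from a new eye point: pick the stored ray that best matches
        the requested direction and passes closest to the eye (nearest neighbor)."""
        direction = direction / np.linalg.norm(direction)
        angular = 1.0 - self.directions @ direction             # direction mismatch
        to_eye = eye - self.origins
        # perpendicular distance from the eye point to each stored ray
        along = np.einsum("ij,ij->i", to_eye, self.directions)
        perp = np.linalg.norm(to_eye - along[:, None] * self.directions, axis=1)
        best = np.argmin(perp + w_dir * angular)
        return self.colors[best]
```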

According to Lytro, content created with Volume Tracer provides view-dependent illumination including specular highlights, reflections, and refractions, and scales to any volume of space, from seated setups to room-scale. It also presents a compelling case for developers looking to eke out as much visual detail as possible, since it hooks into industry-standard 3D modeling and rendering tools like Maya, 3DS Max, Nuke, Houdini, V-Ray, Arnold, Maxwell, and RenderMan.

Real-time playback with positional tracking is also possible on the Oculus Rift and HTC Vive at up to 90 fps.

One Morning, an animated short directed by former Pixar animator Rodrigo Blaas that tells the story of a brief encounter with a robot bird, was built on the Nimble Collective, rendered in Maxwell, and then brought to life in VR with Lytro Volume Tracer.

“What Lytro is doing with its tech is bringing something you haven’t seen before; it hasn’t been possible. Once you put the headset on and experience it, you don’t want to go back. It’s a new reality,” said Blaas.

Volume Tracer doesn’t seem to be currently available for download, but keep an eye on Lytro’s site and sign up for their newsletter for more info.

  • Mac

    Looks very cool. I’m curious about reflective materials using this technology. Would the specular highlights and reflections dynamically shift with your changing viewpoint?

    • Ian Shook

      Yes – that’s basically one of the benefits of light fields: they allow for exactly this.

    • David Herrington

      Yes actually. Take another look at the video from Lytro with the chicken bot and go to the end of the clip where they zoom in on the bot. Pay close attention to the reflective metal feet and how the reflections change depending on the viewpoint.

  • Bear on the job

    The light field stuff is a great tool for animated content, but I don’t see how you could really use it for games. Since it’s all pre-rendered, you can’t interact with elements in the scene. You could still build a pretty good story-driven game with it though…analogous to the FMV games we used to see in the ’90s, but much better.

    • GigaSora

      You’re right about the dynamic qualities. This is more for Pixar, ILM, and other video creators. More dynamic, less fidelity is kind of how it goes in graphics, unfortunately.

    • Ian Shook

      Technically you could add in other items that match in the 3D space. I think one of their videos shows this – the moon landing one. Some is CG, some is light field photo. The nice thing is that HDRIs can be rendered internally on the fly and that lighting applied to the virtual object.

    • psuedonymous

      You can do the same as was done with compositing 3D real-time characters/objects and pre-rendered background ‘slides’ (the PSX-era Final Fantasy games are the examples most would be familiar with), but because a Lightfield is volumetric you are not locked to a fixed camera angle as you are with a pre-rendered image. If you render your lightfields along with a lightprobe (or dynamically generate a lightprobe from the lightfield if you wanted), you can do precise tonemapping of the rendered objects to match the lightfield ‘background’.
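
      Rough sketch of that “generate a lightprobe from the lightfield” step, with made-up function names; it assumes you already have some way to query the lightfield for a color along an arbitrary ray:

      ```python
      import numpy as np

      def probe_from_lightfield(sample_lightfield, point, resolution=16):
          """Build a small lat-long environment map by querying a light field at one point.

          sample_lightfield: callable (point, direction) -> rgb, however the lightfield is stored
          point:             where the composited real-time object will sit
          """
          env = np.zeros((resolution, 2 * resolution, 3))
          for row in range(resolution):
              theta = np.pi * (row + 0.5) / resolution                # polar angle
              for col in range(2 * resolution):
                  phi = 2.0 * np.pi * (col + 0.5) / (2 * resolution)  # azimuth
                  d = np.array([np.sin(theta) * np.cos(phi),
                                np.cos(theta),
                                np.sin(theta) * np.sin(phi)])
                  env[row, col] = sample_lightfield(point, d)
          # feed this to the real-time renderer as an image-based light / tonemapping reference
          return env
      ```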

    • Lucidfeuer

      Google Seurat has a great way of doing it. But actual real-time light field rendering is going to take a lot of smart research, basically getting Minecraft-like voxels down to individual pixels with an approximate light-volume path-tracing technique to quickly resolve pixels.

  • Angus Dorbie

    The original paper that introduced this concept: http://radsite.lbl.gov/radiance/papers/ewp98.pdf

  • Nate Vander Plas

    This is really cool in theory! I wish their examples were more compelling, though. As a 3D animator myself (for the advertising industry) the scene with the bird and the architecture scene both felt pretty low-fi. Obviously it’s probably different seeing it in VR, but from the screen captures, it didn’t look much better than typical real-time rendering from a VR game. I’d like them to show something that really can’t be done well with game engines, like realistic fur, refractions in curved glass, subsurface scattering, volumetric smoke, etc. I’m also really curious to know how much CPU/GPU power/time it takes to render one of these volumetric frames for a film like this, and how big is the volume?

  • VRgameDevGirl

    This looks really cool. I can’t wait to get my hands on it!

  • Graham J ⭐️

    Neat. Why can’t I watch One Morning on my Vive?

    PS an experience is what you have when you consume media, not the media itself.

  • amanieux

    How much memory is used? Is it small enough to be streamed?

    • Austin Hull

      No. At the moment this is something that would be used by a theater or something similar.
      In high resolutions it was well over a few gigs of data per frame. Just imagine how much data is being stored. That’s the trade-off.
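
      Rough numbers to show where figures like that come from; every value below is a made-up assumption, just to illustrate the scaling:

      ```python
      # Back-of-envelope only; all numbers below are assumptions, not Lytro's specs.
      cameras_per_axis = 16                      # sample positions across the view volume
      views = cameras_per_axis ** 3              # 4096 pre-rendered viewpoints
      width, height = 2048, 2048                 # resolution of each pre-rendered view
      bytes_per_pixel = 3 + 2                    # rgb + a 16-bit depth value per pixel

      # one full set of pre-rendered views = one light field "frame" of an animated piece
      raw_bytes = views * width * height * bytes_per_pixel
      print(raw_bytes / 1e9, "GB uncompressed")  # ~86 GB before any compression
      ```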

    • Surykaty

      Otoy managed to stream this kind of light field content and play it on a mobile phone… it is doable with some very clever tricks.

  • Lucidfeuer

    A bit underwhelming. First in terms of implementation (3DS, Maya, Nuke…and no C4D, Elements3D, Unreal…nobody’s going to use it).

    Also how is this different from ORBX?

  • Ivan

    They seem to have deleted everything on their site…