
Framestore VR on Digital Lightfield Workflows & VR Ad Campaigns

When I was at SIGGRAPH this year, one of the hottest topics was figuring out workflows for dealing with the complexity of capturing, processing, and distributing volumetric digital light fields. Otoy’s Jules Urbach had a lot of insights, and a number of high-profile VR agencies like Framestore VR appear to be secretly working on their own processes for how to best capture and display live-action volumetric footage.

LISTEN TO THE VOICES OF VR PODCAST

Michael Ralla
Johannes Saam

I had a chance to talk with a couple of people from Framestore VR about this: Johannes Saam, a senior software developer, and Michael Ralla, a compositing supervisor. They weren’t ready to provide a lot of specific details just yet beyond the fact that it’s a hot topic, and that they’re possibly working on their own workflow solutions with some insights gathered from deep image compositing.
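Deep image compositing stores multiple depth-sorted samples per pixel rather than a single flattened color, which is what makes it an appealing starting point for merging volumetric elements. Here’s a minimal Python sketch of that core idea; it’s purely illustrative and not Framestore’s or Nuke’s actual implementation:

```python
# Toy illustration of deep compositing: each pixel holds a list of
# (depth, rgba) samples instead of one flattened color, so elements can be
# merged correctly regardless of render order.

from typing import List, Tuple

Sample = Tuple[float, Tuple[float, float, float, float]]  # (depth, premultiplied RGBA)

def merge_deep_pixels(a: List[Sample], b: List[Sample]) -> List[Sample]:
    """Merging two deep pixels is just combining their sample lists;
    depth ordering is resolved later, at flatten time."""
    return sorted(a + b, key=lambda s: s[0])

def flatten(samples: List[Sample]) -> Tuple[float, float, float, float]:
    """Composite samples front-to-back with the 'over' operator to get
    the final flat RGBA value for the pixel."""
    out_r = out_g = out_b = out_a = 0.0
    for _, (r, g, b, alpha) in sorted(samples, key=lambda s: s[0]):
        # 'over': add the new sample attenuated by what is already in front of it
        out_r += (1.0 - out_a) * r
        out_g += (1.0 - out_a) * g
        out_b += (1.0 - out_a) * b
        out_a += (1.0 - out_a) * alpha
    return (out_r, out_g, out_b, out_a)

# Example: a half-transparent red card at depth 2.0 over an opaque blue
# background at depth 10.0.
foreground = [(2.0, (0.5, 0.0, 0.0, 0.5))]   # premultiplied red, alpha 0.5
background = [(10.0, (0.0, 0.0, 1.0, 1.0))]  # opaque blue
print(flatten(merge_deep_pixels(foreground, background)))
# -> (0.5, 0.0, 0.5, 1.0): half red over blue, as expected
```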

Here’s some recent footage from the Lytro Immerge camera that shows how they’re compositing 6DoF volume capture inside Nuke:

Here’s the final VFX build of “Moon” by Lytro, which blends the live-action 6DoF footage shot with Lytro Immerge on top of the composited environment.

It’s unclear whether or not the Lytro light field camera will only need to be used within a green screen environment. There’s a previous marketing video with the disclaimer that “conceptual renderings and simulations are used in the video,” and so it’s unclear whether or not this camera is actually able to capture this type of live-action footage with objects in the near field and still accurately render the background parallax for any occluded portions:

What is known is that there are still a lot of open problems with digital light field capture and workflows, and that Framestore VR is one of the production studios actively investigating them.
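To give a flavor of one of those open problems, here’s a toy sketch (my own illustration, not Lytro’s or Framestore’s pipeline) of why near-field objects are so tricky: reprojecting a single color-plus-depth image to a shifted viewpoint leaves disocclusion holes behind foreground objects, which a true light field capture would need real data to fill.

```python
# Shift each pixel horizontally by its parallax and report which target
# pixels received no data (the disocclusion holes behind near objects).

import numpy as np

def reproject(color, depth, baseline, focal):
    h, w, _ = color.shape
    out = np.zeros_like(color)
    filled = np.zeros((h, w), dtype=bool)
    # Splat far pixels first so nearer pixels overwrite them (painter's order).
    order = np.argsort(-depth, axis=None)
    ys, xs = np.unravel_index(order, depth.shape)
    for y, x in zip(ys, xs):
        shift = int(round(focal * baseline / depth[y, x]))  # parallax in pixels
        nx = x + shift
        if 0 <= nx < w:
            out[y, nx] = color[y, x]
            filled[y, nx] = True
    return out, ~filled  # hole mask marks background the camera never saw

# Tiny synthetic scene: flat background at 10 m with a bright near object at 1 m.
color = np.ones((4, 16, 3)) * 0.2
depth = np.full((4, 16), 10.0)
color[:, 6:9] = 1.0
depth[:, 6:9] = 1.0
_, holes = reproject(color, depth, baseline=0.05, focal=60)
print(holes[0])  # True where the background behind the near object is missing
```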

Johannes and Michael also talked about some of the high-profile ad campaigns that Framestore VR has been a part of, including one for the BMW M2 that plays like a race car shell game, challenging you to keep your eye on the right car as a 360 camera races down a runway. It’s received over 5 million views on YouTube, and it’s a great introductory experience for people new to VR that helps train them that they’re able to look around.

They also worked on an interactive, meditative application called Lumen with Dr. Walter Greenleaf that uses procedural generation of content to grow trees, letting you harvest blossoms to plant new trees and grow a forest around you. It’s part of Time Inc.’s LIFE VR initiative that just launched.
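The episode doesn’t go into how Lumen’s trees are actually generated, but L-systems are one common approach to growing procedural plants; here’s a minimal, purely illustrative sketch of the idea, not a description of Framestore’s implementation:

```python
# A tiny L-system: repeatedly rewriting the description of the tree is what
# makes it visibly "grow" over time.

RULES = {"F": "F[+F]F[-F]F"}  # branch rule: each segment sprouts two side shoots

def grow(axiom: str, generations: int) -> str:
    """Apply the rewrite rules repeatedly; the string gets longer each pass."""
    s = axiom
    for _ in range(generations):
        s = "".join(RULES.get(ch, ch) for ch in s)
    return s

# An interpreter would turn each F into a branch segment and +/-/[ ] into
# turns and branch points; here we just show how fast the structure grows.
for gen in range(3):
    print(gen, len(grow("F", gen)))
```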

Framestore VR also created a Field Trip to Mars as a part of a STEM initiative from Lockheed Martin by replacing all of the windows of a school bus with transparent LCD screens. They created a Mars environment within Unreal Engine, and then matched the real-life bus movements with virtual Mars rover movements to create a collective virtual reality experience for a group of school kids.
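Conceptually, that matching comes down to integrating the bus’s real-world motion into a virtual pose each frame and driving the Mars camera rig with it. Here’s a rough sketch of that idea; the sensor inputs and update rate are my own assumptions, since the actual Unreal Engine integration isn’t public:

```python
# Dead-reckon a 2D pose from wheel speed and turn rate, so the virtual terrain
# scrolls past the "windows" as the real bus drives.

import math
from dataclasses import dataclass

@dataclass
class Pose:
    x: float = 0.0        # metres in the Mars scene
    y: float = 0.0
    heading: float = 0.0  # radians

def update_pose(pose: Pose, speed_mps: float, yaw_rate_rps: float, dt: float) -> Pose:
    """Integrate speed and turn rate into the next pose for the virtual rover."""
    heading = pose.heading + yaw_rate_rps * dt
    return Pose(
        x=pose.x + speed_mps * math.cos(heading) * dt,
        y=pose.y + speed_mps * math.sin(heading) * dt,
        heading=heading,
    )

# Example: the bus cruises at 5 m/s and eases into a gentle right turn;
# the resulting pose would be fed to the Mars camera rig every frame.
pose = Pose()
for frame in range(90):                 # ~1.5 s at 60 fps
    yaw = 0.0 if frame < 30 else 0.1    # rad/s once the turn starts
    pose = update_pose(pose, speed_mps=5.0, yaw_rate_rps=yaw, dt=1 / 60)
print(round(pose.x, 2), round(pose.y, 2), round(pose.heading, 3))
```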

They also produced the Game of Thrones Ascend the Wall VR experience that premiered at SXSW 2014, which was one of the first high-profile advertising campaigns using virtual reality.


Support Voices of VR

Music: Fatality & Summer Trip
