360 video combined with real-time CG and interactive elements was demonstrated on stage at today's Vision VR/AR Summit 2017 keynote. The technique, which sandwiches 3D elements between two layers of video, is a simple way of enhancing standard 360 video footage to make it interactive.
Natalie Grant, Senior Product Marketing Manager of VR/AR/Film at Unity, today showcased an interactive 360 video produced by VFX studio Mirada and built using the Unity 2017 Beta (see the video heading this article). The footage was captured by a 360 camera placed under a gazebo; an animated dinosaur appears in the park outside, and birds animate in the sky when the viewer looks up. Both are real-time CG elements, and both are convincingly occluded by the structural beams of the gazebo.
This is achieved by playing two layers of video simultaneously: the outer sphere plays the original 360 video, while the inner sphere uses a custom 'alpha mask' shader to make everything but the gazebo structure transparent. Animated 3D objects like the dinosaur and birds can then be placed between the two spheres, resulting in an effective, inexpensive illusion of depth.
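For readers who want to experiment, a minimal Unity C# sketch of this two-sphere layout might look like the following. It is an illustration under stated assumptions, not Mirada's actual setup: the shader name "Custom/AlphaMask360", its "_AlphaMask" property, and the mask texture are hypothetical stand-ins for whatever custom shader the studio used.

```csharp
using UnityEngine;
using UnityEngine.Video;

// Sketch of the two-sphere 360 video rig described above.
// The camera sits at the centre; CG objects live between the two spheres.
public class TwoLayerVideoRig : MonoBehaviour
{
    public VideoClip video360;      // the captured 360 footage
    public Texture2D gazeboMask;    // assumed asset: opaque where the gazebo beams are, transparent elsewhere
    public float outerRadius = 50f;
    public float innerRadius = 10f;

    void Start()
    {
        // Outer sphere: plays the original 360 video as an opaque background.
        CreateVideoSphere("OuterSphere", outerRadius,
            new Material(Shader.Find("Unlit/Texture")));

        // Inner sphere: the same video drawn with a hypothetical alpha-mask
        // shader, so only the gazebo structure remains visible and the CG
        // dinosaur and birds placed between the spheres show through.
        var maskMat = new Material(Shader.Find("Custom/AlphaMask360")); // assumed custom shader
        maskMat.SetTexture("_AlphaMask", gazeboMask);                   // assumed shader property
        CreateVideoSphere("InnerSphere", innerRadius, maskMat);
    }

    GameObject CreateVideoSphere(string name, float radius, Material mat)
    {
        var go = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        go.name = name;
        go.transform.SetParent(transform, false);
        // A negative X scale flips the sphere inside out so the video is
        // visible from a camera placed at its centre.
        go.transform.localScale = new Vector3(-radius, radius, radius);

        var sphereRenderer = go.GetComponent<Renderer>();
        sphereRenderer.material = mat;

        var player = go.AddComponent<VideoPlayer>();
        player.clip = video360;
        player.isLooping = true;
        // Write each decoded frame into the sphere material's main texture.
        player.renderMode = VideoRenderMode.MaterialOverride;
        player.targetMaterialRenderer = sphereRenderer;
        return go;
    }
}
```

In practice the inner sphere's material would use a transparent render queue so that, wherever the mask is clear, the CG layer and the outer video behind it remain visible.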
The demo also illustrated how text or markers can be positioned in 3D at places of interest and used to perform a 'gaze-based locomotion' effect, which simply swaps one 360 video with another shot from a different perspective, giving the feeling that the viewer has moved through the scene. Further realism-enhancing techniques were shown: replacing the sun captured in the original footage with a real-time light source, so that the animated birds disappear (due to overexposure) when flying across its brightest spot, and real-time lens flares.
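A gaze-based swap of this kind could be sketched as follows. The component, field names and dwell time are illustrative assumptions, and the forward raycast stands in for whatever gaze detection the demo actually used; the script would be attached to a marker object that has a Collider.

```csharp
using UnityEngine;
using UnityEngine.Video;

// Sketch of gaze-based locomotion: stare at a 3D marker for a moment and the
// 360 clip is swapped for one captured from the destination's perspective,
// which reads to the viewer as moving through the scene.
public class GazeLocomotionHotspot : MonoBehaviour
{
    public VideoPlayer outerSpherePlayer;   // player showing the current 360 clip
    public VideoClip destinationClip;       // 360 footage captured at the destination
    public float dwellSeconds = 1.5f;       // assumed dwell threshold

    float gazeTimer;

    void Update()
    {
        var cam = Camera.main.transform;
        RaycastHit hit;

        // Cast a ray straight ahead along the viewer's gaze direction.
        if (Physics.Raycast(cam.position, cam.forward, out hit) &&
            hit.collider.gameObject == gameObject)
        {
            gazeTimer += Time.deltaTime;
            if (gazeTimer >= dwellSeconds)
            {
                // Swap the footage playing on the outer video sphere.
                outerSpherePlayer.Stop();
                outerSpherePlayer.clip = destinationClip;
                outerSpherePlayer.Play();
                gazeTimer = 0f;
            }
        }
        else
        {
            gazeTimer = 0f;
        }
    }
}
```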
Unity says that compelling interactive content built on standard 360 video should be possible using these simple techniques, and the functionality is already available to try in the Unity 2017 beta.