Google this week introduced a new “High Quality” option for its Jump Assembler, an automated 3D 360 video stitching service for creators shooting with Jump cameras. The new approach brings more accurate stitching and enhanced depth maps, resulting in sharper 3D output.
In a blog post this week, Google software engineer David Gallup detailed the new High Quality stitching option for creators using the Jump Assembler. He explained that the prior method relies on an “optical flow” approach, which matches pixels between pairs of adjacent cameras and then interpolates the view from one camera to the other.
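To make the pairwise approach concrete, here is a minimal toy sketch of optical-flow view interpolation between two cameras. It is purely illustrative and not Google's actual implementation: the scanlines, flow values, and blending scheme are all invented for the example.

```python
# Toy sketch of optical-flow view interpolation between a camera pair.
# (Illustrative only; not the actual Jump Assembler implementation.)

def interpolate_view(left, right, flow, t):
    """Synthesize an in-between view at fraction t (0 = left, 1 = right).

    left, right: 1D scanlines (lists of intensities) from adjacent cameras.
    flow[i]: horizontal displacement mapping pixel i in `left` to `right`.
    Each output pixel blends the two source pixels linked by the flow vector.
    """
    n = len(left)
    out = []
    for i in range(n):
        # Sample directly from the left view at this pixel...
        src_l = left[i]
        # ...and from the right view at the pixel the flow vector points to.
        j = min(n - 1, max(0, int(round(i + flow[i]))))
        out.append((1 - t) * src_l + t * right[j])
    return out

left  = [10, 10, 200, 10, 10]   # a bright object at index 2
right = [10, 10, 10, 200, 10]   # the same object, shifted right by one pixel
flow  = [0, 0, 1, 0, 0]         # flow links the object's pixel across views
mid = interpolate_view(left, right, flow, 0.5)  # halfway view
```

Note the weakness the article alludes to: only the two cameras in the pair constrain the match, so a wrong or ambiguous flow vector at a pixel directly produces a ghosted or mis-stitched pixel in the interpolated view.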
While the old method will still be available, the new High Quality algorithm expands the number of cameras that can contribute to determining the distance to objects. As Gallup puts it, “[…] the new multi-view stereo algorithm computes the depth of each pixel (e.g., the distance to the object at that pixel, a 3D point), and any camera on the rig that sees that 3D point can help to establish its depth, making the matching process more reliable.”
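The general idea Gallup describes can be sketched as a plane sweep: hypothesize a depth for a reference pixel, reproject the resulting 3D point into every other camera, and keep the depth where the cameras photometrically agree. The following is a hedged toy version of that technique with made-up cameras and images, not Google's algorithm.

```python
# Toy plane-sweep sketch of multi-view stereo depth estimation.
# (Illustrates the general technique only; not the Jump Assembler's code.)

def mvs_depth(images, baselines, f, ref_pixel, depth_candidates):
    """Pick the depth whose reprojections agree best across all cameras.

    images: list of dicts mapping pixel coordinate -> intensity (0 if absent);
            images[0] belongs to the reference camera.
    baselines: camera x-positions along the rig.
    f: focal length in pixels. ref_pixel: pixel in the reference image.
    """
    ref_val = images[0].get(ref_pixel, 0)
    best_depth, best_cost = None, float("inf")
    for z in depth_candidates:
        # Back-project the reference pixel to a 3D point at depth z...
        x = ref_pixel * z / f + baselines[0]
        # ...then reproject into every other camera and sum the mismatch.
        cost = 0
        for img, b in zip(images[1:], baselines[1:]):
            u = int(round(f * (x - b) / z))
            cost += abs(img.get(u, 0) - ref_val)
        if cost < best_cost:
            best_depth, best_cost = z, cost
    return best_depth

# Synthetic scene: one bright point at x=0, depth 5, seen by three cameras
# spaced 1 unit apart with focal length 10.
f, baselines = 10, [0, 1, 2]
images = [{0: 255}, {-2: 255}, {-4: 255}]
depth = mvs_depth(images, baselines, f, ref_pixel=0,
                  depth_candidates=[3, 4, 5, 6, 8])  # -> 5
```

Because every camera that sees the point contributes to the cost, a single bad match in one view is outvoted by the others, which is the reliability gain Gallup describes over pairwise matching.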
Side-by-side photos of the output from the original method and the new method show cleaner stitching, especially with thin, isolated objects, which are notoriously difficult to stitch reliably:
The High Quality stitching option not only makes for more accurate alignment of overlapping views but also produces a much cleaner depth map, which improves the 3D effect when viewing the content and makes VFX work easier for creators. Gallup shared a comparison:
In the depth maps you can see not only a significantly sharper outline of the subject, but also much better detection of the rods, which are almost ‘invisible’ to the depth map with the Standard method.
Gallup says that 360 filmmakers using the Jump Assembler can now access the new High Quality option through the Jump Manager program for their next stitch.