The 360 film industry is still young but maturing rapidly. With his name on more than 20 360 film projects to date, Armando Kirwin has had an inside view of where things started, where they’re going, and why.

Guest Article by Armando Kirwin

Armando Kirwin has worked as a Director of Virtual Reality, Post Producer, Executive Producer, and Head of Post Production at companies like Here Be Dragons (formerly known as Vrse.works) and as a freelance filmmaker. He has helped create more than a dozen VR projects to date. These projects have earned one Emmy nomination, the Grand Prix at Cannes, and numerous debuts at Tribeca, SXSW, and Sundance.

There was a time in Hollywood when high-end film cameras were all proprietary and the cost of film stock per movie could be in the hundreds of thousands of dollars. Powerful incumbents enjoyed this tremendous barrier to entry until the transition to digital technology leveled the playing field.

Custom Cameras

When we started making cinematic 360 video a few years ago, there was little pre-existing technology for us to use, so we had to undertake the costly process of building our own.
The camera was the single biggest issue. There simply was no professional VR camera at the time and, for the companies that could afford to create one, it became a huge competitive advantage to do so. I’ve been told that some proprietary VR cameras cost as much as ten million dollars to research, develop, and tool.

But the truth is that — even with millions of dollars at our disposal — we had no business building cameras because things like optics were way beyond our ability to engineer. To this day, no true custom optics have been developed specifically for professional 360 cameras. Nor have any sensors.

And that wasn’t even the hardest part. In addition to engineering limitations, there was another major hurdle: software. VR cameras are typically made up of multiple small cameras arranged in a sphere, and after you’re done shooting you have to blend all of those cameras together into something seamless. This is known as ‘stitching’ and it was done manually by VFX artists.

The Stitching Hurdle

Stitching is an extremely labor-intensive process. Imagine trying to blend 8, 17, or even 24 different cameras so that they all have the same exposure, color balance, etc. for every single frame, at 30 or 60 frames per second, in 3D. In fact, because this was mostly manual labor, it cost between $8,000 and $20,000 per minute of final product (e.g. a five-minute film could cost $40,000 to $100,000 just to stitch). It also took about one week per minute to complete.
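
For a sense of scale, here is a minimal back-of-the-envelope sketch in Python; the per-minute rates and one-week-per-minute turnaround are the figures quoted above, while the helper function and example runtime are purely illustrative:

    # Rough manual-stitching estimate using the rates quoted in the article.
    # The helper and the example runtime are illustrative, not an industry tool.
    LOW_RATE_USD, HIGH_RATE_USD = 8_000, 20_000   # per minute of finished 360 video
    WEEKS_PER_MINUTE = 1                          # approximate manual turnaround

    def manual_stitching_estimate(runtime_minutes: float) -> tuple[float, float, float]:
        """Return (low cost, high cost, turnaround in weeks) for a finished runtime."""
        return (runtime_minutes * LOW_RATE_USD,
                runtime_minutes * HIGH_RATE_USD,
                runtime_minutes * WEEKS_PER_MINUTE)

    print(manual_stitching_estimate(5))   # (40000, 100000, 5): $40k-$100k and ~5 weeks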

Solving the stitching problem through automation required the development of complex software algorithms, and once again no one, including startups with tens of millions of dollars in funding, had the expertise needed to do it right. Unfortunately, some of them pretended that they did, and it ended up hurting the industry because it downplayed the true cost and difficulty of making cinematic 360 film, a trend that continues to this day. The truth is that only large companies like Facebook and Google were in a position to try, and the good news is that they stepped in to do so after about two years.

However, not even these mighty giants were able to fully solve algorithmic stitching (which is today achieved using something called optical flow). I would subjectively say that optical flow solutions are currently reaching a success rate of around 80–90%, with Google’s ‘Jump Assembler’ being the market leader. This makes sense because, generally speaking, algorithms are Google’s speciality.
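
For readers curious about what optical flow stitching actually does, the sketch below is a heavily simplified, hypothetical illustration in Python using OpenCV’s Farneback flow (not the algorithm Jump Assembler uses internally): it estimates per-pixel motion between two overlapping camera views and warps one toward the other before blending, which is what lets automated stitchers avoid hard seams. Real pipelines do this per frame, across many cameras, and also handle occlusion, exposure differences, and stereo.

    # Toy flow-based blend between two overlapping, roughly aligned grayscale views.
    # A real stitcher is far more sophisticated; this only shows the core idea.
    import cv2
    import numpy as np

    def flow_blend(left: np.ndarray, right: np.ndarray, t: float = 0.5) -> np.ndarray:
        """Warp `left` part-way along dense optical flow toward `right`, then cross-fade."""
        # Dense per-pixel displacement from `left` to `right` (Farneback method).
        flow = cv2.calcOpticalFlowFarneback(
            left, right, None,
            pyr_scale=0.5, levels=3, winsize=21,
            iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

        h, w = left.shape[:2]
        grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))

        # Backward-mapping approximation: sample `left` a fraction `t` back along the flow.
        map_x = (grid_x - t * flow[..., 0]).astype(np.float32)
        map_y = (grid_y - t * flow[..., 1]).astype(np.float32)
        warped_left = cv2.remap(left, map_x, map_y, cv2.INTER_LINEAR)

        # Cross-fade the motion-compensated view with the other camera.
        return cv2.addWeighted(warped_left, 1.0 - t, right, t, 0.0)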

A recent project that I did, which at first seems like a simple film noir series starring famous actors, is in fact a technical showcase of what you can achieve using nearly 100% optical flow stitching with minimal manual VFX. And because we had almost 20 minutes of final product, we saved our client, The New York Times, about a quarter million dollars in stitching (we also finished all nine films in only three weeks). This is why it makes sense that VR professionals are willing to spend $15,000 on the GoPro Odyssey camera (which is compatible with the Jump Assembler) or others like it.
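
Continuing the back-of-the-envelope sketch from earlier, roughly 20 minutes of finished footage at manual rates lands squarely in the range that makes a quarter-million-dollar saving plausible:

    low, high, weeks = manual_stitching_estimate(20)
    print(low, high, weeks)   # 160000 400000 20 -> $160k-$400k and ~20 weeks of manual work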

VR camera improvement over time: (L) GoPro Odyssey modified with external top camera, third-party wireless preview, and external battery (not pictured). (R) Yi HALO with fully integrated top camera, preview, and battery.

Off-the-shelf is the New Proprietary

The GoPro Odyssey was only step one. As one of the first and only people in the world to use the best optical flow cameras on the market (including the Facebook Surround 360 and Google’s upcoming Yi HALO), I predict that optical flow-based solutions are one or two generations away from being completely reliable for most forms of production. This means that in ~18 months anyone will have access to the same caliber of technology as the very best proprietary systems held by a handful of companies today.

Even more exciting is that I am only aware of two proprietary systems in the world that are objectively better than the current generation of optical flow cameras, and the quality gap isn’t even very large.

Other proprietary VR technology problems were easier to solve than the camera. When we started, there was actually no way to even show 360 film online once you were done shooting and stitching it. It sounds ludicrous, but it’s true. Dozens of engineers had to be hired to create custom apps. Now you can buy a white-label app solution for a few thousand dollars or upload for free to YouTube and Facebook.

Nor were there any useful 360 spatial sound engines or microphones. Once again, engineers were hired and a lot of money was spent. Today multiple sound formats are supported, and even some proprietary systems that had a flash of success, like Two Big Ears, have been purchased by Facebook and made freely available. I am only aware of two remaining spatial sound engines that are objectively better than the open solutions already on the market.

It’s now clear that early cinematic VR companies were very much like the old Hollywood incumbents. This all seems so logical in hindsight, but we ignored the obvious lesson that the means of production in Hollywood were once proprietary and eventually became completely democratized… and that it was inevitable that 360 film would quickly follow the exact same path.

Prototype Zcam MFT camera. Look for companies like Zcam to release sub-$9,000 cameras this year that compete with multi-million-dollar cameras from two years ago.

As an industry we knew it was happening, but we downplayed this fact in the stories we told investors, customers, and each other. At this point I would consider any company talking about “proprietary” cinematic VR production technology to be a red flag.

– – — – –

There is nothing wrong with this transition. It’s simply how production works. And, in a lot of ways, this is also 360 film reaching parity with VR games: VR game creators essentially all have access to the same tools (i.e. pick Unreal or Unity and get to work). The primary competitive advantage is creativity. And that’s the way it should be.

We should all be very excited about the new creative voices that will start to emerge because of these trends. However, the democratization of 360 film has turned the entire nascent industry upside down, and I’ll talk more about that in the next part of this series.

Disclosure: Armando has worked with Facebook on a contract basis to produce 360 film content. Facebook was also a client of Milk(vr). The New York Times was a client of Vrse.works where Armando worked on a variety of projects in various capacities including Post Producer, Producer, and Executive Producer and Head of Post Production. The New York Times was also a client of Milk(vr) where Armando was Director of VR. Armando had access to a pre-production Yi HALO camera for an unreleased project, but was not compensated by Google. Armando also had access to a prototype Zcam MFT camera, but was not compensated by Zcam.



  • Francesco

    For a moment I thought he had worked on 22,360 film projects to date :D

  • William Volk

    I was at James Cameron’s EarthshipTV in 2000/2001, along with Sherri Cuono. Not only were we doing 360, we were doing interactive 360 with hot-spots and navigation. Even stuff shot on aircraft, boats, etc. The equipment was at least $500k. It was crazy-difficult.

    But that’s not what held it back. There simply wasn’t the widespread availability of HMD’s that we have today. For most people it was more like a QuicktimeVR experience. On a computer, changing point of view by dragging the image with a mouse.

    I’m still surprised that some of the stuff we did, over 16 years ago, is missing, but we’ll get to that soon enough :-)

    William Volk
    Forward Reality

    • Larry Rosenthal

      yeah. I worked with VEON? Can’t remember the name. Hot-spot video. Did it in 360 panoramas around the same time – early 2001-ish. Actually done for the web at 10 times less than what you did. The problem was the ROI/ROE ratio and, more importantly, Flash was ALIVE. :)

      • William Volk

        Motion 360? Just curious.

        The rig we used had 6 cameras recording onto 10k rpm drives.

  • Finally, someone in the industry with credibility. A refreshing and insightful read.
    I agree with Armando’s assessment of the state of the (Video VR) industry today. Not knowing he was involved with the NYTVR film noir piece, it was one of the few that stood out and I’ve said so: http://realvision.ae/blog/2016/12/nytvr-creates-cinematic-history-with-vr-film-noir/

    I’ve told two of the current China-based camera manufacturers that, today, Google’s optical flow algorithm scores over the FB/open-source one they’ve implemented. I’m working behind the scenes (with no compensation) to point out the stereo anomalies their current optical flow implementations are causing in the stereo footage they’ve released.

  • psuedonymous

    The elephant in the room is that ‘360 video’ is a dead technology walking. Any time and money invested is spent on a temporary stopgap between today and the date that lightfield cameras are available. Panoramic cameras (stereo or otherwise, even with optical flow-extracted depth channels) are the Two Colour Process of reality-capture for VR.

  • François Lanctôt

    The Chinese ZCAM is no competitor for the Nokia OZO or the JAUNT One. Their footage is horrible and they don’t have 3D. As for the Facebook Surround, I am still waiting to see some convincing output…

    • >>The Chinese ZCAM is no competitor for the Nokia OZO or the JAUNT One. Their footage is horrible and they don’t have 3D…

      Actually it does. The ZCam V1 Pro. Stereo can even be ‘massaged’ out of the ZCam S1Pro if you know how.
      It’s the Ozo that is masquerading (pun intended) as a video VR camera when it’s not one. Look at the Masquerade ball – 50 Shades of… video in the GearVR and see how immersion-breaking it is, with one of the central characters going from stereo to mono right through the middle of her body. Various other Oculus Studio-funded pieces show this as well in the VR for Good videos, found in the Oculus video app.

      Jaunt is improving steadily, but still has that huge Jaunt crater below, and just a “band” of 3D in the middle.

      • chuan_l

        Image quality from the Ozo is terrible —
        Though nobody who paid for one will say that heh !