Oculus Story Studio, the company’s in-house content creation studio that has produced a number of highly polished experiences, recently released a surprising ‘tell-all’ blog post detailing exactly what tools its team of VR content creators uses.
Story Studio has the funds, and a team of ex-Pixar artists to boot, but the tools that made experiences like Lost and its newest family-friendly short, Henry, a reality might surprise you. “On the whole, we do not have a lot of custom code in our pipeline,” writes Max Planck, technical lead at Story Studio. “This may change in the future, but we are trying to use out-of-the-box tools as much as possible and show the world what anyone can do with commercially available software.”
Planck details everything from hardware specs to the specific software used across the studio’s production pipeline.
Hardware
- Falcon Northwest Talon V desktop with 32 GB of RAM, an Intel i7 processor, and a single NVIDIA GTX 980 graphics card, running Windows 7
- To approximate audio quality on the Oculus Crescent Bay headset, all audio is mixed using Beyerdynamic DT 770 Pro or Audio-Technica ATH-M50x headphones
Software
- Adobe Photoshop CC, Adobe Flash Professional CC, and Adobe Premiere Pro CC for cutting rough storyboards and preliminary audio
- Unity 4 game engine for prototyping 3D storyboards
- Unreal Engine 4 as the real-time engine
- Maya 2016 computer graphics software for architecture, props and character rigs
- Mudbox 3D sculpting and painting tool for set sculpting
- ZBrush digital sculpting tool for character sculpting
- Houdini FX 3D animation software, with its Houdini Engine integration for Maya, for generating complex animated geometry like smoke, fire, and cloth simulations
- Substance Painter and Mudbox for texture painting
- Quixel NDO texture creation tool for normal map painting
- Wwise audio authoring tool and the Oculus Audio SDK to spatialize ambient and actor-driven sound
One of the few custom tools developed by Story Studio is a plugin dubbed ‘Stage Manager’, which controls when each Matinee Actor (an actor driven by UE4’s Matinee cinematic animation tool) plays, and which Actors should be visible, depending on which story beat is cued. Planck writes that as Oculus moves forward with refining ‘Stage Manager’, they “…hope to release it as a UE4 plugin when the evolution feels stable.”
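Story Studio hasn’t published the plugin itself, but the concept is simple enough to sketch. The snippet below is a minimal, hypothetical illustration of a beat-driven controller in UE4 C++, not the actual Stage Manager code: the AStoryStageManager class, FStoryBeat struct, and property names are invented for this example, while AMatineeActor::Play() and AActor::SetActorHiddenInGame() are standard engine calls.

```cpp
// Hypothetical sketch of a beat-driven "stage manager" in UE4 C++.
// AStoryStageManager, FStoryBeat, and the property names are invented;
// only AMatineeActor::Play() and AActor::SetActorHiddenInGame() are
// stock UE4 APIs.
#pragma once

#include "GameFramework/Actor.h"
#include "Matinee/MatineeActor.h"
#include "StoryStageManager.generated.h"

USTRUCT()
struct FStoryBeat
{
    GENERATED_BODY()

    // Matinee sequences to start when this beat is cued.
    UPROPERTY(EditAnywhere)
    TArray<AMatineeActor*> MatineesToPlay;

    // Actors that should be visible during this beat; all other
    // managed actors get hidden.
    UPROPERTY(EditAnywhere)
    TArray<AActor*> VisibleActors;
};

UCLASS()
class AStoryStageManager : public AActor
{
    GENERATED_BODY()

public:
    // Every actor whose visibility the manager is allowed to toggle.
    UPROPERTY(EditAnywhere)
    TArray<AActor*> ManagedActors;

    // Ordered list of story beats authored in the editor.
    UPROPERTY(EditAnywhere)
    TArray<FStoryBeat> Beats;

    // Cue a beat: hide everything not listed for it, then start its Matinees.
    UFUNCTION(BlueprintCallable, Category = "Story")
    void CueBeat(int32 BeatIndex)
    {
        if (!Beats.IsValidIndex(BeatIndex))
        {
            return;
        }
        const FStoryBeat& Beat = Beats[BeatIndex];

        for (AActor* Actor : ManagedActors)
        {
            const bool bVisible = Beat.VisibleActors.Contains(Actor);
            Actor->SetActorHiddenInGame(!bVisible);
        }

        for (AMatineeActor* Matinee : Beat.MatineesToPlay)
        {
            Matinee->Play();
        }
    }
};
```

In a setup like this, the beats and their actor lists would be authored in the editor, and something (a trigger, a timer, or the narrative logic) would call CueBeat as the viewer progresses through the story.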
When all is said and done, the team packages everything into an Unreal Engine 4 executable for later playback, which for now is reserved for special viewings organized by Oculus on the Crescent Bay prototype.
See Also: Oculus Story Studio Promo Video Reveals 5 VR Short Films in the Works
There are, however, still some sore spots in the development pipeline that need refining. Like many developers currently building for VR, the team feels a burning need to create VR experiences within virtual reality itself rather than on traditional monitors.
“We’ve seen over and over again that we can make a scene look great in the windowed editor but when that work is integrated into VR, we find that we would have done things differently if we were authoring while feeling present. We’re looking forward to having more VR authoring tools to speed up iteration and are hoping to push inspired tool builders in this direction.”
Getting intuitive VR creation tools into developers’ hands seems to be the next step in creating VR experiences, as elements like scale and texture change noticeably when moving from a flat screen to VR. If you, or anyone you know, is working on such a program, shooting Story Studio an email might not be a bad idea!