The Feel
Spring 2016, SteamVR + HTC Vive Pre
Denny, our CEO and creative director, was pushing for more experimentation with our artificial locomotion options. We played with different settings, from uni-directional (forward only) to bi-directional (forward and backward) to strafing enabled (side to side). These methods were still causing sickness, but Denny kept working toward vection mitigation until he arrived at the idea of a porthole.
During artificial movement, we would bring a mask around the periphery of the eyes—a portal of sorts that we called a ‘Vection Portal’. With it, you could feel motion as you moved through the level, but without the fast-moving peripheral motion that made it uncomfortable.
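As a rough sketch of the idea, here’s how a speed-driven aperture like that can be computed (the function, values, and thresholds below are illustrative stand-ins, not our shipped implementation):

```python
# A speed-driven comfort vignette, i.e. the 'Vection Portal' idea.
# Everything here is a hypothetical stand-in: in-engine this value
# would drive a shader that masks the periphery of each eye.

def vection_portal_aperture(speed, max_speed, min_aperture=0.4, max_aperture=1.0):
    """Return the vignette aperture (1.0 = fully open) for the current speed.

    The faster the artificial movement, the tighter the porthole closes,
    hiding the fast-moving peripheral motion that causes discomfort.
    """
    t = min(speed / max_speed, 1.0)        # normalize speed to [0, 1]
    return max_aperture - t * (max_aperture - min_aperture)

print(vection_portal_aperture(0.0, 3.0))   # standing still: 1.0, no mask
print(vection_portal_aperture(3.0, 3.0))   # full speed: 0.4, tight porthole
```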
Despite these new comfort options for artificial locomotion, nausea wasn’t totally solved, and Blink was still the right feel for the game. Blink suited The Gallery’s cinematic style, it was comfortable for mass adoption, and it inspired players to move around within their physical space.
In the end, we had to choose one. We were a small team, and testing large VR levels with multiple locomotion methods wasn’t in the cards. We had to fully integrate and support one method, and we decided to stick with Blink.
From there I was able to optimize the game’s performance entirely around Blink. I could instantly bring in specific items when the player Blinked into certain zones, then cull the others. We were able to cut down on processing and increase the fidelity of the game because fewer areas needed to be persistent at all times.
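To illustrate the approach (the zones, asset lists, and load/unload calls here are invented for the example):

```python
# Blink-driven zone streaming: because the player can only arrive via a
# Blink, we know the exact moment they enter a zone, so we can bring in
# that zone's assets and cull everything else.

ZONE_ASSETS = {
    "atrium":  ["atrium_props", "atrium_lighting"],
    "gallery": ["gallery_props", "paintings"],
}

loaded = set()

def on_blink(destination_zone):
    """Called when the player Blinks into a zone: load it, cull the rest."""
    wanted = set(ZONE_ASSETS[destination_zone])
    for asset in loaded - wanted:      # cull assets belonging to other zones
        print(f"unloading {asset}")
    for asset in wanted - loaded:      # bring in this zone's assets
        print(f"loading {asset}")
    loaded.clear()                     # mutate the set in place
    loaded.update(wanted)

on_blink("atrium")    # loads the atrium set
on_blink("gallery")   # culls the atrium, loads the gallery
```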
When the first episode of The Gallery shipped and public adoption of VR finally arrived in April 2016, the reception of Blink was increasingly polarized. Teleportation had become the standard, but the rotational and persistent-bounds features we had added to Blink felt too insular. Valve had been developing their own teleportation in parallel, and player preference leaned toward those minimalist, built-in systems. We had inadvertently reinvented many of the same wheels, and our systems ultimately overlapped.
When the Oculus Touch controllers released in December 2016, we updated Blink to be more in line with the standards of the time. Rather than aiming with your head (literally looking at where you wanted to go) as we originally had it, we set the default to cast from the hand, so you could point to your destination instead. We also added a ballistic trajectory arc that felt more intuitive to our players, albeit less connected to our world.
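The arc itself is just simple projectile math. Here’s a stripped-down sketch, assuming a flat floor and a fixed launch speed, both simplifications:

```python
# Ballistic teleport aim: step a point from the hand along the pointing
# direction under gravity until it reaches the floor. A flat floor at
# y = 0 and the fixed launch speed are assumptions for this sketch.

def teleport_arc(origin, direction, speed=6.0, gravity=-9.81, dt=0.02):
    """Return the arc's sample points and the final floor hit position."""
    x, y, z = origin
    vx, vy, vz = (d * speed for d in direction)
    points = [(x, y, z)]
    while y > 0.0:                     # stop once the arc hits the floor
        vy += gravity * dt             # gravity only affects vertical speed
        x, y, z = x + vx * dt, y + vy * dt, z + vz * dt
        points.append((x, y, z))
    return points, points[-1]

# Pointing slightly upward from a hand held at 1.2 m:
arc, destination = teleport_arc(origin=(0.0, 1.2, 0.0), direction=(0.0, 0.5, 0.87))
print(f"teleport destination: {destination}")
```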
The feeling mattered, but so did the options.
Free Locomotion
Winter 2016, Oculus Rift CV1 + Oculus Touch
Artificial locomotion was taking on a new name: ‘Free Locomotion’. Onward, a tactical FPS built entirely around smooth artificial locomotion, had become a cult hit with VR gamers. Some players felt that Free Locomotion was now the only way they could feel truly immersed. There was an outcry from the community any time a game lacked the option, and we would ultimately add support in the second episode of The Gallery, despite the game being built around our default option, Blink.
Also gaining traction at the time was ArmSwinger. What I immediately liked about the ArmSwinger demo I saw was that it inferred the player’s body direction from the motions of the controllers. You could turn the body with the direction of the arms while still allowing the head to pan, decoupled from that direction. The downside to ArmSwinger, however, was that it still used up a button and didn’t ultimately address vection issues.
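My reading of the core mechanic, reduced to vector math (a guess at the idea, not ArmSwinger’s actual code):

```python
import math

# Sum the two controllers' horizontal swing vectors and normalize,
# giving a travel direction independent of where the head is looking.

def body_direction(left_swing, right_swing):
    """Average the controllers' horizontal (x, z) swing directions."""
    dx = left_swing[0] + right_swing[0]
    dz = left_swing[1] + right_swing[1]
    length = math.hypot(dx, dz)
    if length < 1e-6:
        return (0.0, 0.0)              # no coherent swing, so no movement
    return (dx / length, dz / length)

# Both arms swinging roughly forward with a slight leftward bias:
print(body_direction((-0.2, 1.0), (-0.1, 0.9)))
```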
Valve had let us in on another secret: because of the depth of our hand functionality, The Gallery was used to reveal the first prototype of the SteamVR ‘Knuckles’ controllers at Steam Dev Days 2016. These new controllers were trending away from buttons and the abstraction they produced; hand interactions could be performed by tracking individual fingers rather than with binary button inputs. With this in mind, I decided to augment the ArmSwinger mechanic to be buttonless.
I began researching FFTs, or ‘fast Fourier transforms’. I had studied them before for audio, but I was beginning to see their relevance in kinematics. An FFT can decompose a signal in the time domain, however many waveforms are combined within it, into an ordered table of discrete frequencies and amplitudes. More colloquially, FFTs let me read the body noise in the controllers’ motion and turn it into practical data.
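As a minimal example, numpy can pull the dominant swing frequency out of a short buffer of a controller’s vertical position (the sample rate and signal below are stand-ins for real tracking data):

```python
import numpy as np

SAMPLE_RATE = 90                               # Hz; one sample per tracked frame
t = np.arange(SAMPLE_RATE * 2) / SAMPLE_RATE   # a two-second buffer

# Stand-in controller height: a 2 Hz arm swing plus low-amplitude body noise.
signal = 0.3 * np.sin(2 * np.pi * 2.0 * t) + 0.02 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(signal))         # amplitude per frequency bin
freqs = np.fft.rfftfreq(signal.size, d=1.0 / SAMPLE_RATE)

dominant = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC component
print(f"dominant swing frequency: {dominant:.2f} Hz")
```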
I could then set the exact threshold for when movement should start; two cached or buffered periods of the signal can tell me when a player is running in place based on their arm-swing frequencies. With this method, you could swing one arm and pick an object up with the other while still continuing to move.
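A sketch of that gate, with invented thresholds, tracking each hand independently:

```python
# Invented thresholds: a hand counts as 'running' once its dominant frequency
# sits in the running band for two consecutive buffered periods, and either
# hand is enough, so one arm can hold an object while the other keeps you moving.

RUN_BAND = (1.0, 4.0)      # Hz range that reads as running in place
MIN_PERIODS = 2            # consecutive qualifying windows before moving

class SwingGate:
    def __init__(self):
        self.streak = 0

    def update(self, dominant_hz):
        """Feed one analysis window's dominant frequency; True means move."""
        if RUN_BAND[0] <= dominant_hz <= RUN_BAND[1]:
            self.streak += 1
        else:
            self.streak = 0
        return self.streak >= MIN_PERIODS

left, right = SwingGate(), SwingGate()
for hz_left, hz_right in [(2.1, 0.2), (2.0, 0.1), (1.9, 0.0)]:
    moving = left.update(hz_left) | right.update(hz_right)  # evaluate both hands
    print(moving)   # False, then True, True: one swinging arm is enough
```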
With body pucks (such as the Vive Tracker) not yet available to the public, full-body persistence would require us to infer the root position of the player. Once there, however, you could use height and leg data to start tailoring a leg-strength coefficient and calculating the specific physics and kinematics of any single person. Players could run at the same frequency beside each other, yet still have different travel distances.
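Sketched out, with assumed placeholder values for the ratio and coefficient:

```python
# The 0.45 leg-to-height ratio and the strength coefficient are assumptions
# for illustration; the real tuning would come from per-player calibration.

def travel_speed(swing_hz, player_height_m, strength_coefficient=1.3):
    """Meters per second for a given arm-swing frequency and player height."""
    leg_length = 0.45 * player_height_m         # rough anthropometric ratio
    stride = leg_length * strength_coefficient  # longer legs, longer strides
    return swing_hz * stride

# Two players running side by side at the same frequency travel differently:
print(travel_speed(2.0, 1.85))   # taller player covers more ground
print(travel_speed(2.0, 1.60))
```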
In practice, games like Sprint Vector have shown that ArmSwinger-like locomotion, paired with peripheral VFX, can help mitigate vection issues. The physical bobbing while you run is enough to ease you into motion but still keep you grounded; there’s no lurching, and there’s limited vestibular disconnect. It’s a great option for free locomotion without sacrificing comfort. Plus, the cardio, you know?