At GDC 2016 we got to take a look at the latest STEM VR controllers from Sixense. With units purportedly on track to begin shipping next month, the company is also committing resources to a promising cross-platform multiplayer experience called Siege VR, which could turn into something much bigger than a mere demo.
Throughout STEM’s unfortunately troubled development timeline, one thing has surprised us: Sixense has always had highly functional and fairly well-polished demo experiences for their motion controllers. That probably comes with the territory; after all, the company designed the tech behind the Razer Hydra, which hit the market years before motion input controllers like Oculus Touch and the Vive controllers were even announced. They also created the Portal 2: In Motion DLC, which brought 20 new levels specifically built for motion controllers to the game.
So I suppose it shouldn’t be surprising after all that the many little tech demos they’ve made over the years to show off motion input mechanics have felt ahead of their time (see this one from 2014).
With that in mind, it was great news to hear at GDC 2016 last week that the company not only plans to finally ship the first STEM units to backers of their 2013 Kickstarter campaign, but they’re also developing a new game called Siege VR that’s going to have cross-platform support among headsets and motion controllers.
I Don’t Care What Platform It’s on, I Just Want to Play Siege VR
Unlike many of the short STEM experiences we’ve seen from Sixense over the years, Siege VR is more than a demo. What we saw at GDC was a prototype of what the company says will become a full game, which will be free to all backers of the STEM Kickstarter and will also be made available for the Oculus Rift, HTC Vive, and PlayStation VR (whether that’s using each platform’s own VR controllers or STEM).
Siege VR is a first-person multiplayer castle defense game which (in prototype form) had me and an ally wielding bows side-by-side, trying to stop hordes of enemies from reaching the castle gates. In addition to shooting regular arrows, there are two special arrow types: an explosive arrow (from the quiver seen on the left wall), which explodes on impact once ignited by a nearby torch; and an artillery arrow, which fires a smoke round that designates a target for your friendly catapult to fire upon. The former is great for taking out dangerous groups (including enemy archers who will aim for you directly), while the latter works well against enemy catapults. The special arrow types regenerate over time, but you’ll want to use them sparingly, especially as the artillery arrow stockpile is shared between both players.
The game is still very early, but the creators say they’re considering having many more than two players all working together to defend the castle. Perhaps—they conjectured—there would be teammates at more forward towers who could aim back at the castle to prevent enemies from scaling the walls, while the players on the wall itself would be focused on enemies in the field. Maybe—it was suggested—it could be a multi-stage experience where, if the enemies break through the main gate, you and your team fall back to using melee weapons.
Some earlier prototypes included the ability to pour buckets of boiling oil onto would-be castle crashers, though that and some other features were cut for the time being to keep the GDC demo a bit simpler and more polished.
With the lot of us excitedly chattering about ‘what about [insert super cool gameplay]? Or how about [more super cool gameplay]?’ it was clear that Siege VR could have legs well beyond a simple demo, and that’s where Sixense says they plan to take it.
Forgetting It’s There is a Good Thing
As I played Siege VR using STEM with a Rift DK2, I got that wonderful feeling of forgetting about the technology and simply having fun playing the game. That means everything was working together to make a fun and intuitive experience which kept me immersed. When I came out, the top of my mind was filled not with questions about STEM, but with questions about the scope and potential of Siege VR.
STEM itself was integral to getting us to the stage of talking not about limitations, but about possibilities for Siege VR. I’ve used the system at many points along its oft-delayed development, and while it’s always felt good, this time around it felt better than at any point in the past, even after using Touch and Vive controllers all week throughout the rest of GDC.
For me, the thing that moved the needle most significantly was the headtracking performance. STEM has additional tracking modules which can be affixed to the head and feet (or elsewhere, up to 10 tracked points). For their demos, Sixense often eschews the Rift’s own headtracking in favor of a STEM tracking module. Having used the Rift plenty, I’ve always felt there was something a little ‘off’ about the STEM-based headtracking, though whether it was latency or positional accuracy, I’m not quite sure. But this time around I actually had to ask whether they were using the Rift’s tracking camera or STEM: it was 100% STEM.
I point to headtracking because it’s easier to tell when something isn’t right with your head than with your hands; slight drift or inaccuracy of a few millimeters on your hands can be very hard to spot. When the placement of your virtual eyes depends entirely on the tracking though, it’s easy to feel when things aren’t working right. So if the headtracking was solid, the rest of STEM’s tracking should be solid too (as there’s no difference between tracking a module on your head and one on your foot).
It was especially impressive that the headtracking felt that good in an electromagnetically dense setting, like the middle of the GDC expo floor, which can interfere with magnetically-based tracking. In fact, Sixense’s booth had a number of STEM basestations scattered about; there was one just a few feet away from the one I was using, and I didn’t spot any interference-based tracking issues despite the competing magnetic fields.
Sixense isn’t trying to quarantine itself from the competition either. They had both the Oculus Rift and HTC Vive (with Vive controllers) in action at their booth, and the company says their SixenseVR SDK will allow developers to create games that are ready for any motion controllers, not just STEM. The SDK allows for “code-free setup” for humanoid characters in Unity and Unreal Engine, and provides a full body skeletal pose based on sensor data.
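To make the cross-controller idea concrete, here’s a minimal, hypothetical sketch of what a controller-agnostic tracking layer might look like. The type and function names below are illustrative assumptions, not the actual SixenseVR SDK API; the point is simply that game code can query joint poses without caring whether STEM, Touch, or Vive controllers are supplying the data.

```cpp
// Hypothetical sketch only -- these names are illustrative and are not
// the actual SixenseVR SDK API.
#include <array>
#include <cstddef>

// A tracked pose: position in meters plus an orientation quaternion.
struct JointPose {
    float px = 0, py = 0, pz = 0;
    float qx = 0, qy = 0, qz = 0, qw = 1;
};

// Body points a full-body solver might report from head/hand/foot sensors.
enum class Joint { Head, LeftHand, RightHand, LeftFoot, RightFoot, Count };

// Controller-agnostic tracking source: the same interface could be backed
// by STEM modules, Oculus Touch, or Vive controllers.
class ITrackingSource {
public:
    virtual ~ITrackingSource() = default;
    // Returns false if this joint has no sensor and no solver estimate.
    virtual bool GetJointPose(Joint joint, JointPose& outPose) const = 0;
};

// Game code queries the abstraction once per frame and never needs to know
// which hardware is attached -- the cross-platform idea described above.
void UpdateAvatar(const ITrackingSource& tracking) {
    std::array<JointPose, static_cast<std::size_t>(Joint::Count)> skeleton{};
    for (std::size_t i = 0; i < skeleton.size(); ++i) {
        tracking.GetJointPose(static_cast<Joint>(i), skeleton[i]);
    }
    // ...drive the humanoid rig from `skeleton` here...
}
```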