You never turn down an opportunity to play with millions of dollars’ worth of VR equipment. So when Eric Hodgson, 3D Visualization Director at Miami University in Oxford, OH, invited me to come play with his toys in his Virtual Reality Lab, I couldn’t resist.
Eric Hodgson’s primary research relates to human spatial ability, including spatial memory, perception, reference frames, landmark use, and navigation. Because this work often includes the construction and use of virtual environments for people to navigate within, he also conducts applied research related to immersive virtual reality and motion tracking. Most recently, this work has involved developing Redirected Walking techniques to expand the navigable area available for simulations.
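For the curious, here is a minimal sketch of the core idea behind redirected walking: the virtual scene is imperceptibly rotated as the user turns and walks, so a straight virtual path maps onto a curved physical one that stays inside the tracked space. The gain values and function name below are hypothetical placeholders, not the lab’s actual implementation.

```python
# Toy redirected-walking step (assumed gains, not the lab's real code).
# Each frame we compute a small extra rotation of the virtual world around the
# user; kept below perceptual thresholds, it steers the physical path back
# toward the centre of the room while the virtual path stays straight.
ROTATION_GAIN = 1.2    # amplify real head turns by 20%
CURVATURE_GAIN = 0.15  # radians of injected rotation per metre walked

def redirect(delta_heading, delta_distance):
    """Return the extra world rotation (radians) to apply this frame.

    delta_heading  -- how far the user physically turned since last frame (radians)
    delta_distance -- how far the user physically walked since last frame (metres)
    """
    turn_component = (ROTATION_GAIN - 1.0) * delta_heading
    walk_component = CURVATURE_GAIN * delta_distance
    return turn_component + walk_component
```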
We started the morning off by trying out their HIVE setup. The HIVE (Huge Immersive Virtual Environment) allows the user to walk around a large open space with an HMD powered by a backpack rig and laptop. With a dozen cameras mounted to the walls of a basketball court, and a dedicated computer to manage the positional tracking, I wandered around a virtual grocery store and then chased some rather large chickens.
The HIVE setup is powered by the WorldViz Vizard 3D software toolkit, the WorldViz PPT H12 optical/inertial hybrid wide-area tracking system, and the NVIS nVisor SX head-mounted display. This HMD provides about a 111-degree FOV, just a bit more than the Oculus Rift DK1; honestly, I didn’t notice much of an advantage. The demos I tried were created using the open-source game engine Panda3D. We talked a bit about Unity, and it appears that they could switch to that engine down the road if they decided to.
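As a rough illustration of how a demo like this might be wired up, here is a minimal Panda3D sketch that loads a scene and slaves the camera to an external tracker each frame. The `read_tracker_pose` function is a stand-in for whatever the tracking system actually reports; I don’t know the lab’s real integration code.

```python
from direct.showbase.ShowBase import ShowBase
from direct.task import Task

def read_tracker_pose():
    # Placeholder for the wide-area tracker: in the real rig this would return
    # the head position (metres) and orientation (degrees) each frame.
    # Hard-coded here so the sketch runs standalone.
    return (0.0, 0.0, 1.7), (0.0, 0.0, 0.0)

class HiveDemo(ShowBase):
    def __init__(self):
        ShowBase.__init__(self)
        # Load a sample environment shipped with Panda3D and attach it to the scene graph.
        scene = self.loader.loadModel("models/environment")
        scene.reparentTo(self.render)
        # Disable the default mouse-driven camera and drive it from the tracker instead.
        self.disableMouse()
        self.taskMgr.add(self.update_camera, "update_camera")

    def update_camera(self, task):
        (x, y, z), (h, p, r) = read_tracker_pose()
        self.camera.setPos(x, y, z)
        self.camera.setHpr(h, p, r)
        return Task.cont

HiveDemo().run()
```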
The NVIS VR headset has two 1280×1024 LCOS displays, which are driven by the VGA and DVI ports on the laptop. The laptop runs an NVIDIA mobile Quadro GPU, so you can imagine the rendering power being pushed out. At a 60Hz refresh rate and a fairly low framerate, I can honestly say that the experience was less than perfect, but the mobility still made it very compelling. In case you were thinking about purchasing a setup like this, I discovered that the price tag runs to six digits!
Once we left the HIVE setup, we went to the VR Lab to play. I noticed a body suit lying in the corner and couldn’t help but laugh a bit. As a rather husky fellow, I really hope that the concept of a one-piece body suit doesn’t catch on (sorry, Star Trek fans!). This was the Xsens tracking system, which uses Bluetooth to stream inertial motion data from sensors worn across the body. We decided to skip this demo since none of us wanted to try to squeeze into it. Apparently, the last person to wear it was a figure skater who was tracking her movement while performing her routine. Because the system only tracks movement and not absolute position, any time she was just gliding across the ice, it appeared that she had stopped completely.
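The skater anecdote makes more sense with a little dead-reckoning intuition: an inertial suit has no external reference, so forward travel has to be inferred from body motion such as footsteps. Below is a toy sketch of that idea (an assumed step-counting scheme, not the Xsens algorithm): a smooth glide produces no step events, so the estimated position stops advancing even though the skater is still moving.

```python
# Toy pedestrian dead reckoning: advance the position estimate only when a
# step is detected from acceleration spikes. Constants are assumptions.
STEP_LENGTH = 0.7       # metres credited per detected step
ACCEL_THRESHOLD = 1.5   # m/s^2 of vertical bounce that counts as a step

def estimate_travel(vertical_accel_samples):
    """Estimate distance travelled from a stream of vertical acceleration samples."""
    distance = 0.0
    for accel in vertical_accel_samples:
        if abs(accel) > ACCEL_THRESHOLD:  # impact spike => a step happened
            distance += STEP_LENGTH
        # while gliding, acceleration stays near zero, so nothing is added
    return distance

# Walking produces periodic spikes, so distance accumulates...
print(estimate_travel([2.0, 0.1, 2.2, 0.2, 1.9]))   # ~2.1 m
# ...while a smooth glide reads as standing still.
print(estimate_travel([0.05, 0.02, 0.04, 0.03]))    # 0.0 m
```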
We moved on to the zSpace system: a 3D monitor and a tracked pen that let the user manipulate objects in 3D space. You wear polarized glasses that are also tracked by the system, so as your head moves, the image moves with it. With this level of tracking and the pen input, I can see how this technology could have amazing applications in the medical, educational, and industrial fields. I spent quite a bit of time dissecting a model of the inner ear, as well as playing a 3D Asteroids game. It took a few rounds to get used to, but it quickly became addictive.
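That “image follows your head” effect comes from head-coupled perspective: each frame, the view frustum is skewed off-axis based on where your eyes sit relative to the screen. Here is a minimal sketch of the math, assuming a screen centred at the origin in its own plane and head coordinates measured in the same units; this shows the general technique, not zSpace’s actual SDK.

```python
def off_axis_frustum(head_x, head_y, head_z, screen_w, screen_h, near, far):
    """Frustum bounds for a screen centred at the origin, viewed from (head_x, head_y, head_z).

    head_z is the eye's distance from the screen plane. The result can feed a
    glFrustum-style projection, so the rendered image shifts as the head moves.
    """
    scale = near / head_z
    left   = (-screen_w / 2.0 - head_x) * scale
    right  = ( screen_w / 2.0 - head_x) * scale
    bottom = (-screen_h / 2.0 - head_y) * scale
    top    = ( screen_h / 2.0 - head_y) * scale
    return left, right, bottom, top, near, far

# Centred head vs. head shifted 10 cm right of a 0.52 m x 0.32 m display:
print(off_axis_frustum(0.0, 0.0, 0.6, 0.52, 0.32, 0.1, 10.0))
print(off_axis_frustum(0.1, 0.0, 0.6, 0.52, 0.32, 0.1, 10.0))
```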
Then there was the VR Lab’s CAVE. CAVE is actually a recursive acronym (CAVE Automatic Virtual Environment): an immersive virtual reality environment in which projectors are directed at three to six walls of a room-sized cube (our friend Oliver Kreylos gives a good overview of how a CAVE works). The name is also a reference to the allegory of the cave in Plato’s Republic, in which a philosopher contemplates perception, reality, and illusion.
I was amazed at how much space this setup took up. Huge blacked-out boxes housed the projectors, and each wall was about eight feet square. A rack of NVIDIA Quadro-powered workstations drives the CAVE, connected to it by a twisted mess of wires. I was told that this system is a few years old, and that newer, smaller options are now available. With a million-dollar price tag, I was ready for this system to wow me.
…and it did. The first demo I tried put me in a Romanesque room with a huge model of a jet engine floating in front of me. Used for educational and industrial-design purposes, this demo let me tear apart and fully inspect the engine, and even spin up the turbine with my controller. I immediately saw an advantage to this system: I had full presence in the scene. I could see my entire body while still being surrounded by a virtual world.
The second demo let me fly around a very accurate reconstruction of a village in Dubai. Again, the CAVE gave me a stronger feeling of presence than I am used to with an HMD, which blocks my view of my own body.
I am very excited to share these experiences with other VR enthusiasts. Some of this technology will be available to try at the Cincinnati/Dayton Virtual Reality Meetup on February 28, 2014, from 6-9PM at Miami University. Be sure to check out the Meetup.com page for more details.
Come for the Rift, and stay for the CAVE.