Samsung revealed their Gear VR headset this morning in their IFA Unpacked press presentation in Berlin. If you weren’t able to catch the broadcast, the 10-minute Gear VR segment, with an appearance by Oculus CTO John Carmack, is right here.
Gear VR, which is ‘Powered by Oculus’, is a “deep collaboration” between Samsung and Oculus VR, according to the companies.
John Carmack is a legendary programmer, well known for his pioneering work in the early days of 3D graphics and networked multiplayer gaming. Carmack joined Oculus as CTO about a year ago and has been spearheading the company’s efforts on mobile VR.
Carmack took to the Samsung stage to discuss some of the important work that Oculus has done to achieve a high quality VR experience on the Galaxy Note 4 using Samsung’s Gear VR headset. His segment is transcribed by Road to VR here:
“I’m really excited to finally be able to talk about this, it’s been a secret project for too long. So I’ve dedicated the last year of my life to making the best mobile VR system possible and I’m really proud of what we’ve been able to accomplish so far.
So completely mobile VR is a magical thing: you can pick it up and take it with you, you can put it on and then turn completely 360 degrees in the virtual world. But it’s a hard problem to do well and we need to bring a lot of different technologies to bear on this.
So one of the biggest ones is the Samsung super-AMOLED display. Unlike an LCD panel, which can take 10 or 20 milliseconds to change pixel state, the super-AMOLEDs can turn on and off almost instantly. This lets us light up a pixel and then turn it off a fraction of a frame later, which lets us kill motion blur. Now these screens are such high resolution that there was a lot of skepticism about what level of graphics you would be able to render at the consistent rates necessary for virtual reality, but Samsung’s gotten us very low-level access to the hardware platform here and this has enabled us to develop an innovative software architecture that can continue to maintain smooth, accurate updates from the head tracking even when the system is actually overloaded.
So the critical path in virtual reality is the time that it takes from some motion of your head until the time that updated light from the display hits your eyes. This is the motion-to-photons latency. It starts with the sensor inside the head mounted display—now [the sensor is] similar to what’s inside [current smartphones] but it updates at five times the rate and it’s calibrated to a much higher standard of accuracy.
Now unlike a typical touch event that winds through all sorts of different layers in the system, we have a custom kernel driver that our sensor talks to and that then talks to our apps, cutting out a lot of middle-men along the way. And our applications run with real-time scheduling priority. This means that if your email client decides to try to make an update while you’re playing, it can’t preempt the virtual reality threads, which is very important.
We also have guaranteed clock rates, which is unusual in the mobile world, where clock rates usually fluctuate constantly to optimize for different things. But now developers can choose a specific rate and then optimize for it.
Our graphics are being drawn with multiple context-prioritized GPU threads. This means that the main world can be drawn at variable rates while a second, higher-priority thread is updating the screen in very quick response to the head tracking inputs. This is very, very important for our critical… in closing that loop of… how you move to what you see. Now the drawing is done to a completely unbuffered window, there’s no page-flipping going on here; we’re racing the raster, drawing just ahead of the scan that’s pulling it out to the super-AMOLED display. Then the display turns on and then turns off again quickly to kill the motion blur; the photons jump off the screen, bend through the lenses, make their way to your eye and hopefully you say ‘wow, this is really impressive’.
Now this is just the beginning. We have active technology work going on in all the different aspects here and we’re expecting to make continuous improvements as we go forward but this is really a landmark first step.”