Community Q & A
When can we expect a public release of VR SLI capable drivers?
Within the next few months. (However, drivers by themselves won’t automatically enable VR SLI—it will require integration into each game engine, so support for it will be up to individual developers.)
Can you confirm which GPUs will receive which GameWorks VR features, i.e. is it restricted to the 9xx series?
Almost all of the features work all the way back to the 6xx series! The only one that requires the 9xx series is multi-resolution shading, because that depends on the multi-projection hardware that only exists in Maxwell, our latest GPU architecture. For the professional users out there, GameWorks VR is also fully supported on our corresponding Quadro cards. Of course, VR rendering is pretty demanding, so we recommend a GeForce GTX 970 or better for folks looking to get their PCs ready for VR.
Can you comment on the recently uncovered patent applications RE: an NVIDIA VR headset? (link)
NVIDIA regularly files patents across a wide range of graphics and display-related fields.
Can you explain the differences between GameWorks VR and VR Direct – will one replace the other?
GameWorks VR is the new name for what we previously called VR Direct. As we’ve grown the feature set of our VR SDK, it made sense to roll the capabilities into our overall GameWorks initiative.
A lot of people are banking on SLI working in VR as a solution to the extremely high performance requirements. What are the chances of you getting one-GPU-per-eye latency down to the same levels as single card setups?
We already have! In our VR SLI demo app we can see one-GPU-per-eye working without increasing latency at all relative to a single-GPU system.
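To give a sense of why this works, here is a rough sketch of what a one-GPU-per-eye frame can look like once VR SLI is integrated into an engine. The helper names (SetGpuRenderMask, UploadEyeConstants, CopyRightEyeToDisplayGpu and so on) are hypothetical placeholders rather than the actual GameWorks VR / NVAPI entry points; the point is the structure: broadcast the work once, vary only the per-eye constants, then transfer the second eye's image and present.

```cpp
// Illustrative sketch only: these helpers are hypothetical stand-ins for the
// real GameWorks VR / NVAPI calls an engine would integrate.
#include <cstdint>

enum GpuMask : uint32_t { GPU_LEFT = 0x1, GPU_RIGHT = 0x2, GPU_BOTH = 0x3 };
enum class Eye { Left, Right };

struct Scene;      // engine-side scene representation (assumed to exist)
struct Swapchain;  // engine-side swapchain wrapper (assumed to exist)

// Hypothetical GPU-mask and transfer helpers, declared but not implemented here.
void SetGpuRenderMask(uint32_t mask);
void UploadEyeConstants(const Scene& scene, Eye eye);  // per-eye view/projection
void RenderShadowMaps(const Scene& scene);
void RenderScene(const Scene& scene);
void CopyRightEyeToDisplayGpu(Swapchain& swapchain);
void Present(Swapchain& swapchain);

void RenderStereoFrame(const Scene& scene, Swapchain& swapchain)
{
    // View-independent work (e.g. shadow maps) is submitted once and runs on
    // both GPUs in parallel.
    SetGpuRenderMask(GPU_BOTH);
    RenderShadowMaps(scene);

    // Each GPU gets its own eye's view/projection constants...
    SetGpuRenderMask(GPU_LEFT);
    UploadEyeConstants(scene, Eye::Left);
    SetGpuRenderMask(GPU_RIGHT);
    UploadEyeConstants(scene, Eye::Right);

    // ...and then the scene draw calls are broadcast once, so one GPU renders
    // the left eye while the other renders the right eye in parallel.
    SetGpuRenderMask(GPU_BOTH);
    RenderScene(scene);

    // Finally, the right-eye image is copied to the GPU driving the display
    // and both eyes are presented in the same frame. No extra frame of
    // buffering is introduced, which is why latency matches a single GPU.
    CopyRightEyeToDisplayGpu(swapchain);
    Present(swapchain);
}
```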
Any plans for GameWorks VR features for Linux operating systems?
Eventually yes, but that’s further down the road.
Regarding Foveated Rendering and VR SLI: do you have plans to enable SLI with two or more graphics cards or GPUs from different performance sectors? Like an enthusiast GM200/GP100 for the center area and a lower specced GM206/GPxxx for the peripheral vision? Any plans for a special VR dual GPU card that works like this?
SLI requires all the GPUs to be the exact same model, so no, this isn’t something we can do currently. Heterogeneous multi-GPU (i.e. different classes of GPU working together) is potentially interesting as a research project, but not something we are actively looking at right now.
What are your long term plans to increase GPU utilization without increasing latency?
Not sure I understand this question exactly, but we’re constantly making driver improvements so that we can keep the GPU fed with work and not have to wait for the CPU unnecessarily, and we’re improving multitasking and preemption support with each new GPU architecture as well.
Any plans for LOD based rendering being stereoscopic close and monoscopic after a certain distance?
This is up to the individual game developer to implement if it makes sense for them.
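For readers wondering what such a scheme might look like, here is a minimal sketch of one way a developer could split a scene into stereo and mono passes by distance. The types, the helper function, and the 50-metre threshold are illustrative assumptions, not part of any NVIDIA SDK; the underlying idea is that beyond some distance the two eyes' projections of an object differ by well under a pixel, so drawing it once and sharing the result between eyes is a reasonable approximation.

```cpp
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

struct Renderable {
    Vec3 position;
    // ...mesh, material, etc.
};

// Beyond this distance the per-eye views of an object are nearly identical
// for typical IPD and FOV. The exact threshold is content-dependent; this
// value is only a placeholder.
constexpr float kMonoDistance = 50.0f;  // metres, hypothetical

float Distance(const Vec3& a, const Vec3& b)
{
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Split the scene into objects drawn once per eye and objects drawn a single
// time from a centre camera and shared between both eyes.
void PartitionForStereo(const std::vector<Renderable>& scene,
                        const Vec3& headPosition,
                        std::vector<const Renderable*>& stereoList,
                        std::vector<const Renderable*>& monoList)
{
    for (const Renderable& obj : scene) {
        if (Distance(obj.position, headPosition) < kMonoDistance)
            stereoList.push_back(&obj);
        else
            monoList.push_back(&obj);
    }
}
```

The mono list would then be rendered once and composited into both eye buffers before the stereo list is drawn on top, with some care taken around the threshold to avoid visible popping as objects switch modes.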
I see you have added monitor detection for headsets. Do you intend to extend this to a plug-and-play VR setup, so that when you plug the headset in, its resolution and capabilities can be passed down like plug and play?
As mentioned, we’re implementing Direct Mode so that headsets are recognized and treated as such instead of as a desktop monitor, but beyond that it’s up to the runtime/drivers provided by the headset maker to communicate with applications about the headset’s capabilities.
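As an illustration of that division of labour: Direct Mode in the driver is what keeps the headset from appearing as an ordinary desktop monitor, while capability queries go through the headset maker's runtime. The snippet below shows an application asking a vendor runtime for the headset's recommended render resolution, using the public OpenVR API purely as an example of such a runtime (it is not an NVIDIA API).

```cpp
// Querying headset capabilities from the vendor-provided runtime (OpenVR is
// used here purely as an example of such a runtime). Error handling trimmed.
#include <openvr.h>
#include <cstdint>
#include <cstdio>

int main()
{
    vr::EVRInitError initError = vr::VRInitError_None;
    vr::IVRSystem* hmd = vr::VR_Init(&initError, vr::VRApplication_Scene);
    if (initError != vr::VRInitError_None) {
        std::printf("No headset runtime available.\n");
        return 1;
    }

    // The runtime reports the per-eye render target size it recommends for
    // the connected headset, rather than the application reading it from a
    // desktop display mode.
    uint32_t width = 0, height = 0;
    hmd->GetRecommendedRenderTargetSize(&width, &height);
    std::printf("Recommended per-eye render target: %u x %u\n", width, height);

    vr::VR_Shutdown();
    return 0;
}
```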
How big is the VR R&D team at NVIDIA?
It’s a bit difficult to say because there isn’t really one “VR team”, there are people focusing on VR across nearly every organization in the company. It’s a big initiative for us.
Can NVIDIA make a driver better than Morgan Freeman in Driving Miss Daisy?
It’s a huge challenge to make a driver as smooth as Morgan Freeman, but we’re sure as hell going to try!
Our thanks to Nathan Reed for taking the time to work on this guest piece for us and to the users of subreddit /r/oculus for providing the questions.