NVIDIA Demonstrates Experimental “Zero Latency” Display Running at 1,700Hz

At GTC 2016 this week, NVIDIA’s Vice President of Graphics Research demonstrated a novel prototype display running at an incredibly high refresh rate, all but eliminating perceptible latency.

Update (4/6/16, 11:08AM PT): An earlier version of this story stated ‘17,000Hz’ in place of the correct 1,700Hz.

When it comes to latency in virtual reality, every part of the pipeline from input to display is a factor. You’ll often hear the phrase ‘motion-to-photons latency’, which describes the lag from the instant you move your head to the moment that the display responds to that movement. Between those two points are several sources of latency, from the detection of the input itself, to the rendering, to the time it takes for the display to illuminate its pixels.
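
As a rough illustration of how those stages add up, here is a hypothetical budget; every number below is an assumption for the sake of example, not a measurement of any particular headset.

```python
# Illustrative motion-to-photons budget. All numbers are assumed examples,
# not measurements of any specific headset.
budget_ms = {
    "sensor sampling + fusion": 1.0,
    "game/render work (one 90 Hz frame)": 11.1,
    "wait for the next refresh (avg. half a frame)": 5.6,
    "pixel switching / illumination": 2.0,
}
print(f"total ≈ {sum(budget_ms.values()):.1f} ms motion-to-photons")  # ≈ 19.7 ms
```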

For desktop-class VR, current state-of-the-art VR headsets have displays running at 90Hz, which means that they’re capable of showing 90 images per second. And while we’ve seen that 90Hz is more than sufficient for a comfortable VR experience, NVIDIA Vice President of Research David Luebke says that ever higher refresh rates could improve the VR experience by further reducing latency.

NVIDIA Vice President of Research David Luebke at GTC 2016

At GTC 2016 this week, Luebke demonstrated an experimental display with a refresh rate that’s almost 20 times faster than what we see in current consumer head-mounted displays. Running at a whopping 1,700Hz, the display was mounted on a rail system which allowed it to be rapidly moved back and forth. When shaken vigorously, the image on the display stayed locked in place to an impressive degree. Even when magnified closely, the image on the screen seemed entirely fixed in place.

A 90Hz display shows an image every 11 milliseconds, while this 1,700Hz display shows an image every 0.58 milliseconds.
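
To make the arithmetic explicit (a trivial check of the figures above):

```python
# Frame interval in milliseconds for a given refresh rate: 1000 / Hz
for hz in (90, 1700):
    print(f"{hz:>4} Hz -> {1000 / hz:.2f} ms between frames")
# 90 Hz -> 11.11 ms between frames
# 1700 Hz -> 0.59 ms between frames
```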

“…if you can apply this to a VR display, that kind of ultra-low latency would help things stay rock-solid in the environment, to the point that the display would no longer be a source of latency. So this is effectively a zero latency display,” said Luebke.

One thing I find particularly interesting about this (and all VR displays in general) is that while the object on the screen appeared to be fixed in space to our eye, in reality, the image is racing back and forth across the display, illuminating many different pixels across the screen as it goes. The illusion that it’s still is actually evidence of how quickly it can move, which is curiously counterintuitive.

“You could put this thing in a paint shaker and it would appear to stay solid… it’s very cool,” Luebke said.

Of course, for this level of tracking, you also need extremely low latency input. Thus a second reason for the rail system is revealed; Luebke told me that wheels on the rails feed the movement of the carriage almost instantaneously into the system. Without such precise and low latency input, even a display as fast as the one demonstrated wouldn’t appear to show such a steady image, highlighting the need for low latency across the entire ‘motion to photons’ pipeline.

While less than 20ms of latency from input to display is generally considered good enough for VR, Luebke said that things get better toward 10ms, and there are even measurable benefits down to as low as 1ms of latency.

Until we can brute-force our way to zero latency with super high refresh rates like Luebke’s demonstration, a technique called low persistence is employed by modern VR headsets to capture some of the benefits of a super fast display, namely blur reduction. Low persistence works by illuminating the display only briefly, then turning it off until the next frame is ready (rather than keeping it illuminated continuously from one frame to the next).
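
As a rough sketch of why persistence matters: when the eye tracks a moving image, the perceived smear is roughly the time the pixels stay lit multiplied by how fast the image moves across the panel. The pulse width and tracking speed below are assumed, illustrative values, not specs of any shipping headset.

```python
# Rough persistence-blur estimate: smear (px) ≈ lit time (s) * image speed (px/s).
# The pulse width and tracking speed below are assumed, illustrative values.
def smear_px(lit_time_ms, speed_px_per_s):
    return lit_time_ms / 1000.0 * speed_px_per_s

speed = 2000  # pixels/second of head-driven image motion across the panel (assumed)
print(smear_px(11.1, speed))  # full persistence at 90 Hz   -> ~22 px of smear
print(smear_px(2.0, speed))   # low-persistence ~2 ms pulse -> ~4 px of smear
print(smear_px(0.59, speed))  # full persistence at 1700 Hz -> ~1 px of smear
```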


Road to VR is a proud media partner of GTC 2016

  • Sam Illingworth

    I’m not clear what the benefit is – surely our displays and motion sensors already update faster than we can render new frames, so how does this help? To put it another way, how does a 200x faster screen help when that just means displaying the same rendered frame 200 times in a row? I suppose you could use timewarp to get some benefit out of it, but is it really worth it without the corresponding rendering power?

    • Reprojection is cheap. Effectively reducing the display’s contribution to the latency to zero is advantageous. You *never* want to draw the same frame twice, as this produces persistence blur and looks like muddled garbage. It’s not 17000 Hz, it’s 1700 Hz.

      • Bryan Ischo

        I don’t buy it. Your eyes and brain are not perfect fidelity input and processing mechanisms. There is a “good enough” for latency and I thought we were already there at 90 hz.

        Maybe things could get a little better, but advancements in this area are utterly unimportant when compared to resolution and FOV.

        • There’s good enough for movies too: about 24 Hz. But it’s clearly suboptimal.

          There’s good enough for 3D games; 30 FPS. But it’s clearly not ideal.

          There’s good enough for music; 128 kbps .mp3s. But .flac is clearly better.

          I can see the difference between 120 Hz and 160 Hz easily on a monitor. 90 Hz is just barely enough to keep people from projectile vomiting in VR.

          Without low persistence, the persistence blur at 90 Hz is extreme (see any LCD without strobing in any fast game). Hence all VR displays use low persistence panels. This is a clever hack so that you can get a crisp image when your eyes follow a fast moving object. But it is clearly just a clever hack, as when you focus on something and something else moves, it leaves a trace of discrete, sharp images on your retina. That motion blur carries information about the motion of objects; it doesn’t look quite right without it.

          Estimates of the framerate required to do this really well without cheap hacks are roughly 1000 Hz (see e.g. Michael Abrash’s post on the subject: http://blogs.valvesoftware.com/abrash/down-the-vr-rabbit-hole-fixing-judder/ ). He works at Oculus now, so you can safely assume that Oculus is experimenting with higher framerates as well.

          How does one affect the other? OLED can switch at frequencies comparable to the clock frequency of a processor from the 1990s. There is nothing special being done to the pixels here that jeopardizes FOV or resolution.

          • Bryan Ischo

            90 Hz is more than enough to keep people from projectile vomiting in VR. 75 Hz is sufficient; the DK2 is proof of that.

            While I agree that lower latency and higher framerates are desirable, when I play a VR game they’re not what I’m constantly wishing for. What I’m constantly wishing for is better FOV and higher resolution. Which is why I say that framerate/latency, at least for me, is utterly unimportant when compared to those other issues. This itself is not meant as ridicule to those people who are working on improving frame rate & latency. If that’s what they are interested in, then by all means, please improve those too.

          • The DK2 is proof that 75 Hz is insufficient to keep some people from becoming sick.

          • realtrisk

            The fact that people get simulator sickness when seeing motion but not feeling it has nothing to do with frame rate. 10,000 fps would still cause someone who is susceptible to simulator sickness to get ill. This is not a valid argument. You are acting as if the only thing keeping people from getting sick in VR is a high frame rate, and that’s just not true. High frame rates only solve one small part of the equation for someone who is truly susceptible to sim-sickness. Until the inner ear can be tricked, this will remain a huge issue.

          • It’s just a fact that lowering the framerate makes some people seriously ill in VR when just turning their head around; no weird accelerations, no unnatural spinning, just standing and looking around. 75 Hz is not enough. 90 Hz is ‘good enough for most people’, without insane hardware demands. Do not think that they chose 90 Hz because it’s the best and there is slim to no advantage going to 180 Hz.

            Framerates solve a huge part of the latency problem. There’s a large subset of people who do *not* get sick from weird spinning and acceleration but who feel queasy with low framerates.

            We’ll soon be talking hundreds of Hz for generation 2 and 3 HMDs because foveated rendering solves the performance issue even at high resolutions and reprojection from 90 Hz to 180 Hz is much more accurate than reprojection from 45 Hz to 90 Hz.

          • Bryan Ischo

            Can you point to some examples? Because I have *never* read about *anyone* getting ill from DK2 frame rates, in the hundreds/thousands of DK2 experiences I have read about.

          • Bryan Ischo

            “90 Hz is just barely enough to keep people from projectile vomiting in VR.”

            “The DK2 is proof that 75 Hz is insufficient to keep some people from becoming sick.”

            What you originally said was wrong. 90 Hz is not just barely enough to keep people from projectile vomiting in VR. 75 Hz was fine for 99.9% of DK2 users. In fact I never heard of *anyone* getting sick using the DK2 because of refresh rate. Uncomfortable mismatches between movement in game and movement out of game are what cause it.

            There are so many bigger fish to fry in VR than refresh rate and latency.

          • Adrian Todd

            You may see a difference between a 120hz monitor and a 160hz monitor, but that’s simply because the refresh rate on either of those doesn’t divide cleanly into the human eye’s rate of capture. Both of those rates are higher than the human eye can process, but can still be noticeable because of that.
            And actual motion blur is remarkably similar to the low persistence technique used; it’s essentially your brain creating motion tweening when the eye’s capture rate can’t keep up. We’re not eagles, and our eyes actually have a fairly low rate of image capture, so syncing to that is far more important than just upping the frequency, although it does become easier at higher frequencies. Still, 1000Hz is well over ten times the human eye’s capture rate, but still isn’t a clean sync, which is far more important in making an experience seem natural.

          • Human eyes don’t have a refresh rate or “rate of capture”. There is persistence in the eye that allows even a strobe light at somewhere around 75 – 90 Hz to remain flicker free for most people.

            You see the difference between 120 Hz and 160 Hz because the discrete steps of animation are superimposed by the persistence of vision. At lower framerates this is cleanly visible, an obvious artifact similar in nature to the stair stepping of aliasing, whereas higher framerates more closely approximate motion blur. You see this effect into the hundreds of Hz with a strobe easily.

            It offers noticeably and obviously smoother motion and it looks more natural (the real world is not illuminated by a strobe light).

            Motion blur without high framerates does not seem possible to implement cleanly. If you follow an object with your eyes, it should never be blurred regardless of its movement on the screen. Eye tracking may never get good enough to deal with this. I suspect higher framerates will be the low tech, easy fix.

            Motion blur does seem to carry information about the speed and direction of travel of objects that the brain has difficulty extrapolating from tracking the motion of many objects on a series of clean frames.

          • Sam Illingworth

            If the cheap hacks work, then why do we need to do without them?

          • Because excellent is better than merely tolerable.

          • Mark Rejhon

            Very good reply! I’ve recently written about the discovered mathematics of motion blur — it’s called the “Blur Busters Law”: 1ms of persistence translates to 1 pixel of motion blurring per 1000 pixels/sec.

            The territory of 1000Hz+ is quite spot on, as I’ve now written an article about this (“Blur Busters Law: The Amazing Journey To Future 1000Hz+ Displays”). For retina 180-degree VR, the upper limit of diminishing returns is closer to 10,000Hz, though.

            Today, you can get 0.5ms persistence with ULMB displays (ULMB Pulse Width 25%). Blur Busters was responsible for convincing NVIDIA to add that menu option to G-SYNC monitors with ULMB, to adjust the strobe pulse width — making them adjustable-persistence displays.

            In the TestUFO Panning Map Test at 3000 pixels/sec, map labels are much more readable at 0.5ms persistence although the screen is much darker.
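
            Expressed as a quick calculation (a trivial sketch of the stated relationship, using the 3000 pixels/sec panning speed from the TestUFO example above):

            ```python
            # Blur Busters Law: blur (px) ≈ persistence (ms) * speed (px/s) / 1000
            speed = 3000  # pixels/sec, as in the TestUFO panning test above
            for persistence_ms in (1.0, 0.5):
                print(f"{persistence_ms} ms -> {persistence_ms * speed / 1000:.1f} px of blur")
            # 1.0 ms -> 3.0 px of blur
            # 0.5 ms -> 1.5 px of blur
            ```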

          • Are you connected with the blurbusters people? The blurbusters site is the gold standard for people who care about persistence blur, I’ve been using it ever since my CRT died in 2011.

            Was stuck with some lousy hand-me-down 60 Hz TN-panel that was ridiculously awful until I got a 144 Hz panel.

            That stereo 3D thing that the industry wanted to hype up (as opposed to VR, where the hype is legitimate and grass roots) was the accident that created all these 120 and 144 Hz panels on the market that could be repurposed for “gaming displays”.

          • Yes, I am Chief Blur Buster.

            I recently wrote a Holiday 2017 feature article about the long-term journey to future strobe-free blur reduction technologies (blurless sample-and-hold) — low persistence without impulsing tricks — which requires refresh rates in the league of 1000Hz+; prototype displays in that refresh rate territory now exist in the lab.

        • Eddie Offermann

          Frame rate and latency are related – but they are not the same thing. 90hz is a pretty decent frame rate, but the corresponding 0-11ms delay is not the ideal threshold for latency. The goal with higher frame rates in this regard is not to put more frames of animation in front of you but rather to reduce the motion-to-photon latency. 1ms latency would be great, below that is gravy. Actual framerate can vary, then, as long as latency remains low.

          • Bryan Ischo

            That is a good point. However, I have never heard anyone say that the latency is perceptible at 90 hz, so I thought also that latency is already “good enough” too.

          • 24 Hz movies are good enough. They allow you to sync the sound to the video (just barely). 30 FPS is good enough for 3d games, just ask Sony, Microsoft or Ubisoft; it’s “cinematic”. Etc.

            90 Hz is clearly not ideal. It’s just good enough. A higher framerate is clearly and noticeably better. You can distinguish 120 Hz from 160 Hz from 200 Hz on a CRT quite easily (200 Hz may damage the display; the vertical scan rate is higher than CRTs are rated for, but some will happily do it for years).

          • Bryan Ischo

            Are you disagreeing with my statement that, while frame rate/latency can be improved, they are of much less importance than resolution and FOV for improving VR experiences at the moment?

          • Resolution is a solved problem. You can’t have it until you have eye-tracking and foveated rendering.

            Eye-tracking and foveated rendering seem to have been solved well enough, but not early enough to make it into gen 1 HMDs. It’s a given in gen 2 HMDs.

            When you do have foveated rendering the obvious first step is to increase the resolution to about 3k by 3k per eye (higher will require more bandwidth than two DisplayPort 1.3 links can offer for decent refresh rates).

            The obvious next step is to move composition into the HMD and put in a very high PPI display (>2000 ppi was demonstrated years ago with OLED and needs some time to be commercialized). Something like 8k by 8k per eye (effectively 16k by 8k) makes about the same demands on the hardware as 2560×1440 on a regular display; foveated rendering is a factor ~100 performance savings at that resolution.

            FOV is also a known quantity. You can’t have more FOV with 2 displays and conventional rasterization. High FOV is a form of anti-foveated rendering where the center of the screen gets proportionately fewer pixels and far more pixels are spent on the periphery. The wider the FOV, the worse it gets.

            The obvious, complex and expensive solution is to add another, lower resolution screen for the periphery and render the scene 4 times. Use different optics and spend years making the seam between the displays sufficiently invisible and the HMD sufficiently light.

            Less obvious solutions, like having a very low res peripheral display with a very smooth gradient and raytracing a few thousand pixels, might be a thing, but you’ll spot it if you move your fovea too far from the center.

            I’m not holding my breath for larger FOV.

          • brandon9271

            StarVR has a FOV of 210. That with foveated rendering should be plenty, right? I mean, I would be thrilled to have that big of a FOV with each gen having higher res.

          • The problems with high FOV are well known and expounded on in great detail elsewhere (e.g. size, weight, pupil swimming, more render targets to deal with low pixel density where it matters, wasted pixels). The high FOV in StarVR involves large tradeoffs that may not be worth it.

            Regular, flat displays are just a stop gap. They’re used not because they are a good solution, but because phones made them cheap. They’ll be around for a while. Then the HMD market will approach the size of the phone market and get big enough to justify the multi-billion dollar R&D and factories to scale up VRDs and digital light field displays.

            VRDs are well suited to do digital light fields (solving the vergence-accommodation problem), they have very low energy demands (matters if it’s portable and matters if heat/sweat is a problem; well suited to do HDR), and have perfect fill factors (absolutely no screen door of any kind).

        • Mexor

          Human vision is a very complicated issue and when scientists come out and give information to the public they aren’t giving us all the details. I can imagine that it’s entirely possible for 90 Hz to be good enough for the brain to accept tracking an external object across the retina, but not be good enough for the brain’s interplay between the ocular and vestibular systems.

          Our vision is highly processed by our brain, and “what one can see” is strongly related to “what the brain processes/estimates” and “what the brain is willing to accept”, not necessarily the physical limits of the underlying detector. Therefore it’s easy to imagine that in some situations (particularly where multiple inputs are simultaneously being considered) the degree of sensitivity is vastly greater than in other situations.

      • Simon Wood

        This looked like it was just movement (translation) in one dimension. It would be _really_ impressive if they were re-projecting the whole view as a portion of a 360° sphere – i.e. as the headset turned/pitched/yawed.

    • Mexor

      It’s not about scene accuracy, it’s about motion sickness. The brain expects the “world” to flash across the retina in tight correlation to the movement of the head. When it doesn’t it tends to make people motion sick. A new frame doesn’t need to be rendered to fix the problem, the old one can be transformed the proper amount in tight correlation with the motion of the user. This would likely introduce blank spots coming into view as areas once offscreen now move into field of view. I am guessing they would deal with that by rendering to the display’s buffer a frame that is slightly bigger than the actual resolution of the display, and then having the display choose and display the appropriate subset of that buffer corresponding to the transformation calculation that takes user movement since frame rendering into account. Such a calculation should be very fast compared to rendering a whole new frame.
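
      A minimal sketch of that idea, with made-up panel dimensions, margin, and field of view (none of these values or names come from NVIDIA’s demo):

      ```python
      # Sketch: render wider than the panel, then let the display pick a shifted
      # window based on head rotation that happened after the frame was rendered.
      # All resolutions, margins, and FOV values here are hypothetical.
      DISPLAY_W = 2160          # assumed panel width in pixels
      MARGIN = 128              # extra pixels rendered on each side
      BUFFER_W = DISPLAY_W + 2 * MARGIN
      FOV_H_DEG = 100.0         # assumed horizontal field of view

      def crop_start(yaw_delta_deg: float) -> int:
          """Left edge of the crop window after a small yaw change since render time."""
          px_per_degree = DISPLAY_W / FOV_H_DEG          # small-angle approximation
          shift = int(round(yaw_delta_deg * px_per_degree))
          shift = max(-MARGIN, min(MARGIN, shift))       # stay inside the buffer
          return MARGIN + shift

      x0 = crop_start(0.3)  # head turned 0.3 degrees after the frame was rendered
      print(f"show buffer columns {x0}..{x0 + DISPLAY_W - 1} of {BUFFER_W}")
      ```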

      • Sam Illingworth

        Yeah, that’s the timewarp I referred to, but I haven’t heard any VR devs saying “the problem is we can’t do timewarp fast enough”.

        • Mexor

          Because they can do time warps as fast as the current displays allow. The problem isn’t that they can’t calculate the time warp fast enough, it’s that there’s no reason to, as the displays can’t take advantage of it. I am not surprised you didn’t hear game developers say “We need 1700 Hz displays” because they probably had no idea it was in the realm of possibility. This is a novel approach, apparently. New research. Luebke mentions something called “binary delta-sigma modulation”, if I recall. I don’t know anything about signal processing or digital to analog conversion so I have no idea how it works.

          • Sam Illingworth

            Regardless of whether they thought faster timewarp was possible or not, what evidence do we have that it would help? I’ve seen nothing to suggest that maintaining 90Hz isn’t good enough to be unnoticeable; it certainly seems so to my eye.

          • Mexor

            I’ve never used an HMD myself, but reading about them, it seems the consensus is that there is an input lag problem and that 90Hz is not fast enough. Another problem with current generation HMDs is vergence-accommodation conflict. And a third is that the resolution is too low. A fourth is probably that the field of view is too narrow.

            All that isn’t to say that current generation HMDs aren’t practical or good, just that improvements to the experience can be made.

          • Sam Illingworth

            I haven’t heard or read anything to suggest that consensus. All I’ve heard suggests 90hz is fine (it seemed fine both times I tried it, but that’s anecdotal), the problem is when it drops below 90, which is renderer limited, not display limited.

            I agree higher resolution and greater FOV will be big improvements (though they’re far less detrimental than I was expecting). I don’t know what a vergence accommodation conflict is, but if it’s anything like a vergence in the Force it can’t be good.

            Wireless would be the biggest improvement, mind you, so it’s wireless technology that needs the biggest development effort if you ask me. I don’t mind wearing a battery in a backpack or on a belt.

  • I’ve re-listened to it a few times now, it sounds like “seventeen hundred hertz” which would be 1700 Hz :P That’s what it sounds like to me anyway, which would still be impressive!

    • John Horn

      Yep. 1700 hertz. It’s still impressive, but Road to VR should update the article. :)

    • benz145

      You’re definitely right, silly error on my part. Fixed!

      • TC_Orygun

        You still have 17,000 Hz third paragraph first sentence.

      • brandon9271

        Still a typo in there “Running at a whopping 1,7000Hz”

  • Now I would like to see the CPU, GPU and whole input/output system that renders frames at such low latency XD Last time I checked, games need at least 16 ms to render a frame, plus other things add more latency. I don’t see how games will benefit from this “zero latency” display if it’s still bottlenecked in other places.

    • Eddie Offermann

      Check again – 16ms is above what we need for mobile frame rates for cardboard/GearVR style games. A lot has changed in the last year or so because of VR. Vive/rift/etc currently aim for 10-11ms max – and in both mobile and tethered cases that’s because that’s brushing against refresh thresholds. That’s not to say that experiences always hit that level – but below that tends to generate user complaints. 16ms is a max limit not a bare minimum.

      That’s of course why there’s such concern over the number of users who *don’t* have VR-ready machines and why Valve is working with Unity to provide sensible fallback behaviors that sacrifice MSAA, SSAO or other enhancements (automatically) rather than drop frame rates – because frame rates aren’t negotiable in VR.

    • crim3

      Mind you that whatever we use today is the result of previous research, sometimes for decades. What they are saying here is that there is research in very high refresh rates displays. It’s not like next year they are releasing a 1000Hz panel.
      Consumer electronics will keep evolving at its own pace.
      When I was young a good CRT could go up to 140Hz (160Hz maybe? I can’t remember). That means a frame every 7 ms, in the 90’s.

      • Mads Furnes Kristoffersen

        I have an old CRT which could and can still do 200Hz at 640×480, albeit at a very low resolution.

  • Rob B

    One area of research in which this is useful right now, is where you need to render multiple depths sequentially.

    Even if the PC only generated a scene at 90 fps (including a full depth buffer), that equates to 1700 / 90 ≈ 18 levels of depth rendered per scene.
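
    A quick sanity check of that figure:

    ```python
    # Sequential depth planes per rendered scene, assuming the display cycles
    # through them at its full refresh rate.
    display_hz, scene_fps = 1700, 90
    print(display_hz // scene_fps)  # -> 18
    ```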

  • TC_Orygun

    “the display was mounted on a rail system which allowed it to be rapidly moved back and forth.”
    This just seems like a circus trick to me. Does it move in any direction or just left and right? How can they claim lower latency when they are just moving the display? What am I missing here?

    • Mexor

      It’s an HMD. That means they are simulating the tracking of the motion of the viewer’s head and updating the display to accurately reflect the current position. The reason they keep the image stable is for demonstration purposes. Any instability means that there is latency between the movement of the head and the display on the screen. Such latency causes motion sickness. If the image remains stable, all is well. As the article says, if they didn’t center their point of view on the object for demonstration purposes, the object would be racing across the screen as the HMD is shaken. It’s simply much easier to visually verify a little bit of motion versus something static in comparison to trying to verify perturbations from an expected motion. In fact, since each viewer’s head in the audience isn’t actually doing the swivelling, the brain would probably accept whatever it sees as simply being the arbitrary motion of an external object, and how could the audience accurately determine what the HMD is actually doing on its rails?

  • Horror Vacui

    This is unfortunately useless. The eyes and the brain already struggle to see flashing images at more than 120Hz because the eyes have pass-filters that only pick up on moving objects to be processed (and blur is added to make sense of it), while screens are a uniform flashing image that can easily give you eyestrain and even headaches around 120Hz.

    So I can’t imagine why we would need a 1700Hz display if not for scientific applications (which I can’t even imagine either, but surely there are some).

    • AverageReader

      WTF are you talking about? When playing games I have way more eyestrain on my 60Hz LCD panel compared to my coworker’s 144Hz LCD panel.
      I’ve had the chance to use an overclocked display running at 160Hz, but that was only for a few minutes, and honestly I couldn’t tell the difference between 144 and 160. But it was smooth.

    • Albert Walzer

      This is not about how many pictures per second you can resolve. This is about how far the graphics “drag” behind your head movement. And this is something where our body is sensitive well beyond 120 Hz.

  • Alexander Stohr

    i can still see this character sequence in the text: “1,7000Hz”
    don’t you want only three digits on the right side of a comma in sequence? drop one of the zeros!

  • Mads Furnes Kristoffersen

    I wonder if the human eye is operating at low persistence or if it’s persistence at a very high update frequency.