Oculus Research Reveals New Multi-focal Display Tech


Oculus Research, the company’s R&D division, recently published a paper that goes deeper into its eye-tracking-assisted, multifocal display tech, detailing the creation of something the team dubs a “perceptual testbed.”

Current consumer VR headsets universally present the user with a single, fixed-focus display plane, which creates what’s known in the field as the vergence-accommodation conflict: the user simply can’t focus naturally because the display can’t provide realistic focus cues, making for a less realistic and less comfortable viewing experience. You can read more about the vergence-accommodation conflict in our article on Oculus Research’s experimental focal surface display.
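
To put a rough number on that mismatch (the figures below are illustrative assumptions, not values from the paper): if a headset’s optics fix the focal plane at about 2 m, the eyes must stay accommodated at roughly 0.5 diopters even while converging on a virtual object 0.5 m away (about 2 diopters), a gap of around 1.5 diopters.

```python
# Illustrative sketch only: quantifying the vergence-accommodation mismatch.
# The 2 m fixed focal distance is an assumed, typical value, not from the paper.

def va_conflict_diopters(object_distance_m: float,
                         display_focal_distance_m: float = 2.0) -> float:
    """Gap between where the eyes converge (the virtual object) and where
    they must accommodate (the fixed display plane), in diopters (1/m)."""
    vergence = 1.0 / object_distance_m               # dioptric distance of the object
    accommodation = 1.0 / display_focal_distance_m   # dioptric distance of the display
    return abs(vergence - accommodation)

print(va_conflict_diopters(0.5))  # object at 0.5 m -> 1.5 D of conflict
```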

By display plane, we mean one slice in depth of the rendered scene. With accurate eye tracking and several independent display planes, each drawn from a different depth range of the fore- and background, you can mimic retinal blur. Oculus Research’s perceptual testbed goes a few steps further, however.
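
As a minimal sketch of that idea (not Oculus Research’s actual pipeline; the plane count, dioptric spacing, and weighting scheme below are assumptions for illustration), a rendered frame can be split into per-plane images by weighting each pixel toward the focal planes nearest its depth:

```python
# Illustrative only: linear depth-weighted decomposition of a frame into focal
# planes. The weighting scheme and plane spacing are assumptions, not the
# method described in the Oculus Research paper.
import numpy as np

def decompose_into_planes(rgb, depth_m, plane_diopters):
    """rgb: HxWx3 image; depth_m: HxW depth in meters;
    plane_diopters: plane depths in diopters, sorted near-to-far (descending)."""
    d = 1.0 / np.clip(depth_m, 1e-3, None)       # per-pixel depth in diopters
    pd = np.asarray(plane_diopters, dtype=float)
    planes = []
    for i, p in enumerate(pd):
        w = np.ones_like(d)                      # tent-shaped weight, peaking at this plane
        if i > 0:                                # fade toward the nearer neighbouring plane
            w = np.minimum(w, np.clip((pd[i - 1] - d) / (pd[i - 1] - p), 0.0, 1.0))
        if i < len(pd) - 1:                      # fade toward the farther neighbouring plane
            w = np.minimum(w, np.clip((d - pd[i + 1]) / (p - pd[i + 1]), 0.0, 1.0))
        planes.append(rgb * w[..., None])        # this plane's share of the image
    return np.stack(planes)                      # N x H x W x 3, one image per plane

# e.g. four planes spaced in diopters, from 0.25 m out to 4 m:
# planes = decompose_into_planes(frame, depth, [4.0, 2.0, 1.0, 0.25])
```

Each plane image would then be presented at its own physical focal distance; the testbed described in the paper goes further by correcting that multi-planar scene for measured eye and head movement.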

image courtesy Oculus

The goal of the project, Oculus researchers say, is to provide a testbed for better understanding the computational demands and hardware accuracy required of such a system: one that not only tracks the user’s gaze, but also adjusts the multi-planar scene to correct for eye and head movement, something previous multifocal displays simply don’t account for.

“We wanted to improve the capability of accurately measuring the accommodation response and presenting the highest quality images possible,” says Research Scientist Kevin MacKenzie.

image courtesy Oculus

“It’s amazing to think that after many decades of research by very talented vision scientists, the question about how the eye’s focusing system is driven—and what stimulus it uses to optimize focus—is still not well delineated,” MacKenzie explains. “The most exciting part of the system build is in the number of experimental questions we can answer with it—questions that could only be answered with this level of integration between stimulus presentation and oculomotor measurement.”


The team maintains that their display method is compatible with current GPU implementations and achieves a “three-orders-of-magnitude speedup over previous work.” This, they contend, will help establish practical eye-tracking and rendering requirements for multifocal displays moving forward.

“The ability to prototype new hardware and software as well as measure perception and physiological responses of the viewer has opened not only new opportunities for product development, but also for advancing basic vision science,” adds Research Scientist Marina Zannoli. “This platform should help us better understand the role of optical blur in depth perception as well as uncover the underlying mechanisms that drive convergence and accommodation. These two areas of research will have a direct impact on our ability to create comfortable and immersive experiences in VR.”



  • Walextheone

    This gives me hope for the coming generation of VR hardware. One day we will be able to work from inside VR without getting “exhaustion” symptoms in our eyes.

    • I can’t wait for that day :)

      I stare at a fixed-focus monitor for 12 hours a day working. I am supposed to focus my eyes elsewhere every hour to prevent fatigue; maybe simply looking down at the keyboard for a microsecond is enough time to do that, or maybe peripheral vision, which is all at different depths, also helps reduce strain, I don’t know.
      But working in VR is one of my highest wishes.

      This technology that Oculus Research is pursuing is very interesting.

  • Luke

    finally!

  • Flamerate1

    This is something that doesn’t seem to be talked about a lot, but it’s quite a big problem.

    If you go into VR and look at an object that is about a foot away from your face, you will notice that it’s pretty blurry. If you close one of your eyes and imagine that the object is far away, it will become clear.

    I noticed this one day and was wondering if there was any talk about it, so I’m happy that there’s actual research trying to figure this out. I hope the system they’re using can be implemented in future VR headsets.

    • stew dean

      Look up light-field technology and what Magic Leap have been doing. They’ve been working on this for years. It is a big problem, as current VR headsets are effectively 2.5D: they don’t allow depth cueing by focus (which is how one-eyed people gauge how far away things are).

  • Fenris

    Looks like it’s complementary to foveated rendering (which may not be able to handle focus as smartly). Right now the required computing power still prevents this tech from being used for more demanding games… Wait and see.

    • Matt Clark

      He just said that their process is efficiently implemented on a GPU… it looks to me like they need to scale down the hardware to fit into a properly sized HMD.

      • Raphael

        You think that test bench setup is too big for user heads?

        • Laurence Nairne

          Depends on the head.

  • stew dean

    Isn’t this what light-field technology solves without the need to do things like eye tracking? It naturally has multiple levels of focus. Plus you can use a light-field camera to capture scenes (see Lytro). Sure, getting the resolution to something usable is very problematic, but Magic Leap appear to be very close to this.

    • crim3

      The huge bandwidth seems like the limiting factor in this case. But every technical problem is solvable when there is interest. I’m looking forward to trying the results of all this research in the form of consumer products.

  • Konchu

    I love seeing new tech and techniques explored. This is some pretty cool stuff.

  • Oh great! Finally an end to dizziness.