Ahead of SID Display Week, researchers have published details on Google and LG’s 18 Mpixel, 4.3-inch, 1,443 PPI, 120 Hz OLED display made for wide field of view (FOV) VR headsets, which was teased back in March.

The design, the researchers claim in the paper, uses a white OLED with color filter structure for high-density pixelization and an n-type LTPS backplane for a faster response time than mobile phone displays.

The researchers also developed a foveated pixel pipeline for the display which they say is “appropriate for virtual reality and augmented reality applications, especially mobile systems.” The researchers attached to the project are Google’s Carlin Vieri, Grace Lee, and Nikhil Balram, and LG’s Sang Hoon Jung, Joon Young Yang, Soo Young Yoon, and In Byeong Kang.

The paper maintains the human visual system (HVS) has a FOV of approximately 160 degrees horizontal by 150 degrees vertical, and an acuity of 60 pixels per degree (ppd), which translates to a resolution requirement of 9,600 × 9,000 pixels per eye. In the context of VR, a display with those specs would fully match the eye’s natural ability to resolve detail.
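
That requirement is simply FOV multiplied by acuity along each axis. A minimal sanity check of the paper’s numbers (the function name below is ours, not from the paper):

```python
# pixels needed per axis = field of view (degrees) * acuity (pixels per degree)
def pixels_required(fov_h_deg, fov_v_deg, ppd):
    """Return the (horizontal, vertical) pixel counts needed per eye."""
    return fov_h_deg * ppd, fov_v_deg * ppd

print(pixels_required(160, 150, 60))  # -> (9600, 9000), the paper's HVS upper bound
print(pixels_required(120, 96, 40))   # -> (4800, 3840), the panel as built
```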

Take a look at the table below to see how the panel as built compares against the human visual system’s upper bounds.

Human Visual System vs. Google/LG’s Display

Specification              Upper bound      As built
Pixel count (h × v)        9,600 × 9,000    4,800 × 3,840
Acuity (ppd)               60               40
Pixels per inch (ppi)      2,183            1,443
Pixel pitch (µm)           11.6             17.6
FOV (°, h × v)             160 × 150        120 × 96

To reduce motion blur, the 120 Hz OLED supports low-persistence illumination, keeping each frame lit for as little as 1.65 ms.
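
For context, at 120 Hz each frame lasts about 8.33 ms, so 1.65 ms of illumination works out to roughly a 20% duty cycle, which lines up with the brightness figure in the spec table at the end of this article. A quick check:

```python
REFRESH_HZ = 120
PERSISTENCE_MS = 1.65

frame_ms = 1000 / REFRESH_HZ      # ~8.33 ms per frame at 120 Hz
duty = PERSISTENCE_MS / frame_ms  # fraction of each frame the panel stays lit
print(f"{frame_ms:.2f} ms frame, {duty:.0%} duty cycle")  # -> 8.33 ms frame, 20% duty cycle
```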


To drive the display, the paper specifically mentions the need for foveated rendering, a technique that uses eye-tracking to position the highest-resolution portion of the image directly at the fovea, the most receptor-dense region of the retina. “Foveated rendering and transport are critical elements for implementation of standalone VR HMDs using this 4.3″ OLED display,” the researchers say.
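
Conceptually, foveated transport sends a small full-resolution inset plus a low-resolution background, and panel-side logic composites the two. A rough sketch of the idea (function and parameter names are ours, not from the paper):

```python
import numpy as np

def foveated_compose(background, inset, inset_origin, scale=4):
    """Upscale a low-res background and paste a full-res inset over it."""
    # Nearest-neighbor upscale of the background (the panel's driver logic
    # would do something like this in hardware).
    full = background.repeat(scale, axis=0).repeat(scale, axis=1)
    y, x = inset_origin            # top-left corner, derived from eye tracking
    h, w = inset.shape[:2]
    full[y:y + h, x:x + w] = inset  # assumes the inset lies fully inside the frame
    return full
```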

While it’s difficult to communicate a display’s clarity through photos on a webpage, viewed on traditional monitors and subject to some image compression, the paper does include a few pictures of the panel in action.

Without VR optic (bare panel) – Image courtesy Google & LG

The researchers also photographed the panel through a VR optic to show what you might see if it were implemented in a headset. No distortion correction was applied to the displayed image, although the quality is “very good with no visible screen door effect even when viewed through high quality optics in a wide FoV HMD,” the paper maintains.

With VR optic – Image courtesy Google & LG

Considering Google and LG say this specific panel configuration, which requires foveated rendering, is appropriate for VR and AR, “especially mobile systems,” the future of mobile VR/AR may be very bright (and clear) indeed.

Japan-based JDI and China-based INT are also presenting VR display panels at SID Display Week: a 1,001 PPI LCD from JDI and a 2,228 PPI AMOLED from INT.

We expect to hear more about Google and LG’s display at today’s SID Display Week talk, which takes place on May 22nd from 11:10 AM to 12:30 PM PT. We’ll update this article accordingly. In the meantime, check out the specs below:

Google & LG’s New VR Display Specs

Attribute          Value
Size (diagonal)    4.3″
Subpixel count     3,840 × 2 (either RG or BG) × 4,800
Pixel pitch        17.6 µm (1,443 ppi)
Brightness         150 cd/m² @ 20% duty
Contrast           >15,000:1
Color depth        10 bits
Viewing angle      30° (H), 15° (V)
Refresh rate       120 Hz



  • Raphael

    Wow. Now that’s a display! No tiny 110 FOV. This will be good for my gen 2 VR. I guess it will appear in the first VR HMD within 10 years?

    • Rogue Transfer

      The actual as-built panel only covers 120° × 96° per eye (see the “As built” column in the table above).

      • Raphael

        That’s crappy. I retract my enthusiastic statement.

        • Adrian Meredith

          I hate to say I told you so….

          • Raphael

            I need to read more words on these articles before replying. Currently reading only two lines of text. Will increase to three or four or maybe even six.

          • Raphael

            You were right all along flappy. I shoulda listened.

        • Baldrickk

          120 is still a little better than 110. At least it isn’t a step backwards on that front.

  • impurekind

    Good stuff. Keep it coming. . . .

  • Lucidfeuer

    “9,600 × 9,000 pixels per eye”: is that per focus point, or an ambiguated per-FOV resolution (the total resolution of all focus points combined across a user’s FOV)? This seems really low…

    However, the specs of their build, if implemented soon enough, are finally a worthy upgrade for VR, especially if 120° × 96° is per eye.

    • Johan Pruijs

      Yep… I asked myself the same question. Is that FOV per eye?

    • cham

      An eye needs 60 ppd; with 160° horizontal per eye, you get 9,600 px horizontal per eye.

      • Lucidfeuer

        I remember hearing a lead engineer from Oculus saying you’d need “64K” for the total 2 × 180° horizontal FOV.

  • Karol Gasiński

    This paragraph is mixing different terms:
    “To reduce motion blur, the 120Hz OLED is said to support short persistence illumination of up to 1.65 ms, which leaves plenty of room for addressing data considering comfortable VR viewing typically has an upper limit of 20ms, a.k.a. the max motion-to-photon latency commonly considered a requirement in modern VR headsets.”

    Panel persistence is the time the panel stays lit during a frame. If it were lit the whole time, a single frame would be displayed for 1000/120 = 8.33 ms. If this one is lit for 1.65 ms, it’s lit for 1.65/8.33 ≈ 20% of the frame time (so the panel flashes and goes black, for globally illuminated panels). This is called low persistence, and those numbers need to stay below ~4 ms on average. That covers the time after the panel starts to glow, but first the data needs to get to it, which will probably match its refresh frequency, meaning the data transfer takes 8.33 ms (the whole previous frame). Since the application needs to render at the same frequency as the display (in the ideal case), app rendering will also take 8.33 ms on average. And before rendering, you need some time to encode the rendering operations with an already-predicted pose; let’s say you do that 2 ms earlier. This means application-to-photon latency is 2 + 8.33 + 8.33 ≈ 19 ms.
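
    A compact restatement of that arithmetic (the 2 ms encode lead time is Karol’s assumption):

    ```python
    REFRESH_HZ = 120
    frame_ms = 1000 / REFRESH_HZ  # ~8.33 ms per frame at 120 Hz

    encode_ms = 2.0        # assumed lead time to encode commands with a predicted pose
    render_ms = frame_ms   # app renders at the display rate (ideal case)
    transfer_ms = frame_ms # sending the frame to the panel takes one refresh period

    print(f"app-to-photon: ~{encode_ms + render_ms + transfer_ms:.1f} ms")  # -> ~18.7 ms
    ```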

    • Hi Karol, thank you for the correction and explanation. I’ve removed that bit from the article until I can better summarize what this means for the display with regard to VR. Again, thanks for keeping an eye out, and reaching out with your keen input!

    • Paul Sutton

      So how high end of a VR/Gaming machine will be needed to run these properly?

      • Karol Gasiński

        From the paper it looks like this panel will in fact be used in a mobile VR headset (some Daydream equivalent of Oculus Go?). You can find hints in the linked paper:

        “The MIPI DSI interface between the mobile SoC and FPGA was limited to 6 Gb/s (uncompressed), which implies a 250 MHz pixel clock at 24 bits/pixel. We settled on foveated transport pixel counts near 1280 × 1920/75 Hz to fit within this bandwidth limitation. ”

        So the image will be rendered at ~1280 × 1920 (that’s probably the summed pixel count of the high and low resolution areas) per eye at 75 Hz and driven by a mobile SoC GPU. There will be a higher resolution area in the center and a lower resolution area outside. The lower resolution area will be upscaled in the panel to its native resolution.
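
        The quoted bandwidth numbers check out; a rough per-eye sketch (our arithmetic, not from the paper):

        ```python
        LINK_GBPS = 6.0        # MIPI DSI budget quoted in the paper (uncompressed)
        BITS_PER_PIXEL = 24

        pixel_budget = LINK_GBPS * 1e9 / BITS_PER_PIXEL  # the quoted 250 MHz pixel clock
        foveated = 1280 * 1920 * 75                      # foveated transport at 75 Hz
        native = 4800 * 3840 * 120                       # full native panel at 120 Hz

        print(pixel_budget / 1e6)  # 250.0 Mpixel/s available on the link
        print(foveated / 1e6)      # ~184.3 Mpixel/s, fits within the budget
        print(native / 1e6)        # ~2211.8 Mpixel/s, nearly 9x over budget without foveation
        ```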

        • Paul Sutton

          BTW, I’ve owned a Vive for some time, so next-gen VR devices greatly interest me. Besides deciding when it’s time to unload the Vive for an upgrade, going wireless while keeping room scale is something I’m looking for in a next-gen VR headset.

          You’re saying this will be a standalone with no PC to run the headset? That would be great with those specs. But either they’ve figured out a way to obtain the needed power at a reasonable price, or this is going to be very expensive.

          Any idea of price point? Will this, and VR in general, be at E3 next month?

          • Baldrickk

            Personally, I’m hoping that the LG Ultragear (when it finally arrives) will be boasting these displays. It would seem to be a good fit (LG screen in LG HMD) and a step ahead of the Vive Pro (would we be able to call it the first proper 2nd-gen HMD?). That said, driving a display like this (even with foveated rendering) seems to sit more in the PC space than the mobile space at this time.

        • Bruce Banner

          From earlier articles, they mentioned fabricating their own custom high bandwidth IC driver, and that foveated driving logic for both VR and AR was implemented. LG also patented eye-tracking technology for their UltraGear. That could reduce GPU load by as much as 50%.

  • Rosie Haft

    Are the photos taken from the same distance away from the eye? I like OLED displays, but the optics don’t seem quite right for use as a near-eye display. This would help to know!

  • Jonathan Pratte

    We are getting there!

  • Albert Hartman

    Is it good enough to read text? How small?

    • Paul-Simon

      It should be good enough to read text at moderate distances.
      Our actual visual resolution is closer to 16K × 16K, which is twice the angular resolution of this, but it’s pretty close, so clarity should be pretty incredible.

    • With 2× the pixels per degree you should be able to read 2× smaller text; it should be linear. Take your HMD’s PPD and the smallest text that’s readable, and do the math, it’s simple (see the sketch after this list):
      x – your HMD’s PPD
      y – the Google HMD’s PPD
      z – min. readable font size in your HMD (at a specific distance)
      a – min. readable font size in the Google HMD (at the same distance)

      x / y = a / z

      a = z × x / y, that’s all you need
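
      A minimal sketch of that proportion (the 13 ppd figure below is a rough assumed value for a first-gen headset, not from the article):

      ```python
      def min_readable_font(z, x, y):
          """Smallest readable font size on a target HMD, scaled linearly by ppd.
          z: smallest readable size on your HMD, x: your HMD's ppd, y: target ppd."""
          return z * x / y  # higher ppd -> proportionally smaller readable text

      print(min_readable_font(10, 13, 40))  # 10 pt at ~13 ppd -> ~3.3 pt at 40 ppd
      ```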

  • REP

    HTC WIFE and Oculus Rigged is DEAD!

    • Andrew Jakobs

      Haha, you’re funny… They presented the displays, but that doesn’t mean they’re actually ready for production, let alone the GPUs needed to drive them, even with foveated rendering.

  • NooYawker

    Makes me very excited for gen2 hardware.

  • Sven Viking

    “The paper maintains the human visual system (HVS) has a FOV of approximately 160 degrees horizontal…”

    Looking straight ahead, most people have peripheral vision exceeding 180 degrees (and much higher with eye movement).

    If they’re talking about the area your eyes can look towards directly, that might be about right?

    • cham

      It’s 160° “for each eye”; the journalist summarized too much.

    • Branton Dark

      That’s not even physically possible. Human eye peripheral vision is at most 160 degrees horizontal.

      • Sven Viking

        I was talking about the stereoscopic FOV, not per-eye. Look straight ahead and wiggle your fingers about one foot from the side of your head. You’ll be able to perceive the motion while they’re further back than the position of your eyeball. This is because the eyeball sticks out from your face, and light can enter the retina at greater than a 90 degree angle.

        https://en.wikipedia.org/wiki/Peripheral_vision#/media/File:Peripheral_vision.svg

  • brubble

    Neat-o, now let’s not ruin these fancy screens with some bulbous, warped, fuzzy lens.

  • oompah

    That’s impressive. They just need to put one extra pixel between every two to reach the real HVS goal.

  • Adrian Meredith

    I don’t know about anyone else, but I can clearly see some kind of screen-door effect in that picture. If you look at the hazy part in the sun, there appears to be a diamond/grid-like pattern.

    • saintkamus

      Yep. Remember: they’re only halfway there.

    • Andrew Jakobs

      Uhm, but you’re looking at the image really zoomed in.

    • Engineer_92

      Right, because you’re looking at a zoomed-in photo on a flat monitor taken through an iPhone… You cannot accurately judge something like that without wearing the headset.

  • Amazing!

  • Haliff Roslan

    So what happened to this? One year later, any news?