Take a look around the web at reviews of Vision Pro and you’ll find that practically everyone agrees: Vision Pro’s display is excellent. Words like “crisp,” “smooth,” and “4K” are commonly used when people talk about the headset’s visuals. But what if I told you Apple’s headset actually has a lower effective resolution than Quest 3?

Editor’s Note: Words like ‘display’ and ‘resolution’ are often used as a catchall when talking about an XR headset’s visuals. In this article I’ll be using the term “resolving power” to describe the maximum possible detail that can be seen through a specific display pipeline (this encompasses both the display itself and the lens through which the display is seen).

I have to admit, from a purely perceptual standpoint, it very much feels like Vision Pro has the best looking display of any standalone XR headset to date.

And this makes sense, right? Vision Pro has an 11.7MP (3,660 × 3,200) per-eye resolution (we know this thanks to the hard work of iFixit) which means it has 2.6× the pixels of Quest 3’s 4.5MP (2,064 × 2,208) per-eye resolution.
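
For anyone who wants to check the math, the raw pixel arithmetic works out like this (a trivial sketch, nothing about lenses yet):

```python
# Per-eye pixel counts from the figures cited above.
vision_pro_px = 3660 * 3200   # ≈ 11.71 million pixels per eye (per iFixit's teardown)
quest_3_px = 2064 * 2208      # ≈ 4.56 million pixels per eye

print(f"Vision Pro: {vision_pro_px / 1e6:.1f} MP per eye")
print(f"Quest 3:    {quest_3_px / 1e6:.1f} MP per eye")
print(f"Ratio:      {vision_pro_px / quest_3_px:.2f}x")  # ≈ 2.57, i.e. roughly 2.6x
```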

That should mean Vision Pro has 2.6x the resolving power, right?

Despite that huge difference in raw pixels, there’s now compelling evidence that Quest 3 has greater resolving power than Vision Pro, thanks to the careful efforts of XR display expert Karl Guttag. That means the combination of Quest 3’s display and lens can resolve objectively more detail than Vision Pro can.

This can be seen in Guttag’s carefully calibrated through-the-lens photos below which show the same test image on both Vision Pro and Quest 3.

This is the same test image shown at the same size in each headset, zoomed in to show the limits of the resolving power. The red circle in the AVP chart indicates the center of the foveated rendering region (the chart is zoomed in such that the entire area visible here is within the foveated zone (highest quality)). | Image courtesy Karl Guttag

It’s clear when comparing the various font sizes that Quest 3 is showing somewhat sharper outlines. Even if the difference is ultimately small, this is quite incredible given how many fewer pixels it has than Vision Pro.

The lines under the labels ‘3 2 1’ are also a clear test of resolving power. As they get smaller, they require greater resolving power to appear as lines instead of just a white blob. As we can see under ‘1’ on Vision Pro, the lines are mostly indiscernible, whereas on Quest 3 you can still roughly see some aspects of the line shapes.
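
To see why shrinking line pairs are such a good test, consider the pixel grid alone: an idealized display needs at least two pixels per black-and-white line pair (the Nyquist limit), so the finest resolvable pattern scales directly with pixels per degree. Below is a rough sketch of that pixel-only ceiling; the PPD figures are loose estimates for illustration, not official specs, and Guttag’s photos show that lens sharpness can pull the effective limit well below it:

```python
def finest_line_pair_arcmin(pixels_per_degree: float) -> float:
    """Smallest black+white line pair an idealized display could still show as
    two distinct lines: Nyquist requires at least two pixels per line pair."""
    pixel_arcmin = 60.0 / pixels_per_degree
    return 2 * pixel_arcmin

# Rough, unofficial center-of-view PPD estimates, purely for illustration.
for name, ppd in [("Quest 3", 25), ("Vision Pro", 34)]:
    print(f"{name}: ~{finest_line_pair_arcmin(ppd):.1f} arcmin per line pair "
          f"(pixel-grid limit; lens blur can only make this worse)")
```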

This is the result of each headset’s own resolving power bottleneck.

Because we can’t make out individual pixels seen through the lens, Vision Pro is clearly bottlenecked by its lenses. Sharper lenses would mean a sharper image.

On the other hand, Quest 3 is bottlenecked by its resolution. More pixels would mean a sharper image, because the lenses are sharp enough to resolve even more detail.

But There’s a Reason People Are Gawking Over Vision Pro

I don’t know anyone that thinks Quest 3’s display looks bad by any means. You just don’t hear the same gushing over it compared to Vision Pro. So what gives? Is this just Apple fanboyism at work? I genuinely don’t think so.

Although it’s a major factor, there’s much more to what we consider a ‘good looking image’ than absolute resolving power alone.

From my experience with many hours in both headsets, the single biggest factor in why Vision Pro feels like it has the better display is simply what is shown on the display.

It doesn’t matter how much resolving power your lens and display have if the content you’re putting on the display contains less detail than the system can resolve.


On Vision Pro, Apple does an exceptionally good job at matching almost every piece of the underlying interface to the headset’s resolving power.

Every panel, window, icon, and letter is rendered with vector graphics, which have ‘unlimited’ sharpness, allowing them to stay sharp no matter how close or far you are from them. This means these graphics are almost always taking full advantage of the number of pixels available to render them.

No matter how close or far you get, fonts, icons, and panels are rendered optimally on Vision Pro

Additionally, default virtual environments are captured and rendered in very high fidelity, and do a consistently good job of matching the headset’s resolving power.

Even distant backgrounds are displayed in high resolution

In the browser, most web elements like buttons and fonts are rendered faithfully as vector objects, keeping them sharp no matter the distance or zoom level.

Like the rest of the interface, vector web elements render optimally at all distances and angles on Vision Pro (blurry portion in the corner shows blur outside of the foveated rendering region)

Apple also takes care to avoid situations where users could make text or other elements too small (which would make them difficult to read without more resolving power).

There’s a limit to how small you can make a window, and if you make a window small and then move away from it (making it appear even smaller), the window will automatically adjust to its optimal size as soon as you grab it again. Grabbing a window and moving it away from you also automatically scales it up at the same time, ensuring the window grows in exact proportion to how far away you move it. This means text and other elements in the window retain their optimal visual size, even as the window becomes more distant.
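
The geometry behind that distance-scaling behavior is simple (this is just the math of the effect described above, not Apple’s actual implementation): if a window’s scale grows in proportion to its distance, its angular size, and therefore the apparent size of its text, stays constant.

```python
import math

def angular_width_deg(width_m: float, distance_m: float) -> float:
    """Angular size of a flat window viewed head-on."""
    return 2 * math.degrees(math.atan(width_m / (2 * distance_m)))

def rescale_for_distance(width_m: float, old_dist_m: float, new_dist_m: float) -> float:
    """Scale the window in proportion to distance so its angular size stays the same."""
    return width_m * (new_dist_m / old_dist_m)

# Hypothetical window: 1m wide at 2m away, then pushed back to 4m.
w0, d0, d1 = 1.0, 2.0, 4.0
w1 = rescale_for_distance(w0, d0, d1)
print(angular_width_deg(w0, d0), angular_width_deg(w1, d1))  # both ≈ 28.1°
```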

Apple is able to achieve much of this extra fidelity no doubt thanks to the headset’s powerful M2 processor and the use of dynamic foveated rendering, which concentrates rendering detail only where the user is looking.
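
Conceptually, foveated rendering is just a falloff curve over angular distance from the gaze point. The curves Apple and Meta actually use aren’t public, so every number in this toy sketch is made up:

```python
def render_scale(eccentricity_deg: float,
                 fovea_deg: float = 10.0,
                 periphery_deg: float = 40.0,
                 min_scale: float = 0.25) -> float:
    """Toy foveation curve: full resolution near where the eye is looking,
    falling off linearly to a minimum render scale in the periphery."""
    if eccentricity_deg <= fovea_deg:
        return 1.0
    if eccentricity_deg >= periphery_deg:
        return min_scale
    t = (eccentricity_deg - fovea_deg) / (periphery_deg - fovea_deg)
    return 1.0 + t * (min_scale - 1.0)

for e in (0, 15, 30, 60):
    print(f"{e:>2}° from gaze: render at {render_scale(e):.0%} resolution")
```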

Quest 3 Isn’t Tapping Its Visual Potential

In the case of Quest 3, the interface is rendered largely with raster (rather than vector) graphics, including fonts. This means they only have optimal sharpness at one specific distance and angle. If you get too close to them you’ll see visible pixelation, and if you get too far they will be more prone to aliasing (which makes text look a bit like it’s jagged or flickering).

Getting close to Quest 3 interface elements shows the limits of their raster resolution. At a regular distance they look ‘fine’ but given Quest 3’s resolving power, they could be drawn sharper to reveal more detail at all distances and angles.
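
To put a number on why a fixed-resolution raster panel only looks right at one distance, here’s a rough sketch (the panel size, texture size, and PPD are hypothetical, purely for illustration): below about one texel per display pixel you see pixelation, and well above it detail is discarded and edges are prone to aliasing.

```python
import math

def texels_per_display_pixel(texture_px: int, panel_width_m: float,
                             distance_m: float, ppd: float) -> float:
    """How many texels of a pre-rendered UI panel land on one display pixel
    when the panel is viewed head-on at a given distance."""
    angular_width_deg = 2 * math.degrees(math.atan(panel_width_m / (2 * distance_m)))
    display_px_across = angular_width_deg * ppd
    return texture_px / display_px_across

# Hypothetical 1024px-wide panel, 0.6m across, on a ~25 PPD headset.
for d in (0.3, 1.0, 3.0):
    r = texels_per_display_pixel(1024, 0.6, d, 25)
    print(f"{d:>3}m away: {r:.2f} texels per display pixel")
```
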
Default virtual environments on Quest 3 don’t come close to matching the headset’s resolving power. It’s easy to spot low-resolution textures used throughout the environments (especially in the skyboxes that make up the distant background), which often clash with other textures and geometry in the same environment that actually are suitably sharp. Aliasing is also apparent throughout, making the edges of objects look jagged.

Most of the Quest 3 environments aren’t reaching the limits of the headset’s resolving power, especially the skyboxes

In the browser, web pages are rendered into a texture and then placed onto the virtual display, effectively converting all vector web graphics into raster graphics that don’t look as sharp or scale optimally at different distances and angles from the screen.

Like the rest of the interface, the Quest browser rasterizes the web page, introducing visible pixelation and lower quality depending on the distance and angle. The browser does at least re-scale the webpage to increase the level of detail if the user zooms in via the browser controls.

Ultimately, all of these little details add up to an image that doesn’t look as good as it could, and that fails to match Quest 3’s raw resolving power.

More Than Sharpness

And it should be said: sharpness of the content isn’t the only thing that makes people feel that one image ‘looks better’ than another. Factors like color saturation, contrast, brightness, and framerate can have a major impact. Even high quality sound can make people feel that one image is higher quality than another.

With an OLED HDR display, Vision Pro is capable of showing richer colors with a higher peak brightness, which further makes the headset’s image feel ‘better’ than what’s seen through Quest 3, even if Vision Pro has slightly less resolving power. It’s also significantly easier to actually find high quality video content that’s optimized for the headset thanks to native streaming apps like Apple TV, Disney+, and HBO Max.

For instance, I recently used Vision Pro to watch Mad Max: Fury Road on a plane, in 3D, with surround sound and HDR. Quest 3 simply doesn’t have the same easy access to high quality video content.

And though Quest 3 actually maxes out at a higher refresh rate (120Hz) than Vision Pro (100Hz), very few applications ever run at this refresh rate on the headset. Few Quest 3 apps even reach Quest 3’s 90Hz mode (while Vision Pro apps generally run at 90Hz).


It’s also common to see frame stutters on Quest 3 (which makes motion inside the headset look like it skips a beat). Vision Pro, on the other hand, has an incredibly steady refresh rate that rarely, if ever, stutters.

What it All Means

Ok so why does any of this matter? Well, it’s just one more example of how on-paper specs don’t always tell the full story. Human perception is way too complex to boil down the perceptual quality of a display simply to the number of pixels it has.

Additionally, Quest 3 has a ton of headroom in its resolving power. Meta’s next headset could have the exact same display and lens combo, but with more processing power (and more careful optimizations) it could provide a significantly sharper image on average.

Update (May 14th, 2024): As it became clear that many were jumping to conclusions before actually reading the article, the headline has been tweaked to be more clear about the bottom line of the article.




  • Stephen Bard

    Besides these lens/display resolution tradeoffs, I believe it is pertinent to mention that both the horizontal and vertical FOVs are dramatically/claustrophobically narrower on the AVP and the Quest 3 passthrough is much brighter.

  • dojheuipccuhcdyvwh

    I’m perplexed why Meta chose to do their UI with raster graphics. It looks terrible.

    • alxslr

      Vector graphics performance cost? At least for large text, I assume, but I could be wrong.

      • Arno van Wingerde

        That would be a good question for Christian. Personally, I cannot see this being a factor for just about any processor on the market today – and certainly not for the Q2/Q3 processor.

        • Christian Schildwaechter

          [You asked, so I’ll blame the wall of text on you instead of now spending a lot of time with trying to bring it down to a reasonable length.]

          Rendering everything as vector graphics would come with a very significant performance cost, esp. since this rendering is usually done on the CPU. Compared to that, simple pixel operations on the GPU are almost free. Which is why nobody, incl. Apple, uses (mostly) live vector graphics for the UI. Instead, most of the UI vector graphics are rendered to textures that get moved around/scaled/hidden on the GPU, with only occasional changes requiring re-rendering. This “caching in pixels” was already used in the late 80s on NeXTSTEP, with a UI based on the vector-based Display PostScript. Rendering PostScript on printers was computationally very expensive, so Display PostScript used a lot of shortcuts and tricks to be usable for a UI.

          Apple swapped Display PostScript for Quartz (“Display PDF”) on Mac OS X, which was massively sped up 20 years ago in Mac OS X 10.2 by shifting much of the work to the now widely available GPUs. They also rely a lot on supersampling to make everything look smooth. Modern Macs all render at a much higher resolution, preferably double the physical pixel size, and have the GPU scale it down to the actual screen size. The OS only gives a relative size, and apps are expected to provide e.g. very large icons that can dynamically be adapted to whatever resolution the user currently selects. So from the point of view of an app, most of the graphics/UI are vector based with a theoretically infinite resolution, but the display server renders as little as possible as actual vector graphics and instead relies heavily on the speed at which GPUs can move pre-rendered textures.

          And it’s not only Apple. A couple of years ago Unity, which is used to create most XR apps, integrated the TextMesh Pro asset from a 3rd-party developer, which used similar methods to replace the ugly pixel-based Unity font system with a much prettier and more powerful vector-plus-pre-rendering-to-textures approach. And at least since Oculus Go, the Oculus SDK has included an image compositor that can overlay a dedicated layer with text rendered at very high resolution over the remaining image, to successfully “fake” a much higher render resolution than the hardware would be capable of in real time.

          The XR2 is of course much, much faster than a 1987 68030 running Display PostScript or a PowerPC running Quartz Extreme in 2002. Rendering a vector-based UI in real time may be more feasible today than it was 20 years ago, but it would still waste a lot of resources, as we have well-established ways to combine the advantages of vector-based graphics with the insane speed of GPUs when handling pixels, in a way where neither users nor developers really notice the difference compared to a pure vector-based system.

          The problem is often changing existing systems. AVP inherited Quartz from MacOS, which inherited DPS from NeXTSTEP, which started from scratch, but this now gives all Apple platforms a (seemingly) vector based rendering system by default. Windows introduced something similar with Vista, but kept compatibility to previous pixel based development, which made it much harder to e.g. properly support high resolution displays, something that Linux with ancient rendering systems still struggles with. And unfortunately most XR developers just use the text rendering in Unity (improved thanks to TextMesh Pro), instead of using the high resolution text layer of the compositor on Quest, which would achieve a much better text display without having to massively crank up the overall rendering resolution at high GPU cost.
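
As a toy illustration of the ‘caching in pixels’ approach described in the comment above (a sketch of the general idea only, not Apple’s, Unity’s, or Meta’s actual code): rasterize a vector label at roughly the pixel size it currently occupies on the display, reuse that texture every frame, and only re-rasterize when the required size drifts far enough to matter.

```python
import math
from functools import lru_cache

STEP = 1.25  # only re-rasterize when the required size changes by ~25%

def quantize(px: float) -> int:
    """Snap a pixel size to a coarse geometric step so that small distance
    changes reuse a cached texture instead of triggering a re-render."""
    return max(1, round(STEP ** round(math.log(max(px, 1.0), STEP))))

@lru_cache(maxsize=256)
def rasterize_label(text: str, px_height: int) -> str:
    """Stand-in for the expensive CPU-side vector rasterization; in a real
    system this would return a texture handle uploaded to the GPU."""
    return f"<{px_height}px bitmap of {text!r}>"

def draw_label(text: str, angular_height_deg: float, ppd: float) -> str:
    needed_px = angular_height_deg * ppd  # display pixels the label spans right now
    return rasterize_label(text, quantize(needed_px))  # GPU scales/places the cached bitmap

print(draw_label("Settings", angular_height_deg=1.0, ppd=25))  # nearby sizes reuse the same texture
```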

      • Ben Lang

        Yes, almost certainly there is performance overhead. Also, Quest’s origin mostly as a gaming device means much of it is built with game-like tools and workflows, which mostly means 3D engines and rasterized textures.

  • wheeler

    My own experience is that while the quest 3 may have better resolving power than the Vision Pro, *what* is being resolved–that is, the quest 3 display itself–just looks awful in comparison: pixelation, screen door effect, mura, washed out LCD contrast, and poor colors. And there are other problems too, for example the image is by comparison very swimmy/unstable as distortion is not corrected through the eye tracking, and the pixelation is amplified as it is diagonally oriented (whereas most things you actually interact with are horizontally or vertically oriented).

    I do think you’re right about Apple doing a lot with the content that’s being displayed, but it’s also just the display itself. The numerous problems with low pixel density counter-rotated LCD displays are actually amplified by the quest 3’s optical resolving power.

    Another interesting question is whether or not one would *want* to see the pixels on the Vision Pro display and whether or not the blurring is intentional (as some VR engineers have stated). In my experience the Vision Pro is the only headset where I can almost completely forget that everything I’m looking at is being mediated through screens, but if I could once again resolve individual pixels I may not feel that way anymore.

  • Dragon Marble

    This is obviously crap. Especially for Ben, who had seen through the Vision Pro lens, and yet chose to believe someone else’s pictures rather than his own eyes, then came up with elaborate explanations rather than question the techniques used in Guttag’s analysis.

    Just try to work in both headsets. Adjust the window sizes to be the same, and find the smallest font you can read on each headset. You’ll see it has nothing to do with content, and there is no debate as to which one has higher resolution.

    Through-the-lens photos are notoriously bad for judging headsets. How do you even take such pictures when the Vision Pro requires eye tracking to function properly? It won’t even work for another person’s eyes, let alone a camera.

    • Ben Lang

      The whole point of the article is that the through the lens photos accurately reveal Quest 3’s maximum resolving power, but that the software scarcely takes advantage of it. Vector graphics used throughout AVP provide optimal font rendering at all distances and angles, which makes a big difference compared to raster font rendering.

      • Dragon Marble

        OK, maybe that’s true for Q3. But the comparison with AVP is not valid.

        I don’t trust any through the lens photos for AVP. It constantly requires eye-tracked foveated rendering. If someone else borrows your headset without redoing eye calibration it could be a painful, unusable experience. Sometimes rolling my eyes and blinking at the same time is enough to cause the visuals to go haywire.

        The bottom line is: there is text that’s too small to read on Q3 but perfectly fine on AVP. AVP has much higher resolution than Q3, period. I haven’t heard anyone say they can’t work in AVP because of resolution. Guttag’s techniques applied to XR are a perfect example of bad science.

        • Ben Lang

          I normally don’t trust through-the-lens photos because they are never calibrated. These photos are calibrated. It’s understandable if you missed it, but I explained that the red circle in the test chart photos shows the eye position / center of the foveated rendering (this is an option you can enable in AVP accessibility settings).

          • Dragon Marble

            There are still many unknowns. Does it even work as expected when the headset can’t detect any eyes? The lenses are optimized for our eyes with tiny pupils. When you put another set of lenses in front of them, pupil swim may translate to blurriness.

            OK, I am not an expert and I don’t want to speculate. But I don’t need to. I have both headsets. I can simply put the same sized (virtual) text on my wall and confirm that I can read it from further away on the Vision Pro.

          • Teku

            I don’t understand what point is being made. To my eyes, the Q3 image looks more pixelated, jaggedy, and pixel-doorish, and the AVP looks a little fuzzier, but smoother and cleaner. Even with these through the lens photos, I’d much rather read a load of text that looks like the AVP example than the Q3 example.

          • Arashi

            It doesn’t even look ‘fuzzier’. There’s no way I’m going to watch a movie on my Q3, it doesn’t look good. On my AVP it looks as good as my super high-end TV, which is quite incredible. Same for sharing a desktop: no way I’m going to work in my Q3, it looks way too fuzzy, unsharp, jagged, etc. On my AVP it looks pretty much as good as on my monitor. Not sure who put all this stuff into Ben’s head but it’s really plain nonsense.

          • Ben Lang

            The point:

            Quest 3’s lens/display can objectively show more detail. We know this because in the test chart we can see the lines under the ‘1’: for Quest 3 we can just barely make out the orientation of the lines, but for AVP we pretty much just see a blur. This is because AVP’s lenses do not have enough clarity to discern between lines that are that close together. You could double AVP’s resolution and it wouldn’t make much difference, because its resolving power is limited by its lens.

            So while it’s technically true that Quest 3 has slightly greater resolving power, Meta is unable to capitalize on this for various reasons relating to processing power and software optimization. On the other hand, Apple does a much better job of making maximum use of its resolving power, so it generally shows a superior image in real use.

          • Christian Schildwaechter

            The flaw is assuming that human vision also works “objectively”, which it absolutely doesn’t. The sharpness in the still image is bought with aliasing artifacts during head movement, which make recognizing objects harder for humans. We use anti-aliasing in games precisely because a fuzzier image is less irritating than pixels visibly jumping between frames due to low resolution. AA “objectively” reduces resolution, but the result looks a lot less disturbed, esp. in motion, which we perceive as better/higher resolution. So “sharp” is great for a static 2D display, not so great for a dynamic 2D display, and potentially distracting on a low-PPD HMD where flickering edges mess with object/edge recognition.

            You could turn the argument around: Meta has to go with the sharper images on Quest 3 because they don’t have enough pixels for blurring away the pixel edges, like on AVP, to be acceptable. And you won’t realize this by only looking at a static frame, as what we see in the HMD is constantly shifting due to minimal head movements. AVP looking better has a lot more to do with how human vision works than with how Apple renders the UI.

        • Christian Schildwaechter

          Guttag’s techniques are fine, it’s just that our vision doesn’t work like a camera, and an image that looks worse in “pixels” may look better to us. Which is what happens on AVP. The blur seems to be caused by the image being a bit out of focus through deliberate lens placement, to remove any visible pixelation. This will look less sharp to a camera, which only records pixels, but look more natural to us. “Blurry”, anti-aliased text is easier to read when text isn’t rendered perfectly aligned with the display’s pixels, something pretty much impossible in XR due to head tilt. We don’t read by detecting pixels; instead the brain recognizes the shape of words and letters, which is more consistent with anti-aliased text, as we don’t expect moving jagged edges on things.

          Guttag’s images are probably correct in that a still frame looks sharper on Quest 3. Only AVP’s reduced sharpness isn’t an accident, but instead a clever way to make use of our vision automatically adding missing details/resolution to recognized objects. So while technically the AVP image is blurrier, this aids with recognition, making things easier to read, and is perceived as higher resolution. Because eyes aren’t cameras.

          • Dragon Marble

            Human perception is an important part — which already makes his analysis pointless — but through the lens photos are also objectively unreliable representations of what we see in the first place.

            Headset lenses are designed to work together with our eyeballs. If you swap them out for another set of lenses, there will be some loss of quality. In this case the AVP lenses may be more sensitive to this degradation than the Q3 lenses, making the comparison invalid.

      • Arno van Wingerde

        Yes, but that message gets buried under a provocative headline. I have not tried the AVP myself, but I feel convinced it is better – although part of that is just Apple’s marketing. However, I have often wondered why Meta does so little to improve the graphics.

        For instance, most home environments, the first thing the user sees, look incredibly low-res to me; the difference to the AVP backgrounds is enormous, and anyone who has watched movies on Quest 3 can testify that they are so much lower quality than the Quest 3 hardware can produce.

        Meta often seems to do its utmost to piss off the users. I certainly hope they learn from Apple as Apple is learning other stuff from Meta…

        • Stephen Bard

          The annoying low-resolution of the Quest home environments is completely unnecessary. If you open the Brink Traveller app on your Quest 3 and press the right home button, you can then fully use your Browser or other windows with these incredibly beautiful hi-res backgrounds (but only one window at a time).

          • kraeuterbutter

            and even the Brink Traveller app uses low resolution..
            try game optimizer with the Brink Traveller app –> you can pump up that resolution a lot more, making it look even more amazing!

          • kraeuterbutter

            Brink Traveller before the update:
            1440 x 1584 render resolution (like on Quest 2)

            after the update for Quest 3: 2016 x 2112
            so about 35% more pixels per axis..

            but: you can crank it up to around 3000 x 3000 pixels with Quest Game Optimizer
            so nearly 100% more pixels.. and it still runs smooth enough but shows a really nice picture then

        • kraeuterbutter

          360 panos in the Quest 3 browser – how most people would at least at first look at them – have much lower resolution than in other apps..
          another example that they do not use the hardware…

          most games, existing games from the Quest 2 era: you could apply a lot of supersampling to them – but there’s no option for that..
          most games still look blurry on the Quest 3 and you need a tool like Quest Games Optimizer or SideQuest to unlock the real potential of this headset

    • Arashi

      This article indeed makes no sense. My AVP is WAY sharper than my Quest 3. They’re not even close

    • Arno van Wingerde

      What you might try is to create some text lines, make a picture out of that, and then compare views of that picture in both headsets. That way, you eliminate the Apple vector graphics and compare apples to Apples.

      • kraeuterbutter

        you can do that.. but… why ?

        when people using both headsets say they can read text better, everything looks sharper to them, they can’t see a screen door effect..
        it’s the best experience in picture quality of any headset they ever tried..
        what’s the point for them of finding an artificial setup to make it look worse in comparison?

        for most people, with most software and most situations it seems: they think the avp is sharper..

        in the end, that is what counts

        • Christian Schildwaechter

          It would still be interesting to know how much the different factors impact perceived sharpness. So far the candidates are display resolution, pixel separation and vector graphics/super sampling, all of which will have some influence. So if you create a situation where you only compare resolution or blur or rendering, it becomes easier to determine the cost/benefit of each.

          If vector graphics have a huge impact, it may be worth it to throw more performance there even on lower-powered HMDs. If blur turns out to look worse in through-the-lens images, but actually makes the image seem clearer because our perception just assumes it must be sharp due to indiscernible pixels, then placing lenses slightly out of focus could improve things at basically zero cost. If in the end all that really counts is the number of pixels, it might be best to focus on better displays first instead of other features.

          I assume that both Meta and Apple (and Samsung and Google and Qualcomm and …) have tested this extensively and therefore already know, but it would still be nice to understand why exactly they went with different configurations.

  • Kevin Brook

    I’ve not used an AVP but I used to have a 2448 x 2448 per eye Vive Pro 2 and the Quest Pro with 1800 x 1920 per eye looks dramatically better. Even using the eye chart in the Steam environment I could read text more clearly in the Quest Pro than the Vive Pro 2. I put it down to the Vive’s terrible lenses.

    • Guest

      Yes, the Quest pancake lenses are superior. Years ago, Zuckerberg bought the patent for that very pancake lens design from a little company called eMagin, a microOLED manufacturing company that Samsung would acquire years later. They made these screens for the US military for decades, being used in jets and the like. It would’ve been nice if eMagin still had that patent so Samsung could use it for their own XR headset. It would make for stronger competition.

      • Andrew Jakobs

        Uhm, Meta isn’t the only one using pancake lenses, it doesn’t have the patents on that one.

        • Guest

          Not all pancake lenses are the same. If you try the Pico 4 and then the Quest 3, you will understand what I mean.

          • XRC

            This is true of all lenses regardless of type, if you examine Fresnel, aspherical and pancake from different manufacturers you will be surprised at differences.

    • Smanny

      Look, at the end of the day no one goes around saying that a 4K TV has 4 times the number of pixels of a 1920x1080p display. We say that 4K is double the resolution of a 1080p display. So the Vision Pro resolution is 74.5% more than the Quest 3 in width (x), and the Vision Pro resolution is 45% more than the Quest 3 in height (y). This makes more sense, and it is easier to take in and see the physical difference.

    • Mike

      Vive Pro 2 was a bad product. The stereo overlap was so low it was unusable for me.

      • kraeuterbutter

        i own(ed) 11 Headsets so far..
        the Vive Pro 2 was the only one i had to return.. i could not use it..

  • eadVrim

    I didn’t try AVP, but for me the new Quest 3 lenses were a game changer in VR compared to all previous VR headsets since the DK1

    • Ben Lang

      Yup, they’re excellent, and clearly sharper than AVP’s lenses!

      • eadVrim

        And because these lenses are flat, the 3D depth looks more natural across the whole field of view.

        • Ben Lang

          Hmm not sure what you mean?

          • Smanny

            Ben saying that there is around 2.6 times more pixels is not showing the real picture. Especially since the resolution on the Vision Pro is 3660×3200 per eye. It is easier to visualize, and understand that the Vision Pro has 74.5% more pixels in width (x), and 45% more pixels in height (y), over the Quest 3. Think of it this way, we don’t go around saying that a 4k TV has 4 times the number of pixels over a 1080p display. We say that a 4k TV has double the resolution over a 1080p display.

      • Игорь

        From what I know, AVP is intentionally slightly out of focus to mask the screen door effect, which is… not a bad approach, especially with OLED (OLED has smaller subpixels than LCD, making gaps between them more visible even at higher resolution). Plus the photo in the article is blurrier than the AVP actually is, though not by a lot.

        But generally the resolution difference is not as noticeable as I thought it would be, HDR OLED image quality is a bigger factor for me here.

        That said, Q3 is still a very impressive device at 1/4th the price ($4K for AVP and around $1000 for 512GB Q3 with elite battery headband and controllers that actually work).

      • Christian Schildwaechter

        Others concluded that sharpness on AVP was reduced deliberately by placing the lenses outside of their focal length from the displays, causing a slightly blurry image. This makes single pixels indiscernible and hides any screen door effect. The “less sharp” lens setup is basically optical anti-aliasing to improve the viewing experience, with the (expensive) displays still allowing for enough effective resolution after being artificially blurred.

        I somewhat doubt Apple would go for USD 700 displays, but then cheap out with subpar lenses. Smoothing the image with blur at the cost of wasted resolution would also be a very Apple thing to do. Any (non-retina) display will cause aliasing effects, with multiple approaches to counter these, all at the cost of sharpness. It’s a design choice, with some like Karl Guttag preferring sharpness, which works better with e.g. static text, while others prefer a smoother image that looks better with moving objects. So the Quest 3 lenses may not be sharper themselves, but instead installed in a way that prioritizes sharpness to improve readability at the lower resolution.

    • STL

      PSVR2 looked better to me, but I abandoned it for lack of good games.

      • david vincent

        PSVR2 has DisplayPort, nothing beats the clarity brought by DP

    • Andrew Jakobs

      I guess you hadn’t tried the Pico 4 then, as that already had excellent lenses before the Q3.

      • kraeuterbutter

        it was – for me – a milestone, the Pico 4 lenses (FOV, weight, form factor, e2ec) compared to Fresnel lenses..
        BUT: the Quest 3 has better lenses, less reflection, less glare and: better distortion profiles

      • eadVrim

        Yes, I was only on PC VR headsets, Quest 3 is my first standalone headset.

  • guest

    Like those I/O pics are troll bait to try getting leaks on Samsung’s new devices!

    • perVRt

      I will bite. Hint: The pen is mightier…

  • Apple

    What about Google cardboard?

    It uses your phone? Too simple

    Yeah Google cardboard is like a simple version then people will Switch to another service

    Please tell me

    • ViRGiN

      Google Cardboard has never been a problem. The problem was calling any smartphone fitted into a random plastic case mimicking a VR headset a “Google Cardboard”, while the biggest issue has always been the lack of proper lens distortion correction.

      Gear VR was an excellent, tailored “google cardboard” alike experience.

      • Ardra Diva

        I loved GearVR. Samsung pumped a massive amount of content out, the app store dwarfs that of Oculus/Meta. But, they only really tried for 3 years then… died.

  • JB1968

    This article is full of bullshit. Quest3 will always look bad because their crappy mobile chip can’t handle native resolution of the displays even for 2D UI in vector graphics. Especially if they use some shitty Android based gfx framework for their “os”. And don’t let me start about the washed out lcd display quality, my eyes are bleeding…

    • Hussain X

      “Quest3 will always look bad” but it can be freely powered with a 4090 too. Plus the significantly more expensive AVP doesn’t offer 7x the benefits of the Quest 3.

  • Stooovie

    AVP optics are specifically tuned to be slightly out of focus to avoid the screen door effect. Boom, discussion over.

    • Andrew Jakobs

      You have proof of that claim? With the much higher resolution displays Apple is using, SD wouldn’t even be that much of a problem anymore. We’re not talking about the SD of the original Vive or Oculus Rift; even the Quest 3 already has almost nonexistent SD compared to those. Yeah, you still see it if you really start looking for it, but in normal use you don’t really see it.

  • game-tea

    Isn’t this because the AVP’s lenses are ever so slightly out of focus to mitigate screen door? Also, I must say the terrible contrast of the Quest’s LCDs is significantly more annoying than a slightly lower “effective resolution”, especially when it completely breaks immersion in dark parts of HL: Alyx because you can’t see shit.

    • Ben Lang

      It’s unclear if the ‘out of focus’ is intentional or not, but it’s true that it hides aliasing/screen door.

      • Christian Schildwaechter

        Do you really believe that Apple created a USD 3,500 HMD based on insanely expensive hardware, accidentally positioned the lenses in a way that the image is always slightly out of focus, and then never realized this during a decade of development? If the image is slightly blurry, it is because Apple wanted it to be slightly blurry for a specific reason.

        • Ben Lang

          Indeed that may be the case, but it’s such a curious decision that I’d want to hear the claim from someone who was actually involved rather than just inferring it.

          • Christian Schildwaechter

            While this is a reasonable approach in general, I seriously doubt you’ll ever get that, based on Apple’s history regarding explaining their internal design decisions. If telling others about a particular feature isn’t directly usable to promote the platform, we are usually lucky if it even makes it to the specs.

            A question like “Why we made AVP’s image look blurrier than the competition” wouldn’t even be acknowledged, let alone answered, as it implies that Apple might have done something wrong. Apple’s decisions are always right and in the best interest of their users, until the day they make a 180° turn, tell everybody about this revolutionary new thing they have come up with, and just never mention the past again. Oceania has always been at war with Eastasia.

          • Arno van Wingerde

            Oh ye of little faith! You realize that by uttering such heresies you will never go to Apple heaven?

          • Christian Schildwaechter

            Damn! I was hoping that if I’d just complain loud enough, they’d instead try to bribe me over to their dark side, and I wouldn’t have to pay those unholy RAM and flash prices anymore. I’ll eventually have to fully switch to Linux to evade both getting burned at the stake and getting burned by Apple’s upgrade pricing.

          • kraeuterbutter

            well.. Samsung was promoting their “Anti-Screendoor filter” on the Odyssey+
            — which made the picture less sharp, more soft

        • Guldhammer_DK

          Nice to get some short, well-reasoned answers here… thank you.

  • Arashi

    This really is the worst article I’ve ever seen on this website. Zuck should first upgrade the resolution so I won’t be staring at pixels anymore. Then put OLED into the headset so that the colors won’t look dull anymore. And THEN we can talk about comparisons which might or might not make sense. But right now I’m staring at dull colors with clear pixels visible, so any comparison to the AVP is just really laughable.

    • Andrew Jakobs

      Sorry, but OLED over resolution for me, and bigger FOV over resolution. And let’s not forget, AVP is 7 times as expensive as the Q3 but it sure as hell isn’t coming close to being 7 times better, it’s hardly 2 times better. So the Q3 is actually doing a much better job for the price it’s sold at than the AVP. But leaving out the price, yeah, the AVP is better; taking price into account it is far from better, hell, the AVP doesn’t even come with controllers for actual VR games and the hand tracking isn’t even that good.

      • Arashi

        But that’s a whole different discussion. This article is about some guy wanting me to believe that the AVP has a “higher effective resolution”. It’s just big time nonsense.

        • Christian Schildwaechter

          It all depends on how you define “better” or “effective resolution”. If you are ONLY looking at sharpness in static images, AVP intentionally blurring pixels means you’d need a much higher resolution to get it sharp again, as you have to widen lines/edges until the blurred part becomes insignificant. Extreme example:

          – On a static display, alternating black and white pixel columns creates eight very sharp lines. [bwbwbwbw]
          – Anti-aliasing/blur would give you just gray at the same column pixel width [gggggggg]
          – Doubling the resolution/column width results in gray with thin black and white stripes [bgwgbgwg].
          – Only once you quadruple the pixels do you get distinguishable edges again, but you now need 32 pixels for eight somewhat sharp lines [bbbgwwwg].

          AVP blurs via optics, so this doesn’t translate 1:1, but looking only at this one characteristic (sharpness in still images), the much higher resolution in AVP isn’t sufficient to create a similarly sharp image as Quest 3. But any comparison generalizing cherry-picked features is pointless by default. As is claiming something is “better” without first naming the implied use case.
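
A tiny numeric version of that thought experiment (a rough sketch only: a 1D line pattern with a fixed Gaussian blur standing in for the optical blur, which is not how AVP’s optics are actually characterized): contrast collapses when the lines are one pixel wide and recovers as they get wider relative to the blur.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def line_contrast(line_width_px: int, blur_sigma_px: float = 1.0, n_pairs: int = 8) -> float:
    """Michelson contrast of alternating black/white lines after a Gaussian
    blur of fixed width (a crude stand-in for roughly one pixel of optical blur)."""
    pattern = np.tile([0.0] * line_width_px + [1.0] * line_width_px, n_pairs)
    blurred = gaussian_filter1d(pattern, sigma=blur_sigma_px, mode="wrap")
    return (blurred.max() - blurred.min()) / (blurred.max() + blurred.min())

for w in (1, 2, 4):
    print(f"{w}px-wide lines: contrast ≈ {line_contrast(w):.2f}")
```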

      • Ardra Diva

        How is it even “2x better”?

  • d0x360

    OLED is the only display tech available that should be used in VR. Instant pixel changes, blacks, no ghosting… LCD is garbage

    • ViRGiN

      I agree. Even the “worst” oled used in vr headsets, no matter the trade offs, was much better than any LCD.

      • Dawid

        Is PCVR dead?

        • ViRGiN

          Always has been.

    • Andrew Jakobs

      No ghosting? OLED has even worse ghosting/smearing than LCD. The only real thing I can give you is the blacks (and color reproduction), but that is one of the most important things. I still love/use my HTC Vive Pro over my Pico 4; although the Pico is much sharper and much more clear, I do like the deep blacks of the Pro (ok, and I like the Vive wands and Index controllers much more than the Pico controllers).

    • Yan Briot

      Black smearing… Do u know that?

  • As others have said, Apple has put the lenses out of focus on purpose to reduce any screen door effect. The AVP has a higher resolution, so it can potentially show much more detail… but they are limiting it as a deliberate technical choice.

    Interesting analysis btw

    • Andrew Jakobs

      I doubt Apple really put the lenses out of focus due to SD, as the Q3 already doesn’t have overly visible SD (unless you really look for it); with the displays of the AVP, SD would be even less visible even if you really looked for it. I think it really is just a limitation of the lenses used by Apple.

      • Christian Schildwaechter

        SD refers to seeing dark lines between pixels, which goes away with higher resolution. But you still recognize the individual pixels much larger than the space between them, and this becomes more noticeable if an object slightly moves and single pixels on the edge get turned on or off.

        Out-of-focus lenses would help with SD, though AVP’s high resolution should hide it even with proper focus. But the lens configuration also blurs the edges of individual pixels, making them appear rounder, more connected, and less sharp. Which reduces edge pixels noticeably being switched on and off during motion, or the grid-of-squares structure becoming obvious during rotation.

        In the static through-the-lens image, the circle on Quest 3 shows a lot more pixel stairs and the white background is visibly built from single pixels. The circle on AVP looks rounder and the background more homogenous due to the pixels being blurred into each other, at the price of fuzzy edges. Motion would emphasize this even more.

        So it’s probably not a technical limitation, but a design feature. Just not for reducing SD, but pixel distinction.

  • ToreTorell

    Thanks for a very interesting article Ben. Based on the info here, shouldn’t it be possible to create a small demo PC VR app that actually looks crisper/sharper on the Quest 3 than a similar app on Vision Pro?

  • kraeuterbutter

    Is there any chance that:
    a) the pictures were not perfectly in focus?
    b) there are different AVPs out there, with potentially varying lens or mounting quality?

    I’ve noticed that Karl Guttag is considered an expert in through-the-lens imagery.

    I’ve seen through-the-lens videos on YouTube from VR YouTubers, and often the errors in taking the pictures are more significant than the actual differences between the headsets. As a result, sometimes a headset with higher resolution shows less detail than another, and in the next comparison, it’s the other way around.

    so: this should not be the case with Karl’s tests (?)

    And regarding point b), how many AVPs were tested?

  • impurekind

    Just tells me that Meta really needs to do better across the board then. Get all those UI elements in vector graphics to scale properly. And definitely get all those awesome stereoscopic 3D movies on the headset too, for everyone to easily access and watch. I STILL can’t watch The Super Mario Bros. Movie in 3D on my blimmin Quest 3!

    • Ardra Diva

      Can’t imagine why not. It’s great for 3D movies and video. I watched some just last night, and the YouTube app also has lots of 3D content.

      • impurekind

        Yeah, I honestly have no idea why it isn’t exceedingly simple to get that movie in stereoscopic 3D for my Quest 3.

  • Apple’s headset does not look better, it looks silly. As usual, they are late to the market with overpriced, overhyped junk. Spend 5x to 10x as much as their established competitors, who have better products, and pretend you’ve joined some sort of elite society of superior humans, right out of a 1970s sci-fi movie. It’s the same old scam they’ve always been pulling on the weak-minded and pretentious for 40 years.

    More Apple fanboy-ism. Pretend you didn’t waste your money. Sunk cost fallacy.

    And this website is getting reduced to an Apple fanboy page. It’s nice that you’d point out something anybody else could have told you (and somebody else DID tell you, with proof). Resolution is one of DOZENS of ways the Quest 3 is superior. Just one.

  • Ardra Diva

    Meta’s continual software updates for Quest 3 have greatly improved its passthrough cameras. You could already read a text or newspaper while wearing it, but the stability now is significantly better. The AR/MR content is jaw-dropping.

  • WilliamTellit

    I’ve owned the Vision Pro since launch, and I also own four Oculus headsets, and compared to the AVP, they are all toys. I respect Meta and Oculus for what they’ve done, and I appreciate their technologies, but again, compared to the Apple Vision Pro, they are toys. Having several extremely well-anchored windows placed in different rooms is something that has to be experienced. Once serious developers begin working with it and releasing more content, this will truly shine. You have to remember that when the iPhone came out, there were no pinch gestures, App Store, or even an easy way to put music on it. Technological evolution takes time. I’m having a blast with this thing every day!

  • Niels

    Who else read the "ri" in Arial as an "n" in the detail resolving image?
    Around the third smallest "Arial" I realized it isn't what I thought.

  • Spaz

    Anyone else seeing the word "Arial" change from 11 points or smaller on the Quest 3 sample image ;)