The recently released visionOS Beta 6 contains a video showing how users will scan their face to create their avatar using the Vision Pro's cameras. Perhaps more interestingly, the video shows that Apple plans to use the external display for more than just showing the user's eyes through the headset.

Probably the most unexpected thing about the Apple Vision Pro reveal is the headset's external display, something no commercial XR headset has shipped with to date. Apple calls this the EyeSight display because its primary function is to show the wearer's eyes 'through' the headset, so people nearby can tell whether the wearer is looking at them or is fully immersed and unable to see.

Image courtesy Apple

Technically, the EyeSight display isn't showing the user's real face. It's projecting a view of their Vision Pro avatar (or 'Persona', as Apple calls it). Apple masks this fact with a stereoscopic display and some clever blurring and coloring effects that hide the avatar's limited resolution and quality.

To generate the avatar, users will use the headset's own cameras to capture multiple views of their face. The exact procedure was found in the files of visionOS Beta 6, which developers can access.

In the video we see a fairly quick and easy procedure, with the headset's external display acting as a step-by-step guide through the process.

The scanning process is interesting in itself, but perhaps more interesting is the way Apple is thoughtfully using the external display to help guide the user.

It seems likely that Apple will leverage the display for more than just showing the user’s eyes and guiding them through the scanning process, which opens a bunch of interesting doors.

For one, the display could be used to let the headset communicate with the user in other ways when it isn't being worn. For instance, it could light up green to indicate an incoming FaceTime call; blue to tell the user that a large download has finished; or red to indicate that it's low on battery and should be plugged in.
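To make the idea concrete, here's a minimal Swift sketch of what such an event-to-color mapping could look like. Note that visionOS exposes no public API for driving the EyeSight display, so the StatusEvent and ExternalStatusLight names below are purely hypothetical illustrations, not real API.

```swift
import SwiftUI

// Hypothetical sketch only: visionOS exposes no public API for driving the
// EyeSight display. `StatusEvent` and `ExternalStatusLight` are invented names
// used purely to illustrate mapping headset events to glow colors.
enum StatusEvent {
    case incomingFaceTimeCall
    case largeDownloadFinished
    case lowBattery
}

struct ExternalStatusLight {
    /// The glow color the external display might show for a given event.
    func color(for event: StatusEvent) -> Color {
        switch event {
        case .incomingFaceTimeCall:  return .green  // incoming FaceTime call
        case .largeDownloadFinished: return .blue   // large download finished
        case .lowBattery:            return .red    // low on battery, plug in
        }
    }
}
```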


While there’s nothing stopping Apple from literally just putting text on the display and going full Daft Punk, the company seems to be thinking of the external display as something a bit more organic and magical than a readout of how many emails are waiting for you or how many calls you missed.

Can you think of any other interesting use-cases for the headset’s external display? I’d love to hear more ideas in the comments below!

  • CrusaderCaracal

    Who gives a shit

    • Cris

      Well, you seem to give enough shit to make a fool of yourself on the internet. So there’s that.

      • CrusaderCaracal

        Shut up

  • Gannable

    For those non-VR folks, the digital world is about to take the next leap. It is absolutely brilliant that Apple is providing an external screen on these devices, and it is totally game changing in that you can be in a completely fabricated world inside while presenting yourself totally differently externally (and yes, think about the huge role the eyes play in face-to-face communication!).
    Think about the massive opportunity for you to be in your world, the way you want it, while projecting another ‘representation’ of you that matches that of your audience on the outside or, better yet, what YOU want to project to them: high class, derelict, or something in between. Whatever the situation, you can now change instantaneously based on the audience. Needless to say, the fact that I could sleep through another boring meeting while my avatar’s eyes are totally engaged with the speaker will be priceless!
    Although the point above is totally overlooked by the author, the article lightly touches on the fact that the Vision Pro external display is capable of fully replacing any current phone/tablet/computer. This capability will absolutely make cell phones and tablets obsolete sooner rather than later, and any dev reading this should consider the outside display a surface as important as (if not more important than) the inside display, since you CAN replicate any app on the outside display just as you would on a mobile device, if you wanted. Apple seems to have realized that providing an easy, quick, and ‘familiar’ surface for getting instant information from a headset can solve one of the Achilles’ heels of MR, given the somewhat tedious work of putting on a headset before any experience can start. If Apple is smart, it will maximize this feature to bridge the gap between these two worlds (mobile flat screen vs. immersive) until we can move beyond flat-screen mobile form factors and get into really good full-time immersive headsets with very low usability friction.

    • Guest

      Yeah, and some of your wicked coworkers are going to hack it to make it look like you are sleeping through the meeting when you are really paying attention. It may even play back some footage of you telling your spouse how much of an idiot your boss is!

      • Gannable

        Wow, if some of my coworkers could do that then I’m starting a new company with them!!!

    • Traph

      > the Vision Pro external display is capable replacing any current phone/tablet/computer fully
      https://uploads.disquscdn.com/images/3241489efaeeba4ddd8735088101e015ce46ce5eac96039e2b6429453ef9d272.gif

  • Bryan

    It could display a language translation for you to aid in travel outside your current language’s region.

  • Bryan

    It could also show others something of what you are seeing, if you wanted to share.

    On a plane it could display instructions to the flight attendant so they don’t have to interrupt you to ask what you want to drink.

    It could be used to augment some wicked Halloween costumes.

  • CoolHandMoosh

    Would be interesting as an authentication tool. The external display could show a unique image or animation (like an Apple Watch or HomePod initial pairing animation), or a simpler, more compatible QR code, so that the other connecting hardware (phone, laptop, second headset, basically anything with a camera) can link for a shared experience, file transmission, exchanging contact details, etc. Getting up close to someone’s face for NFC or a bump is super awkward, and taking out your phone when you already have suitable tech accessible feels like an avoidable step. May sound silly, but it would be no different from an Apple Watch & Wallet interaction happening from your face, leveraging Optic ID for external validation when approving transactions, showing tickets/passes/proof of purchase, bypassing manual password entry on locked devices, etc.
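On the QR code suggestion above, the generation step at least is something Apple platforms can already do; below is a small Swift sketch using Core Image's built-in QR generator. There is no public API for rendering anything to Vision Pro's external display, so that part, along with the makePairingQRCode helper name and the idea of a one-time pairing token, is purely an assumption for illustration.

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Sketch of the commenter's pairing idea: encode a one-time pairing token as a
// QR code that the external display could (hypothetically) show for another
// device's camera to scan. Only the QR generation step uses real API here.
func makePairingQRCode(token: String) -> CIImage? {
    let filter = CIFilter.qrCodeGenerator()   // Core Image's built-in QR generator
    filter.message = Data(token.utf8)         // payload to encode
    filter.correctionLevel = "M"              // medium error correction
    // Scale up the tiny native output so another camera can read it at a distance.
    return filter.outputImage?.transformed(by: CGAffineTransform(scaleX: 12, y: 12))
}
```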