VisionOS 2 is bringing a range of new development features, but some of the most significant are restricted to enterprise applications.

VisionOS 2 will bring some of the most requested development features to the headset, but Apple says it's reserving some of them for enterprise applications only.

Developers who want to use these features will need ‘Enterprise’ status, which means having at least 100 employees and being accepted into the Apple Developer Enterprise Program ($299 per year).

Apple says the restriction on the new dev capabilities is to protect privacy and ensure a predictable experience for everyday users.

Here’s a breakdown of the enterprise-only development features coming to VisionOS 2, which Apple detailed in a WWDC session.

Vision Pro Camera Access

Up to this point, developers building apps for Vision Pro and VisionOS couldn’t actually ‘see’ the user’s environment through the headset’s cameras. That has limited developers’ ability to create Vision Pro apps that directly detect and interact with the world around the user.

With approval from Apple, developers building Vision Pro enterprise apps can now access the headset’s camera feed. This can be used to detect things in the scene, or to stream the view for use elsewhere. This is popular for ‘see what I see’ use-cases, where a remote person can see the video feed of someone at a work site in order to give them help or instruction.

Developers could also use the headset’s camera feed with a computer vision algorithm to detect things in view. This might be used to automatically identify a part, or verify that something was repaired correctly.

Even with Apple’s blessing to use the feature, enterprise apps will need to explicitly ask the user for camera access each time the feature is used.
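In code, this capability surfaces through ARKit for visionOS. The sketch below shows roughly what consuming the main camera feed looks like, based on the `CameraFrameProvider` API Apple has described; exact signatures may differ slightly, and the entitlement must be granted through the enterprise program before any of this works:

```swift
import ARKit  // ARKit for visionOS

// Requires the enterprise entitlement
// com.apple.developer.arkit.main-camera-access.allow
func streamMainCamera() async throws {
    let session = ARKitSession()
    let provider = CameraFrameProvider()

    // Pick a supported video format for the headset's main (left) camera.
    guard let format = CameraVideoFormat
        .supportedVideoFormats(for: .main, cameraPositions: [.left])
        .first else { return }

    try await session.run([provider])

    guard let updates = provider.cameraFrameUpdates(for: format) else { return }
    for await frame in updates {
        // Each frame carries per-camera samples; the pixel buffer can be fed
        // to a vision algorithm or streamed for 'see what I see' scenarios.
        if let sample = frame.sample(for: .left) {
            let pixelBuffer = sample.pixelBuffer
            _ = pixelBuffer  // process locally or transmit to a remote helper
        }
    }
}
```

Because the user is prompted on every use, apps should request the feed only when the relevant workflow actually starts, not at launch.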

Barcode and QR Code Detection

Image courtesy Apple

Being able to use the headset’s camera feed naturally opens the door for reading QR codes and barcodes, which allow structured data to be transmitted to the headset visually.

Apple is providing a readymade system for developers to detect, track, and read barcodes using Vision Pro.

The company says this could be useful for warehouse workers, who could confirm they’ve retrieved the right item simply by looking at the barcode on the box, or for scanning a barcode to quickly pull up assembly instructions.
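The ready-made system Apple describes is exposed as a barcode detection data provider in ARKit. A minimal sketch, assuming the API names Apple has shown (the gating entitlement and the exact anchor properties are from Apple's enterprise documentation and may vary):

```swift
import ARKit  // ARKit for visionOS

// Requires the enterprise entitlement
// com.apple.developer.arkit.barcode-detection.allow
func watchBarcodes() async throws {
    let session = ARKitSession()
    // Limit detection to the symbologies you expect to encounter.
    let provider = BarcodeDetectionProvider(symbologies: [.qr, .code128])

    try await session.run([provider])

    // Each anchor update reports a detected code, its decoded payload,
    // and its position in space relative to the headset's origin.
    for await update in provider.anchorUpdates {
        switch update.event {
        case .added, .updated:
            let payload = update.anchor.payloadString ?? "<binary payload>"
            print("Detected code:", payload)
        case .removed:
            break
        }
    }
}
```

Since the system handles detection and tracking, the app only needs to map decoded payloads (a part number, an instruction URL) to its own data.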

Neural Engine Access

Enterprise developers will have the option to tap into Vision Pro’s neural processor to accelerate machine learning tasks. Previously developers could only access the compute resources of the headset’s CPU and GPU.
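In practice this is opt-in through Core ML's compute-unit configuration; without the enterprise entitlement, visionOS keeps model execution on the CPU and GPU. A brief sketch, where `PartClassifier` is a hypothetical compiled model bundled with the app:

```swift
import CoreML

// With the enterprise entitlement, requesting .all lets Core ML schedule
// work on the Neural Engine in addition to the CPU and GPU.
let config = MLModelConfiguration()
config.computeUnits = .all

// "PartClassifier" is a hypothetical model name for illustration.
guard let modelURL = Bundle.main.url(forResource: "PartClassifier",
                                     withExtension: "mlmodelc") else {
    fatalError("Model not found in bundle")
}
let model = try MLModel(contentsOf: modelURL, configuration: config)
// model.prediction(from:) now runs with Neural Engine acceleration available.
```

This pairs naturally with the camera-access feature above: frames from the headset can be fed to an accelerated model for on-device detection.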

Object Tracking

Although the new Object Tracking feature is coming to VisionOS 2 more broadly, there are additional enhancements to this feature that will only be available to enterprise developers.

Object Tracking allows apps to include reference models of real-world objects (for instance, a model of a can of soda), which can be detected and tracked once they’re in view of the headset.


Enterprise developers will have greater control over this feature, including adjusting the maximum number of tracked objects, choosing whether to track static or dynamic objects, and changing the object detection rate.
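These knobs live on the tracking configuration passed to ARKit's object tracking provider. The sketch below reflects the configuration surface as presented at WWDC; the exact property names and value ranges are assumptions and should be checked against Apple's documentation:

```swift
import ARKit  // ARKit for visionOS

// Property names below are approximate; verify against Apple's docs.
func makeTrackingProvider(referenceObjects: [ReferenceObject]) -> ObjectTrackingProvider {
    var configuration = ObjectTrackingProvider.TrackingConfiguration()
    configuration.maximumTrackableInstances = 20      // raise the tracked-object cap
    configuration.detectionRate = 2.0                 // new-object detections per second
    configuration.stationaryObjectTrackingRate = 5.0  // update rate for static objects
    configuration.movingObjectTrackingRate = 10.0     // update rate for dynamic objects

    return ObjectTrackingProvider(
        referenceObjects: referenceObjects,           // pre-captured reference models
        trackingConfiguration: configuration
    )
}
```

Higher rates improve responsiveness at the cost of compute, which is why Apple reserves this tuning for enterprise use-cases rather than exposing it broadly.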

Greater Control Over Vision Pro Performance

Enterprise developers working with VisionOS 2 will have more control over the headset’s performance.

Apple explains that, out of the box, Vision Pro is designed to strike a balance between battery life, performance, and fan noise.

But some specific use-cases might need a different balance of those factors.

Enterprise developers will have the option to increase performance by sacrificing battery life and fan noise. Or perhaps stretch battery life by reducing performance, if that’s best for the given use-case.


There are more new developer features coming to Vision Pro in VisionOS 2, but the features above will be restricted to enterprise developers only.




Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."