
Image courtesy Apple

Some of Vision Pro’s Biggest New Development Features Are Restricted to Enterprise

VisionOS 2 is bringing a range of new development features, but some of the most significant are restricted to enterprise applications.

VisionOS 2 will bring some of the most-requested development features to the headset, but Apple says it’s reserving some of them for enterprise applications only.

Developers who want to use these features will need ‘Enterprise’ status, which means having at least 100 employees and being accepted into the Apple Developer Enterprise Program ($300 per year).

Apple says the restriction on the new dev capabilities is to protect privacy and ensure a predictable experience for everyday users.

Here’s a breakdown of the enterprise-only development features coming to VisionOS 2, which Apple detailed in a WWDC session.

Vision Pro Camera Access

Up to this point, developers building apps for Vision Pro and VisionOS couldn’t actually ‘see’ the user’s environment through the headset’s cameras. That has limited developers’ ability to create Vision Pro apps that directly detect and interact with the world around the user.

With approval from Apple, developers building Vision Pro enterprise apps can now access the headset’s camera feed. This can be used to detect things in the scene, or to stream the view for use elsewhere. This is popular for ‘see what I see’ use-cases, where a remote person can see the video feed of someone at a work site in order to give them help or instruction.

Developers could also use the headset’s camera feed with a computer vision algorithm to detect things in view. This might be used to automatically identify a part, or verify that something was repaired correctly.

Even with Apple’s blessing to use the feature, enterprise apps will still need to explicitly ask the user for camera access each time the camera is used.
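For a sense of what this looks like in practice, here’s a rough sketch based on the enterprise CameraFrameProvider API Apple showed at WWDC. The helper function and exact signatures are illustrative rather than copied from shipping code, and the feature requires the enterprise license file and camera-access entitlement in the app.

```swift
import ARKit
import CoreVideo

// Rough sketch: reading the main camera feed with the enterprise
// CameraFrameProvider (names follow Apple's WWDC sample code; treat as illustrative).
func streamMainCamera(session: ARKitSession) async throws {
    // Ask the user for camera access up front.
    _ = await session.requestAuthorization(for: [.cameraAccess])

    // Pick a supported video format for the left main camera.
    guard let format = CameraVideoFormat
        .supportedVideoFormats(for: .main, cameraPositions: [.left])
        .first else { return }

    let provider = CameraFrameProvider()
    try await session.run([provider])

    guard let frames = provider.cameraFrameUpdates(for: format) else { return }
    for await frame in frames {
        guard let sample = frame.sample(for: .left) else { continue }
        // sample.pixelBuffer is a CVPixelBuffer that can be fed into Vision,
        // Core ML, or an encoder for a 'see what I see' stream.
        handle(sample.pixelBuffer)
    }
}

// Placeholder for app-specific processing (illustrative only).
func handle(_ pixelBuffer: CVPixelBuffer) { /* ... */ }
```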

Barcode and QR Code Detection

Image courtesy Apple

Being able to use the headset’s camera feed naturally opens the door for reading QR codes and barcodes, which allow structured data to be transmitted to the headset visually.

Apple is providing a readymade system for developers to detect, track, and read barcodes using Vision Pro.

The company says this could be useful for warehouse workers, who could retrieve an item and immediately know they’ve found the right thing by looking at the barcode on the box. Workers could also scan a barcode to quickly pull up instructions for assembling something.
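As a rough idea of how this surfaces to developers, the sketch below uses the BarcodeDetectionProvider shown in Apple’s WWDC session; the symbology list and anchor properties are assumptions recalled from that session and may differ slightly in the shipping SDK.

```swift
import ARKit

// Rough sketch: spatial barcode/QR detection with the enterprise
// BarcodeDetectionProvider (requires the barcode-scanning enterprise entitlement).
func watchForBarcodes(session: ARKitSession) async throws {
    let barcodes = BarcodeDetectionProvider(symbologies: [.code39, .qr, .upce])
    try await session.run([barcodes])

    for await update in barcodes.anchorUpdates {
        switch update.event {
        case .added:
            // Each anchor carries the decoded payload plus a transform and
            // extent, so the app can highlight the physical label in space.
            print("Found barcode: \(update.anchor.payloadString ?? "binary payload")")
        case .updated, .removed:
            break
        }
    }
}
```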

Neural Engine Access

Enterprise developers will have the option to tap into Vision Pro’s Neural Engine to accelerate machine learning tasks. Previously, developers could only access the compute resources of the headset’s CPU and GPU.
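In practice this flows through Core ML’s standard compute-units setting. The sketch below is a minimal example; the model URL is a placeholder, and without the enterprise entitlement Core ML simply keeps the work on the CPU and GPU.

```swift
import CoreML

// Minimal sketch: opting into the Neural Engine through Core ML.
func loadModel(at url: URL) throws -> MLModel {
    let config = MLModelConfiguration()
    config.computeUnits = .all   // CPU + GPU + Neural Engine (when available)
    return try MLModel(contentsOf: url, configuration: config)
}
```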

Object Tracking

Although the new Object Tracking feature is coming to VisionOS 2 more broadly, there are additional enhancements to this feature that will only be available to enterprise developers.

Object Tracking allows apps to include reference models of real-world objects (for instance, a model of a can of soda), which can be detected and tracked once they’re in view of the headset.

Enterprise developers will have greater control over this feature, including the ability to raise the maximum number of tracked objects, choose whether to track static or dynamic objects, and change the object detection rate.
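Apple exposes these knobs through a tracking configuration passed to the object-tracking provider. The sketch below is a best guess at that shape; the property names are recalled from the WWDC session and should be treated as illustrative rather than verified against the SDK.

```swift
import ARKit

// Hedged sketch: tuning the enterprise object-tracking parameters.
func startTracking(_ referenceObjects: [ReferenceObject],
                   session: ARKitSession) async throws -> ObjectTrackingProvider {
    var config = ObjectTrackingProvider.TrackingConfiguration()
    config.maximumTrackableInstances = 20   // raise the cap on tracked objects
    config.detectionRate = 10               // how often new objects are detected (Hz)
    config.movingObjectTrackingRate = 30    // update rate for objects in motion (Hz)

    let provider = ObjectTrackingProvider(
        referenceObjects: referenceObjects,
        trackingConfiguration: config
    )
    try await session.run([provider])
    return provider
}
```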

Greater Control Over Vision Pro Performance

Enterprise developers working with VisionOS 2 will have more control over the headset’s performance.

Apple explains that, out of the box, Vision Pro is designed to strike a balance between battery life, performance, and fan noise.

But some specific use-cases might need a different balance of those factors.

Enterprise developers will have the option to increase performance at the cost of battery life and louder fan noise. Or they could stretch battery life by reducing performance, if that’s best for the given use-case.


There are more new developer features coming to Vision Pro in VisionOS 2, but the ones above will be restricted to enterprise developers only.
