VisionOS Update Gives Devs Improved Tools for VR Cloud Streaming

Apple Vision Pro just got a new update that brings Foveated Streaming to the device, essentially the same bandwidth-saving feature Valve is bringing to its upcoming Steam Frame headset.

The News

As noted by VR supply chain analyst Brad Lynch, foveated streaming has arrived on Vision Pro via the latest visionOS 26.4 beta update, which landed on February 16th.

Much like Valve’s foveated streaming solution for Steam Frame, Apple’s implementation uses Vision Pro’s eye-tracking to optimize the streamed image to serve up the highest quality at the very center of your view, according to recent Apple developer documentation.

If you have an existing virtual reality game, experience, or application built for desktop computers or a cloud server, you can stream it to Apple Vision Pro with the Foveated Streaming framework.

Foveated Streaming allows your endpoint to stream high quality content only where necessary based on information about the approximate region where the person is looking, ensuring performance.
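
To illustrate the idea only, the Swift sketch below shows how an encoder could map eye-tracking data to per-region stream quality; the type names, function, and thresholds are hypothetical and are not taken from Apple's Foveated Streaming framework.

    import simd

    // Illustrative quality tiers for the encoder; not part of any Apple framework.
    enum StreamQuality { case high, medium, low }

    // Pick a per-tile encode quality from the angular distance between a tile's center
    // and the reported gaze direction: full bitrate only near where the person is looking.
    func quality(forTileAt tileCenter: SIMD2<Float>, gaze: SIMD2<Float>) -> StreamQuality {
        let separation = simd_distance(tileCenter, gaze)  // both in degrees of visual angle
        switch separation {
        case ..<10:  return .high    // foveal region: highest quality
        case ..<25:  return .medium  // near periphery: reduced bitrate
        default:     return .low     // far periphery: heavily compressed
        }
    }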

Additionally, Apple notes that on Vision Pro, foveated streaming allows for a sort of hybrid approach to computing: you can display visionOS spatial content alongside streaming content, such as a flight simulator rendering a cockpit using RealityKit while processor-intensive landscapes are streamed from a remote computer to the device.
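
As a very rough sketch of what the local half of that hybrid setup could look like, the RealityView below places a stand-in for a locally rendered cockpit next to a plane that a remotely rendered landscape would be drawn onto; the streaming hookup itself is omitted and the scene contents are placeholders, not Apple's sample code.

    import SwiftUI
    import RealityKit

    struct HybridSimulatorView: View {
        var body: some View {
            RealityView { content in
                // Locally rendered content: a stand-in box where a cockpit model would go.
                let cockpit = ModelEntity(
                    mesh: .generateBox(size: 0.5),
                    materials: [SimpleMaterial(color: .gray, isMetallic: true)])
                content.add(cockpit)

                // Placeholder surface for the remotely streamed landscape; in a real app
                // its texture would be updated each frame from the incoming stream.
                let remoteCanvas = ModelEntity(
                    mesh: .generatePlane(width: 4, height: 2),
                    materials: [UnlitMaterial(color: .black)])
                remoteCanvas.position = [0, 0, -3]
                content.add(remoteCanvas)
            }
        }
    }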

The key difference is the focus and implementation. Valve seems to be applying Foveated Streaming globally, meaning all Steam apps will benefit out of the box. Valve’s focus is also on local PC streaming, which is done via a direct Wi-Fi 6E connection.

Vision Pro apps and games, by contrast, need to be specifically integrated with Apple’s version of the technology. Apple is additionally supporting NVIDIA’s CloudXR SDK, which allows developers of existing VR apps created for desktop computers or cloud servers to stream them to Vision Pro.

My Take

On the face of it, it looks like Apple is matching Valve punch-for-punch with foveated streaming, although I wouldn’t take this as Apple meaningfully looking to compete with the upcoming Steam Frame on the consumer end of things.

The $3,500 Vision Pro M5 refresh likely won’t come down in price anytime soon, and I don’t suspect Apple is trying to get a bunch of PC VR developers onboard to create consumer-facing versions of their apps that will need to be specifically integrated with Vision Pro foveated streaming.

If I were an enterprise user though, I might be pretty interested in the new update, as it opens up one of the key features Steam Frame is bringing to the table.

Being able to push more compute-intensive apps to a headset they likely already own could keep some companies from justifying a Steam Frame purchase, and Apple is all too happy to oblige, especially as the recent memory and storage crisis has sent component prices up so dramatically that Valve is reassessing the pricing and release date of Steam Frame.

  • Christian Schildwaechter

    TL;DR: While Valve aimed for low wireless latency on Frame for existing flat or VR games, Apple is probably more concerned with making (partially) remote rendered 3.5K VR feasible. Meta/Microsoft tried something similar a few years ago.

    The slightly different approaches by Valve and Apple make sense. Frame will be a niche device running mostly software not created specifically for it, either locally or streamed from a PC/Steam Machine. It is not realistic to expect lots of developers to alter their apps just for Frame, so applying foveated streaming after the image is already fully rendered allows improved latency in all apps.

    Of course, using only foveated streaming sort of wastes performance: the full image still has to be rendered, and then, depending on gaze, only parts of it are compressed and transferred at full quality. To also regain the wasted render performance, the app itself would have to implement foveated rendering, which combined with foveated streaming would then allow for higher quality at lower latency on the same hardware and network connection. Valve's OpenXR runtime on Frame supports both foveated rendering and streaming, and hopefully a lot of VR games will enable it. But the only part really under Valve's control is the streaming, so that is what they focused on. (A rough numerical sketch of the render-versus-streaming split follows at the end of this comment.)

    Foveated streaming on visionOS seems to be mostly about cloud streaming for increased render loads. Apple's current foveated streaming beta documentation mentions local streaming, but only explains connecting to Nvidia's CloudXR. The CloudXR runtime is basically OpenXR with additions for remote streaming, and supports both foveated rendering and streaming. The runtime is available for Windows and Linux, targeting either a native visionOS client or a generic WebXR client running in a browser on Quest, Pico, desktop etc.

    This is quite interesting, as visionOS itself doesn't support OpenXR, using Apple's proprietary RealityKit instead, or WebXR in a browser. So Apple's foveated streaming beta/CloudXR Framework basically provides a translation layer connecting Apple's API to a remote XR app using the standard OpenXR everybody else uses. If this works locally too, it could in theory allow visionOS to run existing OpenXR apps (not as compiled binaries, but by allowing them to be built for visionOS).

    While looking into this, I stumbled onto a 2021 white paper by Tobii titled "A Lightweight Foveation Codec for VR" (xr_io/static/docs/2021-12-01-Foveation_pdf, the link on Tobii's website seems to be dead). It describes eye-tracking-based compression (basically foveated streaming) plus warping as an attempt to bring down (perceived) latency over remote connections, getting down to 2 ms over a 250 Mbit/s line across 8 km and several hops with a 2K HMD. They ran a small study to check whether participants would notice artifacts (mostly not) and emphasize how much the reduced bandwidth costs in particular could help cloud XR providers.

    The AVP is currently an extreme niche device, and there is a chance that Frame will outsell it within a year. For the time being, Apple seems to focus more on the enterprise market, and the option to mix content rendered locally with RealityKit and remotely rendered content at very low latency using the same API would allow especially enterprise customers to design much more complex XR apps. The current M5 AVP is the most powerful standalone HMD by far, but it can in no way compete with a desktop GPU allowed to draw more than 20x the power of the whole HMD.

    A visionOS app using RealityKit could therefore run fully locally for less demanding content, and then transparently switch to partially cloud-rendered content for things too compute heavy, without requiring a second API. And compared to Frame, where Valve has to consider the (un-)willingness of devs and therefore relies on externally added foveated streaming, enterprise apps for AVP will be mostly custom developments, so integrating RealityKit with both foveated rendering and streaming won't be a hurdle.

    This is interestingly something that Meta tried before in partnership with Microsoft, with the Azure Cloud Rendering support for Quest 2 and Quest Pro that Microsoft announced in June 2023. The required JoinXR client was made available on App Lab, which meant it needed Meta's approval; Meta otherwise prohibited any XR cloud streaming clients on both the Horizon store and App Lab, forcing companies like PlutoSphere to rely on sideloading instead. I never heard of it again, and who knows what happened to Meta's own PCVR cloud streaming service Avalanche during the recent reduction of their VR activities. But foveated streaming might have been the missing piece for making cloud XR streaming with low enough latency to support weak mobile hardware a viable option.
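
    A rough back-of-the-envelope sketch of the render-versus-streaming split mentioned above, in Swift, with all numbers (resolution, foveal share, shading and bitrate factors) purely illustrative rather than anything published by Valve or Apple:

        // Assumed per-eye render target and an assumed share of the frame near the gaze point.
        let fullPixels = 3_840.0 * 3_840.0
        let fovealFraction = 0.15

        // Foveated streaming only: the GPU still shades every pixel, but the
        // periphery is sent at an assumed quarter of the foveal bitrate.
        let renderedStreamingOnly = fullPixels
        let relativeBandwidth = fovealFraction * 1.0 + (1 - fovealFraction) * 0.25

        // Foveated rendering added on top: the periphery is also shaded at an
        // assumed quarter rate, so the GPU saves work before the encoder runs.
        let renderedWithFoveatedRendering =
            fullPixels * (fovealFraction + (1 - fovealFraction) * 0.25)

        print("Render work saved by adding foveated rendering:",
              1 - renderedWithFoveatedRendering / renderedStreamingOnly)     // ~0.64
        print("Bandwidth relative to a uniform stream:", relativeBandwidth)  // ~0.36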

  • rabs

    There is a mistake: "Valve seems to be applying Foveated Rendering globally" should be "Valve seems to be applying Foveated Streaming globally".

    Otherwise, the first I saw doing that was Varjo a while ago.