Microsoft announced it has released a public preview of Azure Remote Rendering support for Meta Quest 2 and Quest Pro, which promises to let developers render complex 3D content in the cloud and stream it to those VR headsets in real time.

Azure Remote Rendering, which already supports desktop and the company’s HoloLens 2 AR headset, notably uses a hybrid rendering approach, combining content rendered remotely in the cloud with content rendered locally on the device.
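
In practice, this kind of hybrid rendering typically means the headset receives a color-and-depth frame from the cloud and merges it per pixel with locally rendered content (controllers, UI, nearby geometry), keeping whichever surface is closer to the viewer. The snippet below is a minimal sketch of that per-pixel depth composition in Python/NumPy; the function and buffer names are hypothetical illustrations, not part of the Azure Remote Rendering SDK.

```python
import numpy as np

def composite_hybrid_frame(remote_rgb, remote_depth, local_rgb, local_depth):
    """Merge a cloud-rendered frame with locally rendered content.

    Hypothetical illustration only -- not the Azure Remote Rendering API.
    Color buffers are H x W x 3, depth buffers are H x W (distance from
    the eye, smaller = closer).
    """
    # For each pixel, keep whichever layer is closer to the viewer.
    local_wins = local_depth < remote_depth
    out_rgb = np.where(local_wins[..., None], local_rgb, remote_rgb)
    out_depth = np.minimum(local_depth, remote_depth)
    return out_rgb, out_depth

# Toy usage: a distant cloud-rendered model and one nearby locally rendered pixel.
remote_rgb = np.full((2, 2, 3), 0.2)
remote_depth = np.full((2, 2), 5.0)
local_rgb = np.full((2, 2, 3), 0.9)
local_depth = np.array([[1.0, 9.0], [9.0, 9.0]])

rgb, depth = composite_hybrid_frame(remote_rgb, remote_depth, local_rgb, local_depth)
print(rgb[0, 0], depth[0, 0])  # local content wins where it is nearer
```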

With Quest 2 and Quest Pro now supported, developers can integrate Microsoft’s Azure cloud rendering capabilities to do things like view large and complex models on Quest.

Microsoft says in a developer blog post that one such developer, Fracture Reality, has already integrated Azure Remote Rendering into its JoinXR platform, enhancing its CAD review workflows for engineering clients.

Image courtesy Microsoft, Fracture Reality

The JoinXR model above reportedly took 3.5 minutes to upload and contains 12.6 million polygons and 8K images.

While streaming XR content from the cloud isn’t a new phenomenon—Nvidia initially released its own CloudXR integration for AWS, Microsoft Azure, and Google Cloud in 2021—Microsoft offering direct integration is a hopeful sign that the company hasn’t given up on VR, and is actively looking to bring enterprise deeper into the fold.

If you’re looking to integrate Azure’s cloud rendering tech into your project, check out Microsoft’s step-by-step guide here.

  • another juan

    if they can make this work, it would not only transform the vr market, but would make ar glasses feasible much sooner than expected. big “if”, though

    • Newlot

      absolutely. i hope it will be available at the Quest 3 launch. could be a big boost to graphics. I’m wondering how much it could augment processing capabilities, though of course that would be limited by one’s broadband speed. i guess we rely on ISPs to keep increasing broadband speeds until, possibly, future headsets no longer need to carry a strong processor like an M2 or an 8 Gen 2. that would massively reduce weight, increase comfort, and open the path to mainstream VR.

  • Christian Schildwaechter

    I am wondering how exactly this works with/around Meta’s EULA prohibiting publishing cloud rendering clients on the Quest store or even App Lab. In theory, services like PlutoSphere already allow delegating all of the rendering to a remote machine, so a user without a fast gaming PC could simply subscribe to such a service, much like Nvidia’s GeForce Now streaming service, and this way play (owned) SteamVR titles on the Quest.

    At least they could, if this weren’t something Meta has forbidden for the time being, probably at least until they can offer their own PC VR streaming service. The current workarounds are publishing the client on SideQuest or installing a Virtual Desktop server on a rented cloud machine like Shadow PC, but these are much less convenient than using an existing remote rendering service with a client from the official store, so Meta has effectively blocked remote VR streaming as a business model.

    Azure Remote Rendering is without doubt VR streaming too, though it doesn’t render the whole application remotely; instead it provides an API for developers to have parts of the rendering done on Microsoft servers and integrated with locally rendered parts on the Quest. The service contract would be between the app developers and Microsoft rather than between end users and Microsoft, which may be the required loophole.

    I’ve looked into Nvidia’s very similar CloudXR rendering service in the past, which also supports Quest, though I don’t remember how exactly these issues were solved there. With Microsoft having partnered with Meta and both having announced plans for remotely streamed Office and Xbox Game Pass to be made available on Quest, I doubt that Microsoft would try to circumvent Meta’s EULA and suggest side-loading streaming apps, so I can only assume that this particular use of cloud VR rendering as a service is officially sanctioned by Meta, and that we will probably see more of it.

    Maybe even the long-rumored Meta PC VR streaming service based on the Azure cloud. I would have expected Meta to try to implement such a service on their own server infrastructure, but tighter integration between Meta and Microsoft in XR may be beneficial for both when confronted with the (rather long-term) threat from Apple’s Vision Pro, which is positioned as a productivity device and relies heavily on Apple’s existing iOS app library.

    • As someone who worked with cloud rendering on Quest… the answer is sideloading