Meta has finally released the long-awaited Passthrough Camera API for Quest, which gives developers direct access to the headset’s passthrough RGB cameras for better scene understanding—kickstarting the next generation of more immersive mixed reality experiences on Quest.

Until now, Quest’s passthrough cameras were mostly locked down, limiting developers to Meta’s built-in capabilities. The company said back at Connect in September that it would eventually release a Passthrough Camera API for Quest, although it wasn’t certain when.

Now, in Meta XR Core SDK v74, the company has released the Passthrough Camera API as a Public Experimental API, giving developers access to the forward-facing RGB cameras on Quest 3 and Quest 3S.

Image courtesy Roberto Coviello

Passthrough camera access lets developers improve lighting and effects in their mixed reality apps, and also apply machine learning and computer vision to the camera feed for things like detailed object recognition—making mixed reality content less of a guessing game about what’s actually in a user’s environment.

When it was announced last year, former Meta VP of VR/AR Mark Rabkin said the release of Quest’s Passthrough API would enable “all kinds of cutting-edge MR experiences,” including things like tracked objects, AI applications, “fancy” overlays, and scene understanding.


This marks the first time the API has been publicly available, although Meta has previously released early builds with select partners, including Niantic Labs, Creature, and Resolution Games—which are presenting today at GDC 2025 in a Meta talk entitled ‘Merge Realities, Multiply Wonder: Expert Guidance on Mixed Reality Development’.

Granted, since it’s an experimental feature, developers can’t yet publish apps built with the Passthrough Camera API, though Meta is likely taking its usual iterative approach toward a full release.

The v74 release also includes Microgestures for intuitive thumb-based input (e.g., thumb taps and swipes), an Immersive Debugger that lets developers view and inspect the Scene Hierarchy directly within the headset, and new building blocks, such as friends matchmaking and local matchmaking.


This article may contain affiliate links. If you click an affiliate link and buy a product we may receive a small commission which helps support the publication. More information.

Well before the first modern XR products hit the market, Scott recognized the potential of the technology and set out to understand and document its growth. He has been professionally reporting on the space for nearly a decade as Editor at Road to VR, authoring more than 4,000 articles on the topic. Scott brings that seasoned insight to his reporting from major industry events across the globe.