Long-time Valve VR programmer Joe Ludwig, working in his own capacity, is building an open-source platform to bring AR-like utilities into virtual reality. The project, called Aardvark, is something of an evolution of the VR dashboards we know today, moving their functionality into interactive and spatially-aware ‘gadgets’ that can run inside any VR application.

Joe Ludwig is a Valve programmer who has been closely involved with the company’s VR efforts since the early days. Lately he’s been working on his own open-source project called Aardvark, which essentially aims to bring augmented reality into VR—a ‘layer’ for lightweight spatial apps to run inside of virtual reality spaces.

Like other VR environments, SteamVR already has a dashboard which the user can call up while inside any application to access useful information, like browsing their game library or changing settings.

While VR dashboards provide useful core functionality, they are essentially big floating screens that appear on top of your current virtual reality app. Aardvark, on the other hand, aims to allow small AR-like utilities called ‘gadgets’ to run inside existing VR applications to provide additional functionality.


For instance, you might want to build a screenshot tool that takes the form of a virtual camera which the player uses to take photos of the virtual world. Rather than building such functionality into a single game, that kind of tool could potentially be built as an Aardvark gadget which could operate inside of any VR application. Similarly, simple utilities like timers, web video players, Twitch chat boxes, drawing boards, friends lists, etc, could be built as Aardvark gadgets which the player can make use of no matter what game they’re inside.

Aardvark is still in quite early development, with mostly basic example gadgets so far, but Ludwig gives a breakdown of what kinds of things they could do and what they look like running inside of a VR environment:

After launching Aardvark in early access on Steam in December, Ludwig spoke with our friend Kent Bye in the latest episode of the Voices of VR podcast where he explained the approach to the platform’s design.

Interestingly, Aardvark gadgets are, in a sense, written as ‘web apps’: a gadget’s functionality is defined similarly to a webpage, and Aardvark is the ‘browser’ that renders it into the virtual space. But it’s not like WebXR, which renders its own full scene directly. Ludwig says this approach is primarily for performance and scalability.

[…] Aardvark in some ways is my whitepaper […] what I think is the right approach is that JavaScript works very well in a declarative environment already. When you open a web page, what you’re looking at is some HTML and some CSS and some images that were generated by JavaScript. And that JavaScript runs—not every time you need to generate a pixel because your monitor’s refresh rate is 60Hz—what the JavaScript does is it either declares in the first place or manipulates the declared HTML elements and then those run through a layout engine that’s written in C++ that chews on them, does it very quickly—figures out how big all the boxes are, figures out how big the fonts are—renders all that stuff […] that all feeds into that rectangle that’s on your monitor, and the JavaScript only runs when you click a thing or when you drag a thing or when you mouse over a thing.

So the JavaScript runs at the events that happen at a human time scale—or an interaction timescale—where they’re a few times a second instead of 90 times a second or 144 times a second [the rendering rate of VR headsets]. And the native code—the C++ code—does the smooth animation of the video or the smooth animation of the controls sliding in over the course of several frames when you mouse over a thing—that’s all in C++. You express your intent through these declarative approaches of HTML and CSS, and then the native code—the system of the web browser—actually does the work to render that to the user.

So Aardvark does a similar thing. In Aardvark, at no point do you take that WebXR approach—which is to ask the system ‘where the hand is, load a model, draw the model where the hand is’. You don’t do that [in Aardvark]. What you do is you say ‘draw this model of the hand’ and you hand that down to Aardvark, and Aardvark says ‘oh, I’m drawing this model relative to the hand’ […] but the statement you’re making is ‘draw it on the hand’. What that means is that 11ms later, when your hand moves a few millimeters to the left, Aardvark knows it’s on the hand and it draws it at the new hand position.

So Aardvark needs to run at framerate, but none of the gadgets need to run at framerate. And if you have a gadget that’s slow, it doesn’t matter, because it doesn’t have to run at framerate. So between the performance implications of doing things in a declarative way, and the visual fidelity implications of using a scene-graph to composite instead of using these depth-buffers and pixel maps to composite, I think that Aardvark is taking an approach that’s more scalable in a lot of ways and will end up with higher quality and higher fidelity results in a lot of ways. But part of the reason that I’m building and working on it is to kind of prove out that thesis. I don’t think it’s settled yet. […] eventually we’ll find out what the answer is. 
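The split Ludwig describes can be sketched in plain TypeScript. This is a hypothetical illustration of the declarative idea, not Aardvark’s actual API—the type names, the `"hand"` anchor, and the compositor function are all invented for the example. The gadget code runs once to declare intent; a compositor loop re-resolves the declared anchor against fresh tracking data every frame, with no gadget code involved.

```typescript
// Hypothetical sketch of a declarative scene-graph node (not Aardvark's API).
// The gadget declares *where* something belongs; the compositor decides
// *what pixels* that means, every frame.

type Vec3 = { x: number; y: number; z: number };

interface SceneNode {
  model: string;             // which model to draw
  parent: "hand" | "world";  // declared anchor, not a per-frame position
  offset: Vec3;              // fixed offset from that anchor
}

// Gadget code runs once, at "interaction timescale":
// "draw this camera model on the hand."
const gadgetNodes: SceneNode[] = [
  { model: "camera.glb", parent: "hand", offset: { x: 0, y: 0.05, z: 0 } },
];

// The compositor (Aardvark's role) runs at headset framerate. It resolves
// each declared anchor against the latest tracked hand pose; gadget
// JavaScript never has to run here.
function resolveFrame(handPose: Vec3): Vec3[] {
  return gadgetNodes.map((n) =>
    n.parent === "hand"
      ? {
          x: handPose.x + n.offset.x,
          y: handPose.y + n.offset.y,
          z: handPose.z + n.offset.z,
        }
      : n.offset
  );
}

// One frame later the hand has drifted 3mm; the same declaration
// produces the new position without the gadget being consulted.
const frame1 = resolveFrame({ x: 0, y: 1.2, z: -0.3 });
const frame2 = resolveFrame({ x: 0.003, y: 1.2, z: -0.3 });
```

In the imperative WebXR style, by contrast, the gadget’s own JavaScript would have to query the hand pose and redraw the model every 11ms; here a slow gadget only delays its next declaration, not the rendering of its last one.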

The ‘browser’ approach also brings other benefits. For one, gadgets can be built with the sort of functions you’d expect from any website—the ability to render text, load images, and pull information from other parts of the web. Being based on the web also means distribution and maintenance of gadgets is easy, says Ludwig, because gadgets are basically web pages that anyone can access via a URL. As long as you know how to write the gadget, distributing it is as easy as hosting a website and sending people the URL.

In Ludwig’s discussion on Voices of VR, he notes that development of the platform is still ongoing and much of the functionality is minimally defined—that way Aardvark can evolve naturally to fit the use-cases that gadget builders envision.

Right now, Ludwig says, the project is mainly looking for contributors to experiment with building their own gadgets. If you’re interested in building gadgets or contributing to the underlying platform, check out the Aardvark GitHub page.




Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."
  • Aardvark has been around for a while now! Nice to finally see it on a major VR publication!


    You should have taken some screenshots of your own, maybe a couple over passthrough too.

    Just like how WebXR can go way beyond what a normal website does, Aardvark can be taken in a lot of directions that haven’t been discussed yet. If you wanted, you could make games in it, add serious features like occlusion zones or simulated control sticks that feed keyboard data, a scooter in VR that generates roomscale data, a million different things. They’re all addable features; it would just take dev time to build them.