ARCore, Google’s developer platform for building augmented reality experiences on mobile devices, just got an update that brings the company’s previously announced Depth API to Android and Unity developers. The Depth API not only lets mobile devices create depth maps using a single RGB camera, but also aims to make AR experiences feel more natural by placing virtual imagery more realistically in the world.

Update (June 25th, 2020): Google today announced it’s making its Depth API for ARCore available to developers. A few studios have already integrated the Depth API into their apps to create more convincing occlusion, such as Illumix’s Five Nights at Freddy’s AR: Special Delivery game, which lets enemies hide behind real-world objects in your space for more startling jump scares.

ARCore 1.18 for Android and Unity, including AR Foundation, is rolling out to what Google calls “hundreds of millions of compatible Android devices,” although there’s no clear list of which devices are supported just yet.

Original Article (December 9th, 2019): Shahram Izadi, Director of Research and Engineering at Google, says in a blog post that the new Depth API enables occlusion for mobile AR applications, as well as the ability to create more realistic physics and surface interactions.

To demonstrate, Google created a number of demos to show off the full set of capabilities the new Depth API brings to ARCore. Keep an eye on how the virtual objects are accurately occluded by physical barriers.

“The ARCore Depth API allows developers to use our depth-from-motion algorithms to create a depth map using a single RGB camera,” Izadi says. “The depth map is created by taking multiple images from different angles and comparing them as you move your phone to estimate the distance to every pixel.”
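
For a sense of what this looks like in practice, the Kotlin sketch below enables depth on an ARCore session and samples the resulting DEPTH16 depth image. Session, Config, Frame, and acquireDepthImage() come from the public ARCore Android SDK; the two helper functions and the trimmed error handling are illustrative, not Google’s sample code.

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.exceptions.NotYetAvailableException
import java.nio.ByteOrder

// Turn on depth-from-motion if this device supports it.
fun enableDepth(session: Session) {
    val config = session.config
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.depthMode = Config.DepthMode.AUTOMATIC
        session.configure(config)
    }
}

// Sample the depth map at pixel (x, y), returning distance in millimeters,
// or null while the depth map is still being estimated.
fun readDepthMillimeters(frame: Frame, x: Int, y: Int): Int? = try {
    frame.acquireDepthImage().use { depthImage ->  // android.media.Image, DEPTH16
        val plane = depthImage.planes[0]
        val index = y * plane.rowStride + x * plane.pixelStride
        val raw = plane.buffer.order(ByteOrder.LITTLE_ENDIAN).getShort(index)
        raw.toInt() and 0x1FFF  // low 13 bits of DEPTH16 carry range in mm
    }
} catch (e: NotYetAvailableException) {
    null
}
```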

Full-fledged AR headsets typically use multiple depth sensors to create depth maps like this, which Google says it created on-device with a single sensor. In Google’s visualization, red indicates areas that are closer, while blue marks areas that are farther away.

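To give a rough idea of how a visualization like that can be produced, here’s a hypothetical helper (not part of the ARCore API) that maps each millimeter depth value onto a red-to-blue ramp; the 8 m ceiling is an arbitrary choice for illustration.

```kotlin
// Hypothetical visualization helper, not an ARCore API: maps a depth value
// in millimeters to a packed ARGB color, red for near and blue for far.
fun depthToColor(depthMm: Int, maxDepthMm: Int = 8000): Int {
    val t = depthMm.coerceIn(0, maxDepthMm).toFloat() / maxDepthMm  // 0 = near, 1 = far
    val red = ((1f - t) * 255).toInt()  // near pixels skew red
    val blue = (t * 255).toInt()        // far pixels skew blue
    return (0xFF shl 24) or (red shl 16) or blue  // alpha | red | blue
}
```
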
“One important application for depth is occlusion: the ability for digital objects to accurately appear in front of or behind real world objects,” Izadi explains. “Occlusion helps digital objects feel as if they are actually in your space by blending them with the scene. We will begin making occlusion available in Scene Viewer, the developer tool that powers AR in Search, to an initial set of over 200 million ARCore-enabled Android devices today.”
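
The test behind occlusion is simple to state: for every pixel, compare the depth of the virtual fragment against the real-world depth map, and hide the virtual pixel wherever the real surface is closer. In practice this comparison runs in a fragment shader; the CPU-side Kotlin sketch below, with illustrative names, just captures the logic.

```kotlin
// Simplified CPU-side sketch of the per-pixel occlusion test; production
// renderers do this comparison in a fragment shader instead.
fun occlusionMask(realDepthMm: IntArray, virtualDepthMm: IntArray): BooleanArray =
    BooleanArray(realDepthMm.size) { i ->
        // Occlude when real geometry is valid (non-zero) and closer
        // to the camera than the virtual fragment at the same pixel.
        realDepthMm[i] in 1 until virtualDepthMm[i]
    }
```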

Additionally, Izadi says the Depth API doesn’t require specialized cameras or sensors, and that with the addition of time-of-flight (ToF) sensors to future mobile devices, ARCore’s depth mapping capabilities could eventually allow virtual objects to be occluded by moving physical objects.

The new Depth API follows Google’s release of its ‘Environmental HDR’ tool at Google I/O back in May, which aims to enhance immersion by bringing more realistic lighting, reflections, and shadows to AR objects and scenes.

Update (12:10): A previous version of this article stated that Google was releasing the Depth API today; in fact, the company is only putting out a form for developers interested in using the tool. You can sign up here.

  • Xron

    Seems awesome, now we just need better depth sensors on more phones.

  • Adil H

    I hope this is a first step toward the possibility of saving virtual objects in the real world.

  • Oh wow, amazing!

  • Jack H

    I haven’t come across such a paper yet, but I think using the phase-based focus on some phones should help get the real-world scale and distance to objects, i.e. in metres instead of arbitrary units approximating metres.

  • duck

    witty

  • GigaSora

    I wonder why the Pokemon Go team is developing their AR solution in house instead of using this. It’s made with Unity, right?

  • Rupert Jung

    Most AR apps on my Samsung S9 aren’t even able to detect the floor. Most of the time I give up after about 30 seconds.

  • Would love to see a simple mesh capture tool using this.