
News Bits: New Project Tango Video Demonstrates Incredible Realtime Environment Capture

Project Tango is a project from Google's labs that represents the cutting edge in realtime environment capture and modelling. The system, currently running only on dedicated hardware prototypes (codenamed ‘Peanut’), uses advanced depth sensors plus high resolution optical cameras to grab spatial and visual data from your environment. This data is fused with orientation and positional information to build a spatially accurate representation of the captured environment.
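To give a feel for what that fusion step involves, here is a minimal sketch (not Tango's actual pipeline) of back-projecting a depth image into world space using the device's pose estimate. The intrinsics and the camera-to-world pose matrix are placeholder assumptions for illustration only.

import numpy as np

# Hypothetical depth-camera intrinsics (placeholder values, not real Tango parameters).
FX, FY = 525.0, 525.0   # focal lengths in pixels
CX, CY = 319.5, 239.5   # principal point in pixels

def depth_to_world(depth, pose):
    """Back-project a depth image (metres) into world-space points.

    `pose` is assumed to be a 4x4 camera-to-world transform derived from the
    device's orientation and position tracking.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    pts_cam = np.stack([x, y, z, np.ones_like(z)], axis=-1).reshape(-1, 4)
    pts_world = pts_cam @ pose.T          # apply the fused pose estimate
    valid = pts_cam[:, 2] > 0             # drop pixels with no depth reading
    return pts_world[valid, :3]

Each captured frame contributes a cloud of world-space points; accumulating these over time is what yields the spatially consistent model described above.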

A new video demonstrates just how cool this technology is in action. The user wanders the target environment, aiming the phone at areas to capture whilst monitoring the data collated in realtime. ‘Meshing’ can be paused at any point, with the captured data available for inspection and review at any time. Once the first pass is done, the user walks the environment again (the view adjusted using positional and orientation information) to fill in any gaps in the mesh, as the sketch below illustrates.
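As a rough illustration of that multi-pass workflow (again an assumption-laden toy, not the device's actual meshing code), one simple way to track coverage is to quantise accumulated points into a coarse voxel grid and count how many new cells each walkthrough fills.

def voxel_key(point, size=0.05):
    """Quantise a 3-D point into a coarse voxel cell (5 cm grid assumed)."""
    return tuple(int(c // size) for c in point[:3])

class MeshAccumulator:
    """Toy accumulator: merge points from successive walkthroughs and
    report how many previously empty voxels each pass covers."""
    def __init__(self):
        self.occupied = set()

    def add_pass(self, points):
        before = len(self.occupied)
        for p in points:
            self.occupied.add(voxel_key(p))
        return len(self.occupied) - before   # cells newly covered this pass

A second pass that returns a large count of newly covered cells indicates the first walkthrough left significant gaps, which is exactly what the reviewer is checking for on screen.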

The video comes from Ivan Dryanovski, a Research Assistant at the CCNY Robotics Lab (New York) who is lucky enough to be working with a prototype Project Tango device. The Ph.D. student has published interesting work on 3D mapping techniques using micro-UAVs, which points to an intriguing potential use for this technology: mapping remote environments with robotics.

Should you not have seen it, the Project Tango launch video below provides a good introduction to the technology.
