Matterport, a company specializing in the capture and digital reconstruction of real-world environments, has announced a $30 million Series C investment round led by Qualcomm Ventures.
The company says the new investment brings its total funding to $56 million. With the new funds, the company intends to:
Accelerate the development of mobile capture for VR and AR content (Matterport has been an early developer on Google’s Project Tango)
Expand its developer platform for web and mobile apps, including the Matterport Developer Program, also launching today
Continue to scale its operations to support strong demand for its growing professional business
Matterport’s current business uses a proprietary $4,500 camera to quickly scan a real-world space, building out a 3D model and layering detailed imagery on top of it. By mapping multiple points in the same area, the company’s software can construct an accurate 3D model that users can move through and look around in all directions. It’s a bit like Google Maps Street View on steroids for interior spaces.
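To give a rough sense of the multi-point idea (this is a minimal, hypothetical sketch, not Matterport’s actual pipeline), once the relative poses of each capture position are known, the per-position depth data can be brought into one shared coordinate frame and merged into a single model:

```python
import numpy as np

def transform_points(points, rotation, translation):
    """Apply a rigid transform (rotation + translation) to an N x 3 point cloud."""
    return points @ rotation.T + translation

def merge_scans(scans):
    """Merge per-position point clouds into one model in world coordinates.

    `scans` is a list of (points, rotation, translation) tuples, where each
    pose maps that capture position's local frame into the shared world frame.
    In practice the poses would come from a registration/alignment step.
    """
    return np.vstack([transform_points(p, R, t) for p, R, t in scans])

# Two hypothetical capture positions, the second one metre along the x-axis.
scan_a = (np.random.rand(1000, 3), np.eye(3), np.zeros(3))
scan_b = (np.random.rand(1000, 3), np.eye(3), np.array([1.0, 0.0, 0.0]))
model = merge_scans([scan_a, scan_b])
print(model.shape)  # (2000, 3)
```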
Given that the result is a 3D model with photorealistic visuals, the space can be easily imported into a VR environment to immerse users in the space and allow them to explore almost as if they were there. And the company has done just that: a Matterport demo app on Gear VR offers a selection of scenes captured with the company’s tech. At its best, it looks a lot like the high-quality panoramic photos you’ll find in Oculus 360 Photos, but with the added benefits of stereoscopy and the ability to navigate through the scene from one capture node to the next.
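As a loose illustration of that node-to-node navigation (again, a hypothetical sketch rather than Matterport’s implementation), a viewer might pick the next capture node by choosing the one best aligned with the user’s gaze:

```python
import numpy as np

def next_node(current, gaze_direction, nodes, max_angle_deg=30.0):
    """Pick the capture node most aligned with the user's gaze.

    `nodes` is an M x 3 array of node positions; returns the index of the
    best candidate, or None if nothing lies within the angular threshold.
    """
    offsets = nodes - current
    distances = np.linalg.norm(offsets, axis=1)
    valid = distances > 1e-6                 # ignore the node we're standing on
    if not np.any(valid):
        return None
    directions = offsets[valid] / distances[valid, None]
    gaze = gaze_direction / np.linalg.norm(gaze_direction)
    cosines = directions @ gaze              # alignment of each node with the gaze
    best = np.argmax(cosines)
    if cosines[best] < np.cos(np.radians(max_angle_deg)):
        return None
    return np.flatnonzero(valid)[best]
```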
Give it a try and it won’t take you more than a minute to realize its potential for real estate, hotels, museums, and much more (if someone doesn’t use this tech to make a real-world point-and-click adventure game, I’ll be quite upset).
With the new funding, Matterport hopes to move away from its proprietary camera and eventually deploy consumer-facing mobile capture options, though it isn’t clear whether modern smartphones will be up to the task or whether we’ll need to wait for phones with next-generation depth sensors.