Showcased at Apple’s Worldwide Developer Conference this week, ARKit is a new core technology in iOS 11, due to launch this fall, which will bring augmented reality features to hundreds of millions of iPhones and iPads. With the iOS 11 developer beta already available, we’re starting to see some interesting real-world tests of ARKit, showing off the tracking that’s achievable with nothing more than a single camera and the device’s motion sensors.
Developer Cody Brown hacked together a quick demo using Overwatch assets as a ‘hello world’ test of ARKit running on an iPhone 6S:
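For those curious what an ARKit ‘hello world’ of this sort involves, here is a minimal sketch in Swift and SceneKit: an ARSCNView running a world-tracking session, with a single model placed in front of the camera. The class and asset names (HelloARViewController, hero.scn) are placeholders for illustration, not anything from Brown’s project.

```swift
import UIKit
import SceneKit
import ARKit

// A minimal ARKit "hello world": start world tracking and place a 3D model
// roughly half a metre in front of where the session began. "hero.scn" is a
// placeholder asset name; Brown's demo used (non-public) Overwatch assets.
class HelloARViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(sceneView)

        // Build a scene containing the model, positioned 0.5 m in front of the
        // world origin (wherever the device was when tracking started).
        let scene = SCNScene()
        if let modelScene = SCNScene(named: "hero.scn"),
            let modelNode = modelScene.rootNode.childNodes.first {
            modelNode.position = SCNVector3(x: 0, y: 0, z: -0.5)
            scene.rootNode.addChildNode(modelNode)
        }
        sceneView.scene = scene
        sceneView.autoenablesDefaultLighting = true
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking needs only the rear camera and the device's motion sensors.
        sceneView.session.run(ARWorldTrackingConfiguration())
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```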
Apple’s keynote included a couple of impressive live on-stage demonstrations of screen-based AR, including a sneak peek at an Unreal Engine-powered experience from developer Wingnut AR:
Perhaps the most impressive aspect of that demo was that it ran on a standard iPad, using only the single rear camera (along with the device’s motion sensors) for tracking. The result appeared fairly stable, without the need for the dedicated hardware of Google Tango, which relies on a suite of extra cameras and sensors.
Now that developers have their hands on ARKit, the early real-world tests are very promising, such as this clip of the Unity sample demo showing tracking points and plane estimation:
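For context, the plane estimation seen in that Unity clip surfaces to native developers as ARPlaneAnchor objects. The sketch below, a rough illustration rather than Unity’s actual sample code, enables horizontal plane detection and draws each detected plane as a translucent quad via the ARSCNView delegate; the PlaneVisualizer class name is hypothetical.

```swift
import UIKit
import SceneKit
import ARKit

// Visualize ARKit's plane estimation: enable horizontal plane detection, then
// render each detected plane as a translucent quad that tracks its refinements.
class PlaneVisualizer: NSObject, ARSCNViewDelegate {

    func runSession(on sceneView: ARSCNView) {
        sceneView.delegate = self
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal   // ARKit 1.0 detects horizontal planes
        sceneView.session.run(configuration)
    }

    // Called when ARKit adds an anchor; plane anchors carry an estimated center and extent.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

        let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x),
                             height: CGFloat(planeAnchor.extent.z))
        plane.firstMaterial?.diffuse.contents = UIColor.cyan.withAlphaComponent(0.3)

        let planeNode = SCNNode(geometry: plane)
        planeNode.position = SCNVector3(planeAnchor.center.x, 0, planeAnchor.center.z)
        planeNode.eulerAngles.x = -.pi / 2   // SCNPlane is vertical by default; lay it flat
        node.addChildNode(planeNode)
    }

    // ARKit refines its plane estimates over time; resize the quad to match.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor,
              let planeNode = node.childNodes.first,
              let plane = planeNode.geometry as? SCNPlane else { return }
        plane.width = CGFloat(planeAnchor.extent.x)
        plane.height = CGFloat(planeAnchor.extent.z)
        planeNode.position = SCNVector3(planeAnchor.center.x, 0, planeAnchor.center.z)
    }
}
```

Note that ARSCNView holds its delegate weakly, so an object like this would need to be retained by the owning view controller.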
This video from Austrian augmented reality company ViewAR puts the technology through a demanding tracking test: covering the camera, moving quickly away from the virtual object, and walking through multiple rooms with different lighting conditions to check for drift. The result is remarkable considering the limitations of tracking with a single camera:
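Stress conditions like these are something apps can watch for directly: ARKit reports its tracking quality through the session delegate, dropping to a ‘limited’ state when, for instance, the camera is covered or the scene lacks trackable features. A brief sketch follows; TrackingMonitor is an illustrative name, not anything from ViewAR’s code.

```swift
import ARKit

// Observe ARKit's tracking quality. Assign an instance to an ARSession's
// delegate property to receive these callbacks.
class TrackingMonitor: NSObject, ARSessionDelegate {

    func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) {
        switch camera.trackingState {
        case .normal:
            print("Tracking normal")
        case .notAvailable:
            print("Tracking unavailable")
        case .limited(let reason):
            // Reasons include .insufficientFeatures and .excessiveMotion.
            print("Tracking limited: \(reason)")
        }
    }
}
```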
Apple is believed to be hard at work on AR technologies, and is likely to make screen-based AR a key selling point of the next iPhone. That handset is anticipated to have a near bezel-free design, which would certainly enhance the appearance of AR features.