
Image courtesy Epic Games

Epic’s HoloLens 2 Demo ‘Apollo 11: Mission AR’ Showcases Impressive PC-quality Graphics

Epic Games today released a new video featuring a demo for HoloLens 2 that aims to show off just what sort of graphics can be achieved on Microsoft’s latest standalone AR headset. Called Apollo 11: Mission AR, the interactive demo is streamed wirelessly in real time from networked PCs running the company’s game engine, Unreal Engine.

Unveiled earlier this summer at Microsoft Build 2019, Apollo 11: Mission AR is a recreation of the historic 1969 Apollo 11 mission, showing off the Saturn V’s launch, a reenactment of the lunar landing, and Neil Armstrong’s first steps on the Moon, all of which Epic says were reconstructed based on data and footage from the actual mission.

Epic says the demo features 7 million polygons in a physically-based rendering environment, and includes fully dynamic lighting and shadows, multi-layered materials, and volumetric effects.

Image courtesy Epic Games

That rendering isn’t done on-device, though. To achieve this level of detail, Epic says the experience’s “holographic elements” are actually streamed wirelessly in real time from networked PCs running UE 4.23, the current version of Unreal Engine.

According to Epic’s HoloLens 2 streaming guide, the headset sends eye tracking, gesture, voice, current device pose, and spatial mapping input to the PC, which then streams rendered frames back to HoloLens 2. This, the company says, is designed to boost app performance and make development easier, since devs won’t need to package and deploy the app on-device before running it. It’s clear, however, that the approach also lets HoloLens 2 play host to more graphically demanding experiences than the standalone device’s on-board processors were originally intended to handle.
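For the curious, here’s a minimal conceptual sketch in C++ of the PC-side loop that division of labor implies. The types and function names (HeadsetInput, RenderAndEncode, and so on) are entirely hypothetical stand-ins, not Unreal’s or Microsoft’s actual remoting API; the sketch only illustrates the flow described above, with input coming up from the headset and rendered frames streamed back down.

```cpp
// Conceptual sketch of the remoting loop described above.
// Hypothetical types and functions -- not the real Unreal or Holographic Remoting API.
#include <chrono>
#include <cstdio>
#include <thread>

// Input state the headset sends upstream each frame (pose, gaze, gestures, etc.).
struct HeadsetInput {
    float headPose[7];   // position (x, y, z) + orientation quaternion
    float gazeDir[3];    // eye-tracking ray
    bool  airTap;        // simplified gesture flag
};

// A rendered, encoded frame the PC streams back downstream.
struct EncodedFrame {
    int sizeBytes;
};

// Stubbed-out transport and rendering steps, standing in for the network
// and engine work the real remoting runtime performs.
HeadsetInput ReceiveInputFromHeadset() { return HeadsetInput{}; }
EncodedFrame RenderAndEncode(const HeadsetInput& input) { return EncodedFrame{1280 * 720 * 4}; }
void SendFrameToHeadset(const EncodedFrame& frame) { std::printf("sent %d bytes\n", frame.sizeBytes); }

int main() {
    // PC-side loop: consume the headset's input, render the scene at the
    // reported pose, and stream the encoded result back to the device.
    for (int frame = 0; frame < 3; ++frame) {
        HeadsetInput input = ReceiveInputFromHeadset();
        EncodedFrame out   = RenderAndEncode(input);
        SendFrameToHeadset(out);
        std::this_thread::sleep_for(std::chrono::milliseconds(16)); // ~60 Hz frame budget
    }
    return 0;
}
```

In practice the per-frame latency of that round trip is what makes local networking so important, which is the point Epic returns to in its statement below.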

Image courtesy Epic Games

We reached out to Epic to see whether this could also be achieved via cloud streaming, or if it’s a local machine-only implementation. We’ll update this article as soon as we hear back (see update below).

Released in early September, Unreal Engine 4.23 is the first iteration of the company’s game engine to feature production-ready support for HoloLens 2, which includes tools such as streaming and native deployment, emulator support, finger tracking, gesture recognition, meshing, voice input, and spatial anchor pinning.

Beyond the demo’s visual polish, Epic says Apollo 11: Mission AR also demonstrates support for UE4 Composure, color temperature, and post-processing, plus OCIO LUTs, I/O for AJA video systems, and additional features that streamline mixed reality media production.

Update (2:00 PM ET): An Epic Games spokesperson provided the following statement regarding cloud rendering for remote PC-to-HoloLens connections:

“While it is technically possible to use the HoloLens 2 Remoting over the Internet, we would strongly recommend against it due to significant latency and uncontrollable network conditions. When using HoloLens 2 Remoting, you should always aim to use a local network to minimize the latency and ensure there are minimal other devices connected to it to maximize the bandwidth available for the HoloLens 2.”
