Sony’s Magic Lab is set to demonstrate a new procedural animation system which it says allows virtual characters to “imbue a sense of shared space with the player” through realistic eye, head and body movements.
GDC 2017 is underway, and as ever the event is brimming with intriguing talks on VR and subjects tangential to it. Magic Lab, a division within Sony dedicated to researching future advancements in entertainment, has understandably been somewhat preoccupied with the conundrums surrounding virtual reality.
At this year’s Game Developers Conference, it has picked one of the more difficult topics facing developers of immersive entertainment (and in truth gaming at large): how do you make your in-game NPCs react believably? Whilst this problem is not limited to VR gaming, the medium’s unique ability to conjure presence in players means that, when you share space with a virtual character, those NPCs are far more susceptible to appearing fake or uninteresting.
Magic Lab is due to demonstrate a new set of procedural systems which it says allow virtual characters to react not only to stimuli from the virtual environment, but more importantly to cues from players. The demo will feature NPCs which can “react to sound/motion of the virtual environment as well as player sound/motion” and “interpret player head/hand pointing directions.” The NPCs then demonstrate virtual attentiveness by adjusting their “eyes, head, and body in a coordinated manner” whilst limited to their own specific field of view. The player will be able to toggle various aspects of the characters’ behaviour on and off to gauge the effect.
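Sony hasn’t published implementation details, but attentiveness systems of this kind typically boil down to two steps: pick the most salient stimulus inside the character’s own field of view, then rotate the eyes, head and body toward it at staggered speeds so the gaze leads and the body follows. The Python sketch below is purely illustrative of that general idea, assuming a simple salience score per stimulus and fixed turn rates for each layer; none of the names or numbers come from Magic Lab.

import math
from dataclasses import dataclass

@dataclass
class Stimulus:
    position: tuple   # world-space (x, y, z); could be the player's head or hand
    salience: float   # assumed score, e.g. loudness of a sound or magnitude of motion

def approach(current, goal, speed, dt):
    """Move an angle toward a goal at a fixed rate (degrees per second)."""
    delta = (goal - current + 180) % 360 - 180
    step = max(-speed * dt, min(speed * dt, delta))
    return current + step

@dataclass
class AttentiveNPC:
    position: tuple = (0.0, 0.0, 0.0)
    fov_degrees: float = 120.0   # the NPC's own field of view
    eye_yaw: float = 0.0         # current yaw of each "layer", in degrees
    head_yaw: float = 0.0
    body_yaw: float = 0.0

    def yaw_to(self, point):
        dx, _, dz = (p - q for p, q in zip(point, self.position))
        return math.degrees(math.atan2(dx, dz))

    def in_fov(self, stimulus):
        delta = (self.yaw_to(stimulus.position) - self.head_yaw + 180) % 360 - 180
        return abs(delta) <= self.fov_degrees / 2

    def pick_target(self, stimuli):
        """Attend to the most salient stimulus the NPC can actually perceive."""
        visible = [s for s in stimuli if self.in_fov(s)]
        return max(visible, key=lambda s: s.salience, default=None)

    def update(self, stimuli, dt):
        """Eyes lead, head follows, body turns last -- all toward the same target."""
        target = self.pick_target(stimuli)
        if target is None:
            return
        goal = self.yaw_to(target.position)
        self.eye_yaw = approach(self.eye_yaw, goal, speed=180.0, dt=dt)
        self.head_yaw = approach(self.head_yaw, goal, speed=90.0, dt=dt)
        self.body_yaw = approach(self.body_yaw, goal, speed=30.0, dt=dt)

# Hypothetical usage: the player's voice in front of the NPC outweighs a quieter
# noise behind it (outside the field of view), so the NPC turns toward the player.
npc = AttentiveNPC()
player_voice = Stimulus(position=(1.0, 0.0, 2.0), salience=0.9)
noise_behind = Stimulus(position=(0.0, 0.0, -3.0), salience=0.5)
for _ in range(60):
    npc.update([player_voice, noise_behind], dt=1 / 60)
print(round(npc.eye_yaw), round(npc.head_yaw), round(npc.body_yaw))

The staggered turn rates are what sell the effect: real people glance with their eyes first, then turn the head, and only commit the body if the stimulus holds their attention, which is presumably part of what the toggleable demo is meant to illustrate.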
With social VR such a focus at the moment, and one of the world’s largest social companies firmly fixed on VR as the social platform of the future, finding ways to satisfy our acute human social senses is going to become extremely important. We’re at GDC all this week and hope to see the demo in action, so we’ll let you know how successful PlayStation’s Magic Lab has been.