With the reveal of Sony’s 2015 Morpheus prototype, the company surprised the VR space with the announcement that the headset would support a 120Hz refresh rate, significantly higher than the 90Hz of the competition. We went hands-on with the company’s first native 120Hz demo, Magic Controller, along with Bedroom Robots from the same studio.

The announcement that Morpheus would run at 120Hz (meaning 120 frames shown to the user per second) was a surprise: on the one hand it promises extremely smooth imagery and low latency, but on the other, rendering high quality visuals at 120 FPS to match the screen’s 120Hz refresh rate is a daunting challenge.


Not every game will run natively at 120Hz. Developers who want to push graphical fidelity over blistering performance will have the option to render at 60 FPS and use the ‘Asynchronous Reprojection’ technique to output at 120 FPS. Asynchronous Reprojection is similar to Oculus’ Timewarp: after a frame has been rendered, the latest sensor data is read and a slight adjustment is made to the viewpoint just before ‘scan out’, when the display’s pixels light up in accordance with the rendered frame.
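
To make the idea concrete, here is a minimal sketch of the core of such a reprojection step, written in C++ with the GLM math library. It is not Sony’s or Oculus’ actual implementation; the Pose type and ReprojectionMatrix function are illustrative names. A frame rendered with one head orientation is corrected by the rotation delta to the most recently sampled orientation just before scan out.

```cpp
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

// Head orientation as reported by the headset's tracker (illustrative type).
struct Pose {
    glm::quat orientation;
};

// Rotation-only correction: the frame was rendered with 'renderPose', but the
// head has since moved to 'latestPose'. The returned matrix can be applied to
// the already-rendered image (e.g. as a texture lookup transform) just before
// scan out, so the displayed view reflects the newest sensor data.
glm::mat4 ReprojectionMatrix(const Pose& renderPose, const Pose& latestPose) {
    glm::quat delta = latestPose.orientation * glm::inverse(renderPose.orientation);
    return glm::mat4_cast(delta);
}
```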

The ‘Asynchronous’ part refers to running this technique on a separate thread, which generates new frames from the most recently completed frame of the rendering thread using fresh sensor data. Oculus has been using Asynchronous Timewarp on Gear VR, but hasn’t yet implemented it on PC.
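
A hedged sketch of how that separation might look: a compositor thread, decoupled from the game’s render thread, wakes on every 120Hz vsync, takes the most recently completed frame (possibly the same frame twice in a row when the game renders at 60 FPS), samples the newest head pose, and reprojects it before scan out. WaitForVsync, SampleHeadPose, and PresentReprojected are hypothetical placeholders rather than a real SDK API.

```cpp
#include <atomic>
#include <mutex>

struct Pose { /* orientation, as in the earlier sketch */ };
struct Frame { Pose renderPose; /* plus the rendered color buffer */ };

// Hypothetical platform hooks (not a real SDK API):
void WaitForVsync();                                  // blocks until the next 120Hz vsync
Pose SampleHeadPose();                                // reads the newest tracker sample
void PresentReprojected(const Frame&, const Pose&);   // warps the frame and scans it out

std::mutex gFrameMutex;
Frame gLatestCompletedFrame;       // written by the game's render thread at ~60 FPS
std::atomic<bool> gRunning{true};

// Runs independently of the game's render thread, producing 120 displayed
// frames per second even when only 60 are rendered.
void CompositorThread() {
    while (gRunning) {
        WaitForVsync();
        Frame frame;
        {
            std::lock_guard<std::mutex> lock(gFrameMutex);
            frame = gLatestCompletedFrame;   // may repeat the previous frame
        }
        PresentReprojected(frame, SampleHeadPose());
    }
}
```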

Oculus’ Michael Antonov has a technical analysis of Asynchronous Timewarp which will familiarize you with the technique if you’d like to know more.

Reprojection isn’t a magic bullet though. It works well for distant objects, but because the warp corrects for head rotation rather than the parallax of nearby geometry, near-field imagery can reveal rendering errors if the player moves their head quickly. Sony’s Chris Norden said at GDC 2015 that “if you have a gun waving around in your face you’re going to notice some artifacts.” Oculus’ Antonov agrees, saying that these techniques have “limitations that developers should be aware of.”


Norden also noted that implementing Asynchronous Reprojection may be a challenge for developers. “If your engine isn’t super flexible, and multi-threaded, you might have some trouble with this.”

Magic Controller

One new demo, Magic Controller, shown at the 2015 Morpheus prototype reveal, was indeed running natively at 120Hz and was responsible for one of the clearest and most sustained moments of Presence that I’ve felt yet. The demo was created by Sony Worldwide Studios, Japan.

See Also: HTC Vive and SteamVR Hands-on – A Stage of Constant Presence

Magic Controller put me into the ‘Morpheus Lab’, a circular chamber where I was standing in front of an empty table. Looking down at the DualShock 4 controller in my hands revealed an identical virtual version that was tracked in space just like the PlayStation Move controllers. As I pressed buttons and tilted the sticks on the real controller, the input was perfectly mirrored, though I would soon find out that the virtual controller could do much more than the real one.


I could hear clambering sounds and feel vibrations coming from the controller; where the touchpad would normally be was a pair of blue robot eyes. I clicked the touchpad and the eyes winced as I heard some giggly robotic chatter. Tapping the triangle button caused the touchpad to open like a hatch, and out popped the Bots onto the table. Anyone who has tried Sony’s AR showcase The Playroom (2013) on PS4 will be familiar with these joyful critters.

The Bots are energetic and noisy, jumping and jostling for attention and reactively moving their heads to follow me as I leaned about in my chair.



At that point I saw some instructions pop up around the buttons on the controller. As I pressed the circle button, I saw a MiniDisc tray open from the top of the controller and then slide back inside. With some vibrations, the bottoms of the controller’s handles transformed into speakers and began bumping beats like a boombox, to which the Bots happily danced. The music coming out of the controller was positionally placed; as I moved it around my head I could hear the music as though it were coming from the virtual controller itself.

Another button on the controller shut off the lights in the chamber and turned the controller into a flashlight. This is where a significant moment of Presence struck me. As I shined the light across the Bots and around the room, the scene felt incredibly real. There was something about the simple but seemingly accurate reflections on the shiny little Bots that brought me into that space and made me not want to leave.

My guess is that three major factors helped bring me into this virtual moment. First would be the 120Hz refresh rate, which may further convince my brain that the world around me is real thanks to increased fluidity and reduced latency. Then there’s the true black of the new OLED display, which helped sell my brain on the idea that I was in a dark space, and finally, the ability to control the light and see it reflect on surfaces as I would expect.

This moment further underscores my long-held belief that accurate lighting models are much more important to apparent graphical realism than ultra high-resolution textures. One needs only to see how well a screenshot from 2007’s Halo 3 has stood the test of time to appreciate this. Developers looking to hit 120Hz for Morpheus will be smart to experiment with art direction that makes effective use of lighting instead of relying so much on textures and polygons.


Continue Reading on Page 2, Bedroom Robots




  • crim3

    No black smearing on Sony’s OLED display?
    About lighting: one day in Assetto Corsa I stared at a shiny dashboard, watching the highlight move according to my head movements, and for a moment it felt like I was looking at something real. Correct lighting really tricks the mind into believing what it sees.

    • Ben Lang

      Ah yes, I forgot to make specific mention of this. Especially given the flashlight scene, I didn’t notice any black smear.

  • It must be super nice to have something you can physically hold be represented perfectly in VR :) That’s what it seems like for people who have tried the Vive at least; picking up the controllers from the helper is something many comment on. I guess this demo might feel similar to that. I look forward to a similar feel with the STEM… sometime… this year, I hope, haha.

    As for the flashlight, one of the things that completely blew me away back when using the DK1 and Razer Hydra with the Half-Life VR mod… was just that, to use the flashlight. Sure, the shadows were not always correctly projected, but most of the time the illusion worked, and it made the world instantly feel several times more real.

    Realistic light play is certainly one of the most powerful aspects to make a space feel physical. Both shadows and material shading. A low poly scene can still feel fantastically real with some fancy lighting effects. Mhmm…

    • Curtrock

      @Andreas: wish I could share your enthusiasm about the STEM. I am a backer, yet I’m beginning to not care anymore. With the announcement of SteamVR & the Lighthouse tracking system and the rapid evolution of various VR hand trackers, it feels like the STEM might already be relegated to the “niche” category. Still a great controller, but not entirely relevant anymore. When I backed their Kickstarter, they were the 1st & only input solution….a few short years later & ……However, if they can resurrect the HUGE potential that their “MakeVR” tech was pointing to, I might change my mind.

      • Heh, I’m also a backer :P For the five tracker kit, no less. SteamVR/Valve/HTC have shown their system with great potential, for sure, I’m super excited by that. If they also release some sort of module/pack to put on other things to track them, like a Haptech gun… exciting times for sure :P

        That said, I have experienced tracked hands with the Hydras, but being tethered by short cables was annoying. I hope the STEM will arrive early enough to give me at least six months of having a wireless Hydra set with increased range and little distortion before the Vive system releases :) And who knows, with ValveTime™ the release could be pushed back :x

        And indeed, software for VR will be the next frontier when hardware becomes commonplace.

    • Nick vB

      I still remember the light sabre from HydraCoverShooter, having the actual Hydra model in game was a great idea. I spent most of the time just drawing glowing patterns on the walls with it though! lol