Valve today introduced Motion Smoothing to SteamVR in beta. Similar to Oculus’ Asynchronous Spacewarp (ASW), the feature uses previous frames to synthesize new frames on the fly, allowing VR applications to continue to run smoothly and comfortably even when dropping frames.
PC VR headsets demand relatively powerful gaming hardware because games need to be rendered at very high resolution, in 3D, and at high framerates. Doing all of that with low latency and high consistency is a challenge even for powerful gaming computers, which may occasionally fail to render the next frame by the time the headset needs it, resulting in a dropped frame. Without any intervention, a dropped frame causes the headset to re-display the previous frame, which makes it look as if the world has momentarily attached itself to the user's head. This can be very uncomfortable in VR, especially if several frames are dropped in a row.
One way of dealing with this issue is reprojection (called 'timewarp' by Oculus), which, in the case of a dropped frame, shows the previous frame but reorients it based on the user's latest head rotation. That means that while anything moving in the game world will appear frozen for that frame, the world at least still responds to the player's head movements, which helps avoid discomfort.
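Roughly speaking, rotation-only reprojection builds a corrective rotation from the head pose the stale frame was rendered with to the latest measured head pose, and resamples the old image along the corrected view directions. Here's a minimal sketch of that idea in Python (function names are illustrative; this is not SteamVR's or Oculus' actual code):

```python
# Minimal sketch of rotation-only reprojection (illustrative, not SteamVR's code).
# The stale frame was rendered with head orientation R_rendered; the head is now
# at R_latest. Instead of re-rendering, the compositor rotates the old image by
# the small corrective rotation between the two poses.
import numpy as np

def correction_rotation(R_rendered: np.ndarray, R_latest: np.ndarray) -> np.ndarray:
    """3x3 rotation that maps the old view orientation onto the latest one."""
    return R_latest @ R_rendered.T          # delta = latest * rendered^-1

def reproject_direction(view_dir: np.ndarray, correction: np.ndarray) -> np.ndarray:
    """Apply the correction to a per-pixel view direction when resampling the old frame."""
    return correction @ view_dir
```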
But rotation-only reprojection doesn't account for positional head movement, and the game itself is still frozen during dropped frames, so objects in motion in the game world will stutter or freeze.
Like Oculus with ASW, Valve is implementing an additional layer of performance protection for applications prone to dropping frames, one which can also account for moving objects and characters within the application.
Introduced today in beta by Valve's Alex Vlachos, Motion Smoothing in SteamVR synthesizes entirely new frames to use in place of dropped frames. It does so by looking at the last two frames delivered by the application, estimating what the next frame should look like, and then sending the synthesized frame to the display in place of the frame the application couldn't deliver in time. Presently Motion Smoothing is only available on NVIDIA GPUs and systems running Windows 10, though Valve says that AMD support is in the works. Motion Smoothing also only works with the Vive, Vive Pro, and other native OpenVR headsets, as other headsets (like the Rift and Windows VR headsets) have their own approaches to dealing with dropped frames.
Motion Smoothing kicks in automatically when SteamVR detects that an application is dropping frames. Like ASW, it cuts the application's usual 90 FPS framerate down to 45 FPS and generates a synthetic frame for every real frame. That means the user still sees smooth 90 FPS motion in the headset, while the application has twice as much time to deliver each new frame. Valve says Motion Smoothing can be applied even more aggressively if needed, synthesizing two or even three frames for every real frame delivered by the application. Users will also be able to disable the feature if they don't want to use it.
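As a rough sketch of the pacing logic described above (the thresholds, names, and structure here are assumptions for illustration, not Valve's implementation), the idea is to throttle the application to an integer fraction of the display rate and fill the gaps with synthesized frames:

```python
# Rough sketch of the pacing decision described above (assumed logic and names,
# not Valve's code): when the app can't hold 90 FPS, throttle it to an integer
# fraction of the display rate and fill the gaps with synthesized frames.
DISPLAY_HZ = 90.0

def choose_pacing(app_frame_time_ms: float) -> tuple[float, int]:
    """Return (app_fps, synthetic_frames_per_real_frame) for a measured frame time."""
    budget_ms = 1000.0 / DISPLAY_HZ                  # ~11.1 ms per displayed frame
    for divisor in (1, 2, 3, 4):                     # 90, 45, 30, 22.5 FPS app rates
        if app_frame_time_ms <= budget_ms * divisor:
            return DISPLAY_HZ / divisor, divisor - 1
    return DISPLAY_HZ / 4, 3                         # cap at 1 real + 3 synthetic

# A 15 ms frame misses the 11.1 ms budget but fits 22.2 ms, so the app runs at
# 45 FPS and one frame is synthesized for each real frame it delivers.
print(choose_pacing(15.0))                           # (45.0, 1)
```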
Reducing or removing the requirement that VR applications deliver a consistent 90 FPS makes it viable for lower-end hardware to run VR applications without major performance issues, and for higher-end hardware to run at higher resolutions or with richer graphical effects. Combined with SteamVR's Auto Resolution feature (which automatically scales an application's render resolution to match the system's GPU performance), Motion Smoothing could expand the range of hardware that can acceptably run VR experiences.
Valve’s Alex Vlachos tells Road to VR that Motion Smoothing is similar to Oculus’ ASW, but not identical.
“We feed the last two frames from the application to the GPU’s video encode chip to generate motion vectors (which are very rough), then [Valve and Oculus each have their own] methods for filtering those vector fields and applying them to the most recent application frame to generate a new frame,” he said. “So ASW, SteamVR Motion Smoothing, and WMR Motion Reprojection are just different implementations of the same high-level tech.”
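To make that pipeline a bit more concrete, here's a very rough, simplified sketch of the idea in the quote (assumed logic only; real implementations work per eye on GPU-generated block motion vectors, and the filtering is far more sophisticated than this):

```python
# Very rough sketch of the pipeline described above (assumed, simplified):
# estimate motion between the last two frames, smooth the vector field, then
# push the newest frame forward along those vectors to synthesize the next one.
import numpy as np
from scipy.ndimage import map_coordinates, uniform_filter

def synthesize_next_frame(curr: np.ndarray, motion: np.ndarray) -> np.ndarray:
    """curr: latest frame (grayscale, H x W). motion[y, x] = (dy, dx) estimated
    between the previous frame and curr (e.g. from the GPU's video-encode block)."""
    smoothed = uniform_filter(motion, size=(9, 9, 1))   # crude vector-field filtering
    h, w = curr.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    # Assume objects keep moving: each output pixel samples curr at the spot it
    # is predicted to have moved from over one more frame interval.
    return map_coordinates(curr, [ys - smoothed[..., 0], xs - smoothed[..., 1]],
                           order=1, mode='nearest')
```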
Oculus recently introduced ASW 2.0, which aims to use depth information to improve the accuracy of synthesized frames. For Valve's part, the company may make use of depth information at some point, but it is mainly seeking generalized solutions.
“We have different approaches to reduce repeating pattern artifacts, and we have a few other methods internally that we may ship soon,” Vlachos said. “We are currently focusing our efforts on solutions [to synthesized frame artifacts] that apply to all applications, because as higher resolution headsets hit the market, our goal is to ensure customers can get as close to native resolution as possible with a wide range of GPUs.”
With Motion Smoothing just launched in beta, Vlachos notes that there will be plenty of tuning based on user feedback, especially over the next few days, though he expects the feature to remain in beta for several weeks.