
Image courtesy VRChat

One of VR’s Biggest Social Hangouts is Getting Upgraded Avatars with More Convincing Motion Tracking

Among social VR platforms, VRChat is known for being remarkably flexible, largely because it lets users import custom avatars built with standard game development tools. Now the company is releasing an update that makes those avatars move even more realistically.

VRChat announced it’s bringing a “completely revamped” inverse kinematics (IK) system to the platform. IK is used to estimate an avatar’s in-game body pose from the user’s physical movement; if you’ve ever seen an avatar’s elbow bend weirdly, or its legs drag along the floor in a strange shuffle, that’s an IK system trying to compensate for a lack of sufficient tracking data.
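For a sense of what an IK system actually computes, here’s a minimal sketch of the classic two-bone solve (think shoulder, elbow, wrist) using the law of cosines. This is a generic textbook approach written in Python purely for illustration, not VRChat’s implementation; the function name and parameters are hypothetical.

```python
import math

def two_bone_ik(upper_len: float, lower_len: float, target_dist: float) -> tuple[float, float]:
    """Planar two-bone IK (e.g. shoulder-elbow-wrist), illustrative only.

    Given the two bone lengths and the distance from the root joint to
    the tracked target, return (root_angle, mid_angle) in radians.
    """
    # Clamp the target to the reachable range so the solve never fails:
    # no closer than the bones can fold, no farther than full extension.
    d = max(abs(upper_len - lower_len), min(upper_len + lower_len, target_dist))

    # Law of cosines: interior angle at the mid joint (the elbow).
    cos_mid = (upper_len**2 + lower_len**2 - d**2) / (2 * upper_len * lower_len)
    mid_angle = math.acos(max(-1.0, min(1.0, cos_mid)))

    # Interior angle at the root joint (the shoulder), measured against
    # the line from root to target.
    cos_root = (upper_len**2 + d**2 - lower_len**2) / (2 * upper_len * d)
    root_angle = math.acos(max(-1.0, min(1.0, cos_root)))

    return root_angle, mid_angle

# Example: a 30 cm upper arm and 25 cm forearm reaching a point 40 cm away.
shoulder, elbow = two_bone_ik(0.30, 0.25, 0.40)
print(f"shoulder: {math.degrees(shoulder):.1f}°, elbow: {math.degrees(elbow):.1f}°")
```

Roughly speaking, a full-body solver chains many solves like this and has to guess at the joints no tracker covers, which is exactly where artifacts like oddly bent elbows creep in.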

The update has been in beta for a while now, so you may have already seen other users sporting some form of the refreshed IK system; this release, however, makes it broadly available to the entire userbase.

The studio says its VRChat IK 2.0 system now includes improved elbow positioning, better motion and pose handling, support for up to eleven-point tracking, and calibration saving, which simplifies setup for users who run multiple Vive Trackers for more detailed full-body tracking.

New options in the update also include the ability to measure avatars by height, lock in a viewpoint for better body sizing, knee tracking, chest tracking, and both elbow and shoulder tracking for users with a standard two-controller kit rather than an array of Vive Tracker pucks. You can check out the entire changelog here for more info on exactly what’s in the new update.

Released on Steam Early Access in 2017, the free social VR app is still going strong, with an estimated 60,000 – 80,000 players regularly connecting per day, according to an unofficial VRChat metric that accounts for both Steam and non-Steam users.

Update (12:30 PM ET): We’ve replaced the SteamCharts data with a more accurate unofficial source, which we’ve linked in the body of the article. Thanks go to PeterJCasaey for pointing us to the source!
