With no way of generating resistive force feedback with today’s VR motion controllers, how can we make users feel and behave as though virtual objects have varying weights? Creative agency B-Reel explored several approaches and open-sourced their experiments for others to learn from.


Guest Article by B-Reel

B-Reel is a creative agency. Founded in Stockholm in 1999, we are now an extended family of over 170 designers, developers, writers, strategists, and producers across 6 offices worldwide. Love of craft and technical innovation fuels our search for the unexpected. Our process is flat, fast, and fun—maintaining integrity from thinking to making. B-Reel strategizes, concepts, designs, codes, fikar (Swedish for taking a coffee break), animates, directs and solves problems.


Throughout the past year, we’ve loved working in virtual reality — from in-house projects to our work with Google Daydream for I/O 2016. We’re firm believers that the future of VR is bright and ripe with opportunities for exploration.


In our past VR endeavors, we created fully contained experiences to help us better understand the workflow and production requirements of the medium. But this time around, we wanted to dive deeper into some of the nitty-gritty interaction principles that drive experiences in VR. We were particularly inspired by the Daydream team’s approach toward a more utility-driven VR platform, and the interactions that are going to be required to make that possible. So we decided to start small by isolating a specific interaction to explore.

Choosing a Focus: Visual vs. Physical Interaction Models

Right now, most utility VR interfaces are anchored by simple 2D planes floating in space. But on the gaming side of things, there’s been a clear shift towards less visual, more physical interactions. See Cosmic Trip, which brilliantly uses physical buttons to navigate its item menu; or Job Simulator, which almost entirely eliminates the old point-and-click interaction paradigm.


[gfycat data_id="CriminalImpartialGoldenmantledgroundsquirrel" data_autoplay=true data_controls=false]

Cosmic Trip (left), Job Simulator (right)

We brainstormed around this premise and found ourselves drawn to the idea of artificially simulating weight when interacting with objects in VR. Specifically, could we throttle the speed of a user’s actions when picking up objects to make things feel heavier or lighter? Heavy objects would need to be picked up slowly, while light objects could be picked up normally. Not a new idea, and perhaps not directly applicable to the context of utility, but one we felt would be fun to explore.

We hoped to come out of this exercise with findings that could apply to all platforms with motion controls: from Daydream’s Wiimote-style controller, to the advanced room-tracking controllers of the Vive and Rift. With this in mind, we decided to design for the most advanced platform (until Oculus Touch launches, that’s the Vive), and later explore ways to simplify for Daydream, and eventually even gaze-based controls on the Gear VR and Cardboard.

VR motion controllers

As for production software, we were torn between our familiarity with Unity and the rendering potential of Unreal. We stuck with Unity for now, and hope to explore Unreal in future experiments.

Defining Goals

Our previous VR projects were rather off-the-cuff and left us little to reuse going forward. So we entered this exploration with some high-level goals. If nothing else, we’d have some measure of success if we at least addressed the following:

  • To develop a collaborative workflow within Unity and get more of the team comfortable with it.
  • To build a basic “boilerplate” environment for us to use internally, so future VR experiments could get up and running quickly.

The Process

With our direction and goals decided, we assembled our team: three 3D/motion artists, two designers, and a creative technologist. We used Git for collaborating and sharing assets across machines. Scenes could be “checked out” and edited by one person at a time, while others could work on prefabs that would be merged into the master scene. At such a small scale, this approach worked for us, but we’re actively exploring ways it could scale to larger teams and projects without getting messy.

Step 1: Becoming Pick Up Artists

You can’t explore weight without first nailing down the basic physics of how objects can be picked up and moved around, and we quickly found that these concepts were one and the same. Like most things in VR, there isn’t yet a consensus in the industry on the ‘correct’ way to handle this behavior. So we explored a bunch of options, which naturally grouped themselves into two categories: direct links (simple parenting, or creating a fixed joint) and loose links (adjusting velocity or using forces to attract objects towards the controller). The type of link defined the methods we employed to simulate weight.
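To make that split concrete, here is a rough Unity C# sketch of the fixed-joint flavor of direct link. (This is an illustrative simplification, not the exact code from our experiments, and the breakForce value is an arbitrary placeholder.) A fixed joint conveniently gives a mass-dependent drop for free: the force needed to accelerate an object scales with its mass, so a heavier object reaches a fixed break threshold at a lower acceleration.

```csharp
using UnityEngine;

// Illustrative sketch of a "direct link" built from Unity's FixedJoint.
// Because F = m * a, a constant breakForce means a heavier object snaps
// the joint at a lower acceleration, so it has to be lifted more slowly.
public class FixedJointGrab : MonoBehaviour
{
    public Rigidbody controllerBody; // kinematic rigidbody following the tracked controller

    public void Grab(Rigidbody obj)
    {
        FixedJoint joint = obj.gameObject.AddComponent<FixedJoint>();
        joint.connectedBody = controllerBody;
        joint.breakForce = 400f; // placeholder tuning; past this force Unity
                                 // destroys the joint and the object drops
    }
}
```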

[gfycat data_id="VapidShockingIcefish" data_autoplay=true data_controls=false]

Direct Link

In a direct link, objects match the controller’s motion one-to-one. If the controller moves too quickly for the object’s mass, the link is broken and the object falls to the ground.
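In simplified Unity terms (again a sketch rather than our production code, with an arbitrary speed-per-mass curve), the direct link behaves roughly like this:

```csharp
using UnityEngine;

// Sketch of the "direct link" rule: the held object follows the controller
// one-to-one, and the link breaks when the controller outruns the speed cap
// implied by the object's mass.
public class DirectLinkHold : MonoBehaviour
{
    public Transform controller;   // tracked controller transform
    Rigidbody held;                // currently held object; null when empty-handed
    Vector3 lastControllerPos;

    public void Grab(Rigidbody obj)
    {
        held = obj;
        held.isKinematic = true;   // transform-driven while held
        lastControllerPos = controller.position;
    }

    void FixedUpdate()
    {
        if (held == null) return;

        // Estimate controller speed from its positional delta this physics step.
        float speed = (controller.position - lastControllerPos).magnitude / Time.fixedDeltaTime;
        lastControllerPos = controller.position;

        float maxSpeed = 4f / held.mass; // placeholder curve: heavier means a lower cap

        if (speed > maxSpeed)
        {
            held.isKinematic = false;    // link broken: the object falls
            held = null;
        }
        else
        {
            held.MovePosition(controller.position); // 1:1 follow
            held.MoveRotation(controller.rotation);
        }
    }
}
```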

[gfycat data_id="SentimentalEvilKrill" data_autoplay=true data_controls=false]

Loose Link

For loose links, objects have different strengths of attraction towards the controller depending on their weight. Light objects react quickly, with the feeling of a direct link. Heavier objects will lag behind the controller and require more ‘effort’ to lift. We didn’t expect this to work well — the object breaks 1:1 tracking, which is a pretty big no-no in VR — but it surprisingly felt very promising. We chalk this up to two things:

  1. We still show the controller (which maintains 1:1 tracking) while lifting, avoiding the feeling that the user is not directly controlling the environment.
  2. Once the object reaches the controller, we snap it and form a direct link (sketched below). We added this mechanic after finding that the feeling of weight was most effective during the “picking up” action, and afterwards only served as a distraction to the user.
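Put together, a simplified version of the attract-then-snap behavior looks like this (illustrative code with placeholder tuning values, not the exact numbers from our project):

```csharp
using UnityEngine;

// Sketch of the "loose link": the object's velocity is steered toward the
// controller with a strength that falls off with mass, then it snaps into a
// rigid 1:1 link once it arrives.
public class LooseLinkHold : MonoBehaviour
{
    public Transform controller;
    public float pullStrength = 30f;   // placeholder base attraction
    public float snapDistance = 0.05f; // meters; close enough to snap

    Rigidbody held;
    bool snapped;

    public void Grab(Rigidbody obj)
    {
        held = obj;
        snapped = false;
    }

    void FixedUpdate()
    {
        if (held == null) return;

        Vector3 toController = controller.position - held.position;

        if (!snapped && toController.magnitude < snapDistance)
        {
            snapped = true;          // the object has "arrived", so switch to a
            held.isKinematic = true; // direct link and stop simulating weight
        }

        if (snapped)
        {
            held.MovePosition(controller.position);
            held.MoveRotation(controller.rotation);
        }
        else
        {
            // Heavier objects accelerate less toward the hand, so they lag
            // behind the controller and take more "effort" to lift.
            held.velocity = toController * (pullStrength / held.mass);
        }
    }
}
```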

Continued on Page 2 >>


