NASA Looks to PlayStation VR to Solve Key Challenge of Space Robot Operation


While NASA has a long history of sending probes and rovers into space, advancements in robotics have made the deployment of human-like robots an increasingly attractive prospect. But it turns out that controlling such humanoid robots remotely is challenging. NASA and Sony have been collaborating to explore how VR might be used to train operators to control robots in space.

While probes and drones are great at what they do, they are very specialized. The appeal of humanoid robots—those that mimic human form and dexterity—is their flexibility. Humans are amazing generalists, using our brains to achieve things that our bodies were never made for (like space travel). Part of what makes us so adaptable to diverse situations is our bipedal stance, which frees up our arms, hands, and fingers for tasks (instead of locomotion) and for the use of tools. Our hands can grip and manipulate a breadth of forms unmatched by machines—but robots are quickly catching up.

NASA’s Robonaut 2 is a humanoid robot designed with arms, hands, and fingers that move just like ours. But designing dexterous robots for space is only half the challenge of actually making them useful.

While NASA is highly experienced in controlling probes and rovers with carefully planned, math-based maneuvers, human control is quick and intuitive; Robonaut 2’s human dexterity is wasted if commands can’t be executed with human fluidity and improvisation.

Sony’s Richard Marks demonstrates simulated control of a robot in space.

So NASA is exploring how to control humanoid robots with human input. Modern virtual reality, as it turns out, may provide the best way to do just that—by making the robot mimic the input of a remote operator—and NASA collaborated with Sony to create a PlayStation VR tech demo called Mighty Morphenaut to explore how this might work.


“The hope is that by putting people in an environment where they can look around and move in ways that are much more intuitive than with a mouse and keyboard, it would require less training to understand how to operate the robot and enable quicker, more direct control of the motion,” Garrett Johnson, a software engineer at NASA’s JPL, told me.

The demo is pure simulation, running in real time on a PS4, but it’s built to replicate the challenges that would actually come into play for humanoid robots in space, including the robot’s range of motion and the dreaded time delay.


So long as the human operator is sufficiently tracked—in this case using the PlayStation VR headset and Move controllers—robot control is actually quite simple, as the robot can mimic the motions of the operator given its humanoid design. But even if the robot could keep perfect pace with the operator, the distances involved can introduce communication delays that cause lag between the operator’s input and the robot’s movements.
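To make that mapping concrete, here is a minimal sketch of mimic-based control; all names and the reach limit are illustrative assumptions, not NASA’s or Sony’s actual API. A tracked hand position stands in for a PS Move reading, and targets outside the robot’s reach are clamped to its range of motion:

```python
import numpy as np

REACH_LIMIT_M = 0.8  # assumed reach of the robot's arm, in meters

def clamp_to_workspace(hand_pos: np.ndarray) -> np.ndarray:
    """Pull a tracked hand position back inside the robot's reachable
    sphere, reflecting the range-of-motion limits the demo replicates."""
    dist = np.linalg.norm(hand_pos)
    return hand_pos * (REACH_LIMIT_M / dist) if dist > REACH_LIMIT_M else hand_pos

def control_step(tracked_hand_pos: np.ndarray) -> None:
    """One tick of mimic-based control: the robot follows the operator's hand."""
    target = clamp_to_workspace(tracked_hand_pos)
    print(f"robot hand target: {target}")  # stand-in for sending to the robot's IK

# Operator reaches 1.2 m out; the target is clamped to the 0.8 m reach limit.
control_step(np.array([1.2, 0.0, 0.0]))   # -> robot hand target: [0.8 0. 0.]
```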

Compensating for this time delay is a huge challenge for effectively controlling humanoid robots in space when the operator is back on Earth. So the Mighty Morphenaut demo integrates a time delay mode where the user sees ‘ghost’ hands that move instantly, while the robot’s actual movement follows along after the fact.
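As a rough illustration of how such a mode can work (the function names and delay value are assumptions, not taken from the demo), the sketch below draws the ghost hand immediately while buffering each command until its simulated transit time has elapsed:

```python
import time
from collections import deque

ONE_WAY_DELAY_S = 3.0   # illustrative fixed delay, not the demo's actual value

# Stand-ins for the demo's rendering and robot-driving calls:
def draw_ghost_hand(pose): print("ghost hand at", pose)
def move_robot_hand(pose): print("robot hand at", pose)

pending = deque()       # (send_time, hand_pose) commands still 'in flight'

def on_operator_input(hand_pose):
    """Called each frame with the operator's tracked hand pose."""
    draw_ghost_hand(hand_pose)                     # instant local feedback
    pending.append((time.monotonic(), hand_pose))  # queue for the 'robot'

def update_robot():
    """Execute any queued commands whose simulated transit time has elapsed."""
    now = time.monotonic()
    while pending and now - pending[0][0] >= ONE_WAY_DELAY_S:
        _, hand_pose = pending.popleft()
        move_robot_hand(hand_pose)                 # delayed 'real' motion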

“I’m pretty good at it because I’ve done it a lot,” said Richard Marks, head of Sony’s Magic Lab. “Usually when we put people in [at first with the time delay] they can’t do anything.”

And that’s exactly what would be hoped for; the instant feedback helps the operator make use of innate hand-eye coordination, and enough time training with the system can make a huge difference in their ability to compensate for the time delay, as Marks demonstrates.


Marks told me that while this demo is a simulation, it should be entirely possible to overlay the ghost hand visualization onto real footage, making this technique a possible solution to one of the key challenges of operating dexterous space robots effectively.

But they’ll have to go further before it’s perfect. Johnson notes that one major piece of feedback from the tech demo is that the ghost hands enhanced the understanding of movement, but interaction with objects in motion was still difficult.

“[With Mighty Morphenaut] we were able to explore a possible solution and I think our application worked well to demonstrate the problems of operating with delayed communication,” he told me. “However, even in our simulation, there are still a number of problems to solve. With time delay, anticipating the motion of a floating object makes it nearly impossible to interact, so further research might include ways to help users predict that kind of motion.”
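One conceivable direction (sketched speculatively here, not drawn from NASA’s work) is to dead-reckon a drifting object forward by the round-trip delay, assuming roughly constant velocity in microgravity, and render a predicted ghost of the object where it will be when the robot’s command arrives:

```python
import numpy as np

def predict_drift(p_prev: np.ndarray, p_now: np.ndarray,
                  dt: float, horizon: float) -> np.ndarray:
    """Extrapolate a free-floating object's position `horizon` seconds ahead,
    assuming constant velocity between the two most recent observations."""
    velocity = (p_now - p_prev) / dt
    return p_now + velocity * horizon

# Object seen drifting at +0.5 m/s along x (two samples 0.1 s apart); a
# predicted ghost could be drawn where it will be after an assumed 6 s
# round trip:
ghost_pos = predict_drift(np.array([0.00, 0.2, 0.0]),
                          np.array([0.05, 0.2, 0.0]),
                          dt=0.1, horizon=6.0)
print(ghost_pos)   # [3.05 0.2  0.  ]
```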

NASA, as it turns out, has been actively experimenting with modern VR tech to solve challenges relating to space exploration. The organization has explored uses for the Oculus Rift, Virtuix Omni, and Microsoft’s HoloLens, among others, and of course has a long history of using VR systems of an earlier era for training.



  • crim3

    Just because I like to be pedantic… I wonder what use case they are aiming for, because the time delay when operating a robot on Mars can easily surpass 30 minutes (round trip) near conjunction. Maybe things like the Moon, I guess (only about 3 seconds of delay).
    Not an expert, though. I’d be very glad to stand corrected if the numbers are wrong.

    • Galbalan

      I’m only imagining a possible scenario here. After applying trained deep learning to a robot, a human on Earth could use VR to look at a long-delayed image of an environment captured in 3D. Then, using VR, the scientist could simulate some complex maneuver that needs to be accomplished and send the expected simulation to the robot.
      The robot would then have to deal with any on-the-fly changes and issues in the scenario using its current DNN and knowledge.

    • palasta .

      Whether those numbers are accurate isn’t that important, since you’re basically right. However, this kind of remote control isn’t bound to be used only from a stationary base on Earth. It could be very useful on a manned mission to Mars, where the crew controls robots from orbit or the surface. Or simply in Earth’s orbit, when there is maintenance to be done.

  • Laer Carroll

    Crim3, you are quite correct. Moon round-trip communication suffers a minimum delay of slightly less than 3 seconds due to light-speed and distance limits (a quick check of these figures follows this comment). Plus there will be internal delays in electronic processing, a challenge which engineers like me must solve. However, this does not make it impossible to operate a drone truck, construction vehicle, or humanoid robot on the Moon. It just means we have to find ways to compensate for the problem.

    Part of the solution is training operators, as shown with the ghost hands approach. Part is giving robots partial autonomy. Aerial drones today, for instance, increasingly take off and land automatically.

    The Moon example is especially important. Before we go to Mars or other planets, it’s important for several reasons to establish a Moon base. Much or most of building the first habitat will be done by drones. It’s cheaper to get them to the Moon since they don’t require life support, incur no risk of human fatalities, and can be much smaller than human-sized.
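For reference, here is a quick back-of-the-envelope check of the light-time figures discussed in this thread (distances are approximate; processing delays are ignored):

```python
C_M_PER_S = 299_792_458          # speed of light
MOON_DIST_M = 384_400e3          # mean Earth-Moon distance
MARS_FAR_M = 2.5 * 1.496e11      # Earth-Mars distance near conjunction (~2.5 AU)

def round_trip_s(distance_m: float) -> float:
    """Round-trip light time: command out plus telemetry back."""
    return 2 * distance_m / C_M_PER_S

print(f"Moon round trip: {round_trip_s(MOON_DIST_M):.1f} s")              # ~2.6 s
print(f"Mars round trip (far): {round_trip_s(MARS_FAR_M) / 60:.0f} min")  # ~42 min
```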

  • :-)
    The robot – avatar is only it: http://streltsovaleks.narod.ru

  • We are just a few short years away from a revolution. The cyberdelic era!