How Can AI Assist Humans in Space Exploration? NASA Astronauts Tell Us What They Think.

Part 1: Human Biology and Helpful Robots

By MaryLiz Bender 

From left to right: Nicole Stott, Tom Jones, Story Musgrave. Courtesy of NASA

During my time as a User Experience professional, I learned to never make any assumptions. If you’re trying to solve a problem, go to the source — conduct your research with the end user. So, when I was asked how Artificial Intelligence (AI) could assist humans in space exploration, I redirected the question to experienced NASA Astronauts Nicole Stott, Thomas Jones, and Story Musgrave.

Story Musgrave is a veteran of six space shuttle flights, with more than 1,280 hours in space. Nicole Stott flew two spaceflight missions, including a spacewalk and a 91-day mission conducting scientific research aboard the International Space Station (ISS). Thomas Jones spent a collective 53 days in space across four space shuttle flights.

In three separate interviews, we explored the tedious, redundant, and dangerous tasks that AI could best take over, and what AI could mean for our future.

In this May 2017 photo, NASA astronaut Jack Fischer works outside the U.S. Destiny laboratory module of the International Space Station. Courtesy of NASA

Having spent extensive time in space, Story Musgrave is intimately familiar with the types of challenges spacefarers face, both mental and physical. “In terms of the biological system, obviously we were not designed to be there,” he points out. We discussed a few ways that AI could help condition humans on their long-duration space flights. In fact, Story designed and proposed a pre-flight conditioning system 30 years ago, though it has yet to be acted upon. He explains, “Seventy percent of astronauts get ill going into space. Fifty percent are ill to the point of vomiting. ... The cause is that the system is unable to cope with the free-fall, zero-G condition after four billion years of evolution here.”

Tom Jones reminded me that many missions require an astronaut to complete long, tedious, multi-step tasks. He suggested a “robot could keep track of where you are in the procedure and verbally allow you to ask, ‘What’s the next step?’ Or, have the robot keep track of how far you’ve gotten into the procedure so if you’ve gotten off on a tangent, or are dealing with a problem, or making a great discovery — when you come back to it, the robot remembers where you are or how far you’ve gotten.” This could be especially useful when we’re completing missions in deep space, or on another planet, where we can’t quickly communicate with mission control.
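What might that procedure-keeper look like in software? Here is a minimal sketch in Python; the class interface and checklist steps are invented for illustration, not drawn from any real flight software.

```python
# A minimal sketch of Tom's procedure-tracking assistant: it remembers
# the crew member's place in a checklist even if they step away to deal
# with a problem (or make a great discovery). All names are illustrative.

class ProcedureTracker:
    def __init__(self, steps):
        self.steps = list(steps)
        self.current = 0  # index of the next step to perform

    def next_step(self):
        """Answer the question 'What's the next step?'"""
        if self.current >= len(self.steps):
            return "Procedure complete."
        return f"Step {self.current + 1}: {self.steps[self.current]}"

    def mark_done(self):
        """Advance once the crew confirms the current step is finished."""
        if self.current < len(self.steps):
            self.current += 1

    def resume(self):
        """Remind the crew where they left off after an interruption."""
        return f"You have completed {self.current} of {len(self.steps)} steps."

eva_prep = ProcedureTracker([
    "Verify suit pressure",
    "Purge airlock",
    "Check tether attachment",
])
print(eva_prep.next_step())  # Step 1: Verify suit pressure
eva_prep.mark_done()
print(eva_prep.resume())     # You have completed 1 of 3 steps.
```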

Story Musgrave was on a similar wavelength. “If you add AI to that, then you get a friendly relationship between the biological system and the digital system, ... [and] the digital system gets to know the human that it’s playing with. It gets to know its habit patterns, ... ‘How does this person react, how do they try to control me, the digital system? How can I be friendly? How do I get to know them?’” How would you use Deep Learning speech recognition to give an AI the ability to hold a two-way conversation with astronauts, and to learn from those conversations?
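One way to prototype that two-way conversation on the ground is to layer off-the-shelf speech libraries over the procedure tracker sketched above. The loop below assumes the SpeechRecognition and pyttsx3 Python packages (plus PyAudio for microphone access) and reuses the hypothetical `eva_prep` tracker; it is a toy, not flight-grade voice control.

```python
# A hedged sketch of a two-way voice loop using Earth-side libraries:
# SpeechRecognition for speech-to-text and pyttsx3 for text-to-speech.

import speech_recognition as sr
import pyttsx3

recognizer = sr.Recognizer()
voice = pyttsx3.init()

def listen():
    """Capture one utterance from the microphone and return it as text."""
    with sr.Microphone() as source:
        audio = recognizer.listen(source)
    try:
        # recognize_google calls a free web API; a spacecraft would need
        # an offline engine instead.
        return recognizer.recognize_google(audio).lower()
    except sr.UnknownValueError:
        return ""

def say(text):
    voice.say(text)
    voice.runAndWait()

while True:
    command = listen()
    if "next step" in command:
        say(eva_prep.next_step())       # eva_prep from the sketch above
    elif "step complete" in command:
        eva_prep.mark_done()
        say("Logged. " + eva_prep.next_step())
    elif "where was i" in command:
        say(eva_prep.resume())
    elif "stop listening" in command:
        break
```

A real system would also log each exchange, which is where the learning Story describes could begin: the assistant could mine those transcripts for each astronaut’s habit patterns.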

In the International Space Station's Destiny laboratory, NASA astronaut Chris Cassidy, Expedition 36 flight engineer, wears tele-operation gear to telerobotically test Robonaut 2's maneuvers. Courtesy of NASA

Nicole Stott also expanded on the importance of smart interfaces, stating: “The day-to-day operation of the station, or a habitat on the Moon or Mars, or a spacecraft on an extended-duration journey between destinations, will require an intelligent computer system to keep it going.” She adds, “Even on the ISS, we are primarily relying on the computers and MCC [Mission Control Center] interface to support daily operation — [the] crew really only interfaces when a physical task is required.” Nicole continued with other potential areas AI programs could assist with, including “emergency equipment surveys, logistics inventories, air/water quality measurements and evaluation, [and] forward team support for emergency alarm evaluation.” Tom Jones extended that list to encompass “periodic readings of noise levels or CO2 levels, ... temperature readings, [and] inspections for airflow volume where you have to have sufficient ventilation to provide the crew with oxygen.”
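To make one of those surveys concrete, here is a sketch of an automated CO2 watch in plain Python. The 5,000 ppm alarm limit, the sampling window, and the readings are illustrative stand-ins, not NASA values.

```python
# A minimal sketch of an automated housekeeping survey: watch a stream
# of CO2 readings, flag hard-limit violations, and flag drift away from
# the recent trend. All thresholds here are hypothetical.

from collections import deque
from statistics import mean, stdev

WINDOW = 60            # keep the last 60 samples (e.g., one per minute)
CO2_LIMIT_PPM = 5000   # illustrative alarm limit, not a NASA figure

history = deque(maxlen=WINDOW)

def check_co2(ppm):
    """Return an alarm, an advisory, or 'nominal' for one reading."""
    history.append(ppm)
    if ppm > CO2_LIMIT_PPM:
        return f"ALARM: CO2 at {ppm} ppm exceeds the {CO2_LIMIT_PPM} ppm limit"
    if len(history) > 10:
        avg, sd = mean(history), stdev(history)
        if sd > 0 and abs(ppm - avg) > 3 * sd:
            return f"ADVISORY: {ppm} ppm deviates sharply from the recent trend"
    return "nominal"

for reading in (2400, 2450, 2430, 2440, 2410, 5100):
    print(check_co2(reading))
```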

In the monitoring systems described above, there is a slew of possible applications for AI. We could potentially use AI to test the onboard air and water quality, or to detect and monitor noise levels. Enhanced computer vision and object detection could make the emergency equipment evaluation process more efficient. Can you envision how traffic light detection algorithms and neural networks could be adapted to help identify issues on board a space station?
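As a rough starting point, a stock, COCO-pretrained detector can already locate everyday objects in a camera frame; a flight version would need fine-tuning on imagery of station hardware such as fire extinguishers and oxygen masks. This sketch assumes a recent PyTorch and torchvision install, and the frame filename is hypothetical.

```python
# A hedged sketch of on-board equipment spotting with a pretrained
# detector from torchvision. It prints raw COCO class indices; mapping
# them to names, and retraining on station hardware, is left out.

import torch
from torchvision.io import read_image
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import convert_image_dtype

model = fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

# Hypothetical frame from a module surveillance camera.
frame = convert_image_dtype(read_image("module_camera_frame.jpg"))

with torch.no_grad():
    detections = model([frame])[0]

for box, label, score in zip(detections["boxes"],
                             detections["labels"],
                             detections["scores"]):
    if score > 0.8:  # keep only confident detections
        print(f"class {label.item()} at {box.tolist()} (score {score:.2f})")
```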

Tom went on to describe the complexities involved in adjusting the camera and lighting while capturing a cargo ship arriving at the space station: “So, you’ve got your hands on a set of hand controllers, … but you need to have different camera views, as lighting conditions change or as the spacecraft you’re trying to grapple is changing position.” He thinks this is another great application for an intelligent assistant with “voice-activated camera controls and lighting controls” that would react to voice commands such as, “Camera #4, pan left, stop. Zoom in now, stop. Adjust lighting. Open iris/close iris, give me more or less light in the view.” We could use voice recognition to allow astronauts to control a myriad of devices, or perhaps save the oxygen supply if we teach these programs to learn from their experiences and anticipate the proper actions.
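Parsing those commands is the simpler half of the problem. A tiny grammar like the one below, layered under the speech recognition sketched earlier, would turn an utterance into a structured action; the command set is invented for illustration.

```python
# A sketch of turning Tom's camera callouts ("Camera 4, pan left") into
# structured actions with one regular expression. A real grammar would
# be far richer; this only shows the shape of the mapping.

import re

COMMAND = re.compile(
    r"camera\s*(?P<id>\d+),?\s*"
    r"(?P<action>pan left|pan right|zoom in|zoom out|open iris|close iris|stop)"
)

def parse(utterance):
    """Return {'camera': n, 'action': ...} or None if nothing matched."""
    match = COMMAND.search(utterance.lower())
    if not match:
        return None
    return {"camera": int(match["id"]), "action": match["action"]}

print(parse("Camera 4, pan left"))  # {'camera': 4, 'action': 'pan left'}
print(parse("camera 2 zoom in"))    # {'camera': 2, 'action': 'zoom in'}
```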

European Space Agency astronaut Samantha Cristoforetti works with a pair of Synchronized Position Hold, Engage, Reorient, Experimental Satellites, or SPHERES, on the International Space Station. Courtesy of NASA

Tom Jones pointed out:

“The configuration of a humanoid in a free-fall environment is just arbitrary. When you’re moving around the space station or weightless spacecraft, that’s not necessarily the best configuration for latching on.” Nicole agreed, saying, “A robot like Robonaut has limited utility inside a space station — and legs don’t make it more usable. When I think of a robot that would work best inside the ISS, something like the SPHERES units comes to mind — could be very quick to deploy, [and] could surveil in areas that either require repetitive surveillance or in places where an emergency situation is being evaluated or responded to (in order to avoid sending the people into a potentially dangerous situation).”

Tom Jones looked to sci-fi for alternative suggestions:

“Maybe you want to be a spider with a bunch of end effectors that can latch onto things. And you don’t want to be human-sized, if you’re moving around a space station; … the last thing you need is somebody crowding you. So, if that robot can be small, the size of a basketball, and hover on your shoulder, that seems to me much more useful and flexible than having a large humanoid robot. ... You do need a way to anchor. A free-flying robot is neat, but you don’t want it to have to constantly be refueled with nitrogen or compressed air, or having fans running all the time making noise and blowing on your face while you’re working. … [I]t can fly over to where you are and grab on with an arm, like an R2-D2 kind of stick ... to grab onto something.”

Of course, our robot assistants should be designed for the task at hand. During Earth observation missions, Tom Jones was tasked with a good deal of science photography. “Things fly by fairly quickly, so you might have only a 30-second window to get a picture of the target and you want the optimum angle.” He had to hold a map in his hand while looking out the window, trying to orient his perspective with the map and find his target. “[I]f you could have a robot cueing you where to look, so the robot would be aware of which window you’re looking out of, [it] would say, ‘Okay, the target is going to appear here based on your body orientation. … Look at the ten o’clock position, twenty degrees off the center line.’ ... You could imagine putting on some clear glasses that have some kind of a heads-up display ... [showing] a circle where the target is going to appear.” From simply snapping photos of targets to alerting mission control to notable events on the ground, we could fully automate this process via Deep Learning for remote sensing. Perhaps this could remove the need for any kind of physical human interface.
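The geometry behind a callout like that is straightforward. The sketch below reduces it to a flat, two-dimensional case purely for illustration: given the target’s bearing and the direction the astronaut is looking, it produces a clock position and an off-axis angle.

```python
# A worked sketch of the target-cueing callout Tom imagines. Bearings
# are in degrees; the 2-D simplification ignores elevation and window
# geometry entirely.

def cue(target_bearing_deg, view_bearing_deg):
    """Return (clock_position, degrees_off_centerline) for a target."""
    # Signed offset in [-180, 180): negative means left of center.
    off = (target_bearing_deg - view_bearing_deg + 180) % 360 - 180
    hour = round(off / 30) % 12 or 12  # 30 degrees per clock hour
    return hour, abs(off)

hour, off = cue(target_bearing_deg=300, view_bearing_deg=0)
print(f"Look at the {hour} o'clock position, "
      f"{off:.0f} degrees off the center line.")
# Look at the 10 o'clock position, 60 degrees off the center line.
```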

As Story Musgrave put it, “Humans have spent billions of years evolving and adapting to conditions on Earth.” Unlike us, our robot friends can be designed precisely for the space environment, helping us explore farther, faster, and more cost-effectively.

In the second part of this series, we will continue this discussion, focusing on how intelligent robots can help us colonize Mars and worlds beyond. Go to Part Two

In the meantime: How else do you think we could use AI to explore space?

About the Author

MaryLiz Bender

MaryLiz Bender is a user experience designer, digital content coordinator at The Planetary Society, and associate producer for the Planetary Radio podcast. She dedicates her life to space outreach and education, engaging the public through music, art, and her mobile observatory.

Website: http://maryliz.me

