The distance separating Earth from Mars can reach some 102 million kilometers, while the Moon is 384,400 kilometers away. A mishap at such distances can be fatal for astronauts. To catch problems early, Microsoft trained an artificial intelligence (AI) model that detects flaws in spacesuits.
Suits, helmets and gloves are essential garments that let astronauts carry out space explorations and withstand hazards such as temperature swings, elevated radiation and vacuum.
With long, constant use, the garments wear and tear from friction. A small leak in their structure can lead to physical problems that are very difficult to solve far from Earth.
Given this scenario, NASA photographs the spacesuit gloves during and after each spacewalk and sends the images for inspection. On Earth, NASA scientists examine the photos for damage and then send the results back to the astronauts on the ISS (International Space Station).
The drawback is the delay of the round trip. A “Hello” sent between deep space and Earth can take up to 20 minutes to arrive, and the same again coming back, so an exchange takes about 40 minutes. That is too long a wait for an astronaut who must determine whether their glove is still fit for the mission.
To solve this, a Microsoft team working with NASA scientists and Hewlett Packard Enterprise (HPE) engineers is developing a system that uses AI and a catalog of gloves.
“What we’ve shown is that we can perform AI and processing in the ISS environment and analyze gloves in real time,” said Ryan Campbell, software engineer at Microsoft Azure Space.
To create the “onboard glove monitor,” the NASA team collected new gloves in good condition as well as gloves that had been worn during spacewalks and ground training.
They then photographed the gloves that showed damage and labeled specific types of wear: areas where the outer silicone rubber coating had started to peel away, or places where the Vectran layer was compromised.
This was done using Azure Cognitive Services Custom Vision: NASA engineers opened the glove images in a web browser and clicked on examples of damage.
That labeled data was used to train a model in the Microsoft Azure cloud, and its results were compared against NASA’s actual damage reports and images. The tool generates a probability score for possible damage at a particular spot on the glove.
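A model of this kind returns a confidence score per labeled region, and downstream logic decides which regions count as damaged. The sketch below is purely illustrative: the region names, damage labels, data structure and threshold are assumptions for demonstration, not NASA’s or Microsoft’s actual code.

```python
# Illustrative sketch only: tag names, regions and the threshold
# are hypothetical, not the actual system's values.
from dataclasses import dataclass

@dataclass
class RegionPrediction:
    region: str        # e.g. a fingertip or palm area of the glove
    damage_type: str   # label assigned during training
    probability: float # model confidence that damage is present

DAMAGE_THRESHOLD = 0.80  # hypothetical cutoff for flagging a region

def flag_damaged_regions(predictions):
    """Keep only regions whose damage probability meets the threshold."""
    return [p for p in predictions if p.probability >= DAMAGE_THRESHOLD]

predictions = [
    RegionPrediction("right_index_fingertip", "silicone_peeling", 0.93),
    RegionPrediction("left_palm", "vectran_abrasion", 0.41),
]
flagged = flag_damaged_regions(predictions)
for p in flagged:
    print(f"{p.region}: {p.damage_type} ({p.probability:.0%})")
```

Thresholding the score this way is a common pattern with image-classification services: the model only reports how likely damage is, and the application chooses the cutoff that balances false alarms against missed wear.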
“Because we are literally next to the astronaut when we do the processing, we can run our tests faster than images can be sent back to Earth.”
At the end of each spacewalk, crew members photograph the astronauts’ gloves as they remove their spacesuits in the airlock.
These images are sent immediately to HPE’s Spaceborne Computer-2 aboard the ISS, where the glove analyzer quickly scans them for signs of damage. If damage is detected, a report is sent straight back to Earth for NASA teams to review.
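The final step, packaging onboard findings into a compact report for downlink, can be sketched as follows. The field names, JSON shape and the glove identifier are illustrative assumptions, not the actual onboard format.

```python
# Hedged sketch of the report step only: field names and JSON shape
# are illustrative assumptions, not the real downlink format.
import json
from datetime import datetime, timezone

def build_damage_report(glove_id, detections):
    """Package flagged detections into a compact JSON report for downlink.

    `detections` is a list of (region, damage_type, probability) tuples
    produced by the onboard analysis; called only when it is non-empty.
    """
    return json.dumps({
        "glove_id": glove_id,
        "generated_utc": datetime.now(timezone.utc).isoformat(),
        "findings": [
            {"region": region, "type": damage_type, "probability": round(prob, 2)}
            for region, damage_type, prob in detections
        ],
    })

report = build_damage_report(
    "EMU-GLOVE-R-042",  # hypothetical identifier
    [("right_index_fingertip", "silicone_peeling", 0.93)],
)
```

Keeping the report small and structured matters here: only a short summary needs to cross the bandwidth-limited space-to-ground link, while the full-resolution images stay on the station’s computer.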
Currently, the damage-assessment tool developed by NASA, Microsoft and HPE is in a testing stage: it runs analyses on the gloves, but it is not yet used to make critical safety decisions.
While the project is new, NASA is exploring ways to extend the technology to locate damage in other critical components, such as docking adapters.
In addition, Microsoft HoloLens 2 could help astronauts quickly perform a visual scan of their gloves for damage, or even facilitate assisted repairs of complicated machinery.