The system would be built around a type of AI known as a Hopfield neural network, which weighs evidence and makes decisions based on previously stored facts and patterns, in a process loosely analogous to human recall. Using digital eyes incorporated into the astronauts' suits, the AI would collect data from the environment and analyze it in Hopfield networks housed on the hips of the suits.
Preloaded data, along with data gathered as the astronauts walk the Martian surface, would be processed by the AI much as a human brain would crunch it. For instance, the Hopfield algorithm can learn a color from a single image and then relate it to previously observed instances of that color, making connections between the two. Recent tests of a complete, wearable prototype suit at the Mars Desert Research Station in Utah showed that the AI could tell the difference between lichen and the rock surrounding it.
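The kind of associative recall described above can be sketched in a few lines of Python. This is a hypothetical toy, not the researchers' actual system: it stores bipolar (+1/-1) patterns with Hebbian learning, then recalls the closest stored pattern from a noisy cue, the way a Hopfield network relates a new observation to ones it has seen before. The "lichen" and "rock" patterns here are made-up stand-ins for real sensor signatures.

```python
def train(patterns):
    """Build the Hopfield weight matrix with the Hebbian rule."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:  # no self-connections
                    w[i][j] += p[i] * p[j] / len(patterns)
    return w

def recall(w, cue, max_sweeps=10):
    """Asynchronously update units until the state settles."""
    state = list(cue)
    n = len(state)
    for _ in range(max_sweeps):
        changed = False
        for i in range(n):
            h = sum(w[i][j] * state[j] for j in range(n))
            s = 1 if h >= 0 else -1
            if s != state[i]:
                state[i] = s
                changed = True
        if not changed:  # reached a stable (stored) pattern
            break
    return state

# Two invented signatures standing in for "lichen" and "rock".
lichen = [1, -1, 1, 1, -1, -1, 1, -1]
rock   = [-1, 1, -1, -1, 1, 1, -1, 1]
w = train([lichen, rock])

# A corrupted observation (one unit flipped) still recalls "lichen".
noisy = [1, -1, 1, -1, -1, -1, 1, -1]
print(recall(w, noisy) == lichen)  # prints True
```

The network has no notion of "lichen" beyond the stored pattern itself; classification emerges because a partial or noisy input falls into the basin of attraction of the nearest memory, which is what makes the approach attractive for recognizing familiar features in unfamiliar terrain.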
But that's just scratching the surface; next, researchers plan to teach the Hopfield network to differentiate between textures, and ultimately to engineer the system to work at scales ranging from sweeping landscapes down to the minuscule. They have plenty of time to do so, as no one plans to send a crewed mission to Mars any time soon. But the data the algorithm is already learning from on Earth could ride along with robotic missions to Mars in the nearer future.