Steering by Landmarks – On the Moon
Much like how familiar landmarks can give travelers a sense of direction, a NASA engineer is teaching a machine to use horizon features to navigate on the Moon.
Goddard research engineer Alvin Yew starts with digital elevation models from sources such as the Lunar Orbiter Laser Altimeter (LOLA) aboard the Lunar Reconnaissance Orbiter. He then uses those models to recreate features on the horizon as they would appear to an explorer on the lunar surface. Those digital panoramas can be correlated with known boulders and ridges visible in pictures of the lunar horizon taken by a rover or astronaut, providing accurate location identification for any given region.
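The idea of rendering a horizon profile from elevation data and matching it against an observed one can be sketched in a few lines. This is an illustrative toy, not Yew's actual algorithm: the DEM grid, viewpoint format, and circular cross-correlation matcher here are all assumptions for the sake of the example.

```python
import numpy as np

def horizon_profile(dem, origin, n_azimuths=360, max_range=200, step=1.0):
    """Render a horizon elevation-angle profile (radians) around a
    viewpoint on a gridded DEM. `origin` = (row, col, height); grid
    cells are assumed square with side `step` (toy convention)."""
    r0, c0, h0 = origin
    profile = np.full(n_azimuths, -np.pi / 2)  # start below any terrain
    for i, az in enumerate(np.linspace(0, 2 * np.pi, n_azimuths, endpoint=False)):
        for d in np.arange(step, max_range, step):
            r = int(round(r0 + d * np.cos(az)))
            c = int(round(c0 + d * np.sin(az)))
            if not (0 <= r < dem.shape[0] and 0 <= c < dem.shape[1]):
                break  # walked off the local terrain patch
            # Elevation angle of this terrain cell above the viewpoint
            ang = np.arctan2(dem[r, c] - h0, d)
            profile[i] = max(profile[i], ang)
    return profile

def best_heading_offset(observed, rendered):
    """Circularly cross-correlate two horizon profiles and return the
    azimuth shift (in samples) that best aligns them."""
    obs = observed - observed.mean()
    ren = rendered - rendered.mean()
    scores = [np.dot(obs, np.roll(ren, k)) for k in range(len(ren))]
    return int(np.argmax(scores))
```

Repeating the rendering step over a grid of candidate viewpoints and keeping the one whose profile correlates best with the camera-derived profile turns this into a crude position fix; a real system would also have to handle camera pointing, field of view, and lunar curvature.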
“Conceptually, it’s like going outside and trying to figure out where you are by surveying the horizon and surrounding landmarks,” Yew said. “While a ballpark location estimate might be easy for a person, we want to demonstrate accuracy on the ground down to less than 30 feet. This accuracy opens the door to a broad range of mission concepts for future exploration.”
Making efficient use of LOLA data, a handheld device could be programmed with a local subset of terrain and elevation data to conserve memory. According to work published by Goddard researcher Erwan Mazarico, a lunar explorer can see at most up to about 180 miles (300 kilometers) from any unobstructed location on the Moon. Even on Earth, Yew’s location technology could help explorers in regions where Global Positioning System (GPS) signals are not dependable, or during solar storms that interfere with GPS accuracy.
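The roughly 180-mile sight-line figure can be sanity-checked with the smooth-sphere horizon formula, d = sqrt(2Rh + h^2). This back-of-envelope check is my addition, not Mazarico's method, which is based on actual LOLA topography: at eye level the smooth-sphere horizon is only a few kilometers away, so sight lines of hundreds of kilometers require high terrain at one or both ends (mutual horizon distances add).

```python
import math

LUNAR_RADIUS_M = 1_737_400  # mean lunar radius in meters

def horizon_distance_m(height_m, radius_m=LUNAR_RADIUS_M):
    """Straight-line distance to the horizon on a smooth sphere for an
    observer `height_m` above the surface: d = sqrt(2*R*h + h^2)."""
    return math.sqrt(2 * radius_m * height_m + height_m ** 2)

# Astronaut eye level (~1.7 m): horizon only ~2.4 km away on a smooth Moon.
# A vantage point several kilometers up extends that to over 100 km, and
# two elevated endpoints can together span hundreds of kilometers.
```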
“Equipping an onboard device with a local map would support any mission, whether robotic or human,” Yew said. “For safety and science geotagging, it’s important to know exactly where they are.”
Yew’s geolocation system will leverage the capabilities of GIANT, the Goddard-developed Image Analysis and Navigation Tool. This optical navigation tool, developed primarily by Goddard engineer Andrew Liounis, independently verified navigation data processed by the primary navigation team for the Origins, Spectral Interpretation, Resource Identification, Security, Regolith Explorer (OSIRIS-REx) mission (CuttingEdge, Summer 2021, Page 15).
In contrast to radar or laser-ranging tools that pulse radio waves or light at a target and analyze the returning signals, GIANT quickly and accurately analyzes images to measure the distance to and between visible landmarks. cGIANT, the portable flight version of GIANT, is a derivative library for Goddard’s autonomous Navigation Guidance and Control system (autoGNC), which provides mission autonomy solutions for all stages of spacecraft and rover operations.
Isolated regions on the lunar surface may require overlapping navigation solutions derived from multiple sources, including Yew’s technology, to assure explorer safety.
To support lunar exploration, NASA is working with industry and other international agencies to develop an interoperable communications and navigation architecture on the Moon: LunaNet. LunaNet will bring “internet-like” capabilities to the Moon and provide astronauts, rovers, orbiters, and more with networking, navigation, and situational awareness services.
“It’s critical to have dependable backup systems when we’re talking about human exploration,” Yew said. “The motivation for me was to enable lunar crater exploration, where the entire horizon would be the crater rim.”