NASA: A.I. for navigation on the Moon

A system developed by NASA engineer Alvin Yew would provide an A.I.-based localization service for navigating on the Moon
The ridges, craters, and boulders that form the lunar horizon can be used by an A.I. to accurately locate a lunar traveler. A system developed by research engineer Alvin Yew would provide a backup localization service for future explorers, whether robotic or human. (NASA)

Landmarks can provide explorers with orientation when their GPS devices lose signal. A NASA engineer is training an A.I. to use lunar horizon features for navigation on the Moon.

GPS for Future Explorations

“For safety and scientific geotagging, it’s crucial for explorers to know exactly where they are while navigating the lunar landscape,” stated Alvin Yew, a research engineer at NASA’s Goddard Space Flight Center. “Equipping an onboard device with a local map would support any mission, whether robotic or human.”

NASA is collaborating with international agencies to develop a communication and navigation framework for the Moon. LunaNet will bring Internet-like capabilities to the Moon, including localization services. However, explorers in some regions of the lunar surface may need localization solutions from multiple sources; independent backups would keep them safe in cases where communication signals are unavailable.

“Having reliable backup systems is fundamental for human exploration of other celestial bodies,” Yew said. “The motivation is also to enable the exploration of lunar craters, where the entire visible horizon would be the crater’s rim.”

LOLA Data

Yew began with data from NASA’s Lunar Reconnaissance Orbiter, specifically the Lunar Orbiter Laser Altimeter (LOLA). The instrument measures lunar surface slopes and roughness and generates high-resolution topographic maps. Using LOLA’s digital elevation models, Yew is training an A.I. to recreate lunar horizon features as they would appear to an explorer on the surface. These digital panoramas can then be matched against the boulders and ridges visible in images captured by a rover or astronaut, yielding a precise location.
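The matching idea can be sketched in a few lines: render the horizon's elevation-angle profile from candidate positions on a digital elevation model, then pick the candidate whose rendered profile best fits the observed one. This is a toy illustration under invented assumptions (a small synthetic grid DEM, simple ray marching, mean-squared-error matching), not NASA's implementation.

```python
import numpy as np

def horizon_profile(dem, cell_size, pos, n_az=72, max_range=50):
    """Elevation angle (radians) of the horizon in each azimuth bin,
    as seen from grid position `pos` on the digital elevation model."""
    rows, cols = dem.shape
    r0, c0 = pos
    h0 = dem[r0, c0]
    profile = np.zeros(n_az)
    for i, az in enumerate(np.linspace(0, 2 * np.pi, n_az, endpoint=False)):
        best = 0.0
        # March outward along this azimuth, keeping the steepest sight line.
        for d in range(1, max_range):
            r = int(round(r0 + d * np.cos(az)))
            c = int(round(c0 + d * np.sin(az)))
            if not (0 <= r < rows and 0 <= c < cols):
                break
            ang = np.arctan2(dem[r, c] - h0, d * cell_size)
            best = max(best, ang)
        profile[i] = best
    return profile

def locate(dem, cell_size, observed, candidates):
    """Return the candidate position whose rendered horizon profile
    has the lowest mean-squared error against the observed profile."""
    errs = [np.mean((horizon_profile(dem, cell_size, p) - observed) ** 2)
            for p in candidates]
    return candidates[int(np.argmin(errs))]
```

In practice the search would run over a continuous pose space with a real LOLA-derived terrain model and image-extracted horizons, but the core loop, "render, compare, pick the best fit," is the same.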

“Conceptually, it’s like going outside and trying to figure out where you are by observing the horizon and surrounding landmarks,” Yew explained. “While a rough estimate of position might be easy for a person, we aim to demonstrate on-the-ground accuracy within 9 meters. This precision opens the door to a wide range of mission concepts for future exploration.”

According to research published by Erwan Mazarico, a lunar explorer can see at most about 300 kilometers in any direction from an unobstructed position on the Moon. Even on Earth, Yew’s localization technology could assist explorers on terrain where GPS signals are obstructed or subject to interference.

Optical Navigation GIANT

Yew’s geolocation system will leverage the capabilities of GIANT (Goddard Image Analysis and Navigation Tool). This optical navigation tool, developed primarily by engineer Andrew Liounis, has already been used to process navigation data in NASA’s OSIRIS-REx mission, which collected a sample from the asteroid Bennu.

Unlike radar or laser tools, GIANT rapidly and accurately analyzes images to measure distances between visible landmarks. Its portable version, cGIANT, is a derivative library of the Autonomous Guidance and Navigation Control (autoGNC) system, which provides mission autonomy solutions for all phases of spacecraft and rover operations. Combining an A.I.’s interpretation of visual panoramas with a known terrain model could give future explorers a powerful tool for navigating the Moon or other planets.
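Measuring the separation between landmarks in an image, as the article describes, reduces to simple camera geometry: two pixel positions are converted into sight-line rays, and the angle between the rays is the landmarks' angular separation. The sketch below assumes an idealized pinhole camera with a known focal length in pixels; it illustrates the geometry only and is not GIANT's actual API.

```python
import numpy as np

def angular_separation(px_a, px_b, focal_px, principal=(0.0, 0.0)):
    """Angle in radians between two image points, assuming a pinhole
    camera with focal length `focal_px` (pixels) and the given
    principal point (image center)."""
    def ray(p):
        # Sight-line direction through pixel p, as a unit vector.
        v = np.array([p[0] - principal[0], p[1] - principal[1], focal_px],
                     dtype=float)
        return v / np.linalg.norm(v)
    a, b = ray(px_a), ray(px_b)
    return np.arccos(np.clip(a @ b, -1.0, 1.0))
```

With the angular separation in hand and landmark positions known from a terrain model, the camera's distance and bearing can be recovered by triangulation.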
