In space exploration, the autonomous landing of a space vehicle on the surface of a celestial destination under a wide range of surface texture and lighting conditions is a formidable task. The space vehicle must detect and avoid hazards and touch down softly and accurately at the intended location. Effective terrain relative navigation can significantly improve the safety and success rate of crewed and robotic space vehicles designed to land on the Moon, Mars, and other near-Earth objects, including asteroids. Precision landing is best accomplished by collecting and fusing a synergistic set of measurements that provide surface situational awareness to refine the guidance and navigation solution leading to touchdown. Potential sources of independent and synergistic measurements need to be investigated for terrain relative navigation. In this paper, we discuss a process for establishing the required sensing conditions and sensors for terrain relative navigation in space exploration applications. We present an adaptive fusion approach for generalized terrain relative navigation that is applicable to various landing missions using different sensors. Our adaptive navigation approach provides timely fusion of all measurements that may be available at any given time during the landing mission, and it includes an abstraction software layer for all sensor measurements that enables plug-and-play of any sensors and IMU devices. The approach is a software solution that can be easily integrated on existing and future platforms supporting a wide range of space landing missions. An overview of the simulation and the results of a flight demonstration of the methodology are provided in this paper.
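The plug-and-play sensor abstraction and timely fusion of whatever measurements happen to be available can be illustrated with a minimal sketch. The `Sensor` interface, the example sensors, and the inverse-variance fusion rule below are illustrative assumptions for a one-dimensional altitude estimate, not the paper's actual implementation:

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Measurement:
    value: float      # 1-D altitude estimate, for illustration
    variance: float   # measurement noise variance


class Sensor(ABC):
    """Abstraction layer: any sensor plugs in by implementing read()."""

    @abstractmethod
    def read(self, t: float) -> Optional[Measurement]:
        """Return a measurement at time t, or None if unavailable."""


class Altimeter(Sensor):
    """Hypothetical altimeter: always available, low noise."""

    def read(self, t: float) -> Optional[Measurement]:
        return Measurement(value=100.0 - 5.0 * t, variance=0.5)


class TerrainCamera(Sensor):
    """Hypothetical terrain camera: drops out in poor lighting."""

    def read(self, t: float) -> Optional[Measurement]:
        if t > 3.0:  # assumed dropout condition, for illustration only
            return None
        return Measurement(value=100.0 - 5.0 * t, variance=2.0)


def fuse(measurements: List[Measurement]) -> Measurement:
    """Inverse-variance weighted fusion of whatever is available."""
    inv_vars = [1.0 / m.variance for m in measurements]
    total = sum(inv_vars)
    value = sum(m.value / m.variance for m in measurements) / total
    return Measurement(value=value, variance=1.0 / total)


def navigate_step(sensors: List[Sensor], t: float) -> Optional[Measurement]:
    """One navigation update: poll every sensor, fuse the responders."""
    available = [m for m in (s.read(t) for s in sensors) if m is not None]
    return fuse(available) if available else None
```

Because every sensor hides behind the same `read()` interface, adding or removing a device never changes the fusion code, and `navigate_step` degrades gracefully when a sensor stops reporting.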