Mars Perseverance Panoramic Image for Self-Determination Mission Algorithm
- 1: O. B. Swida, Leiden University, Netherlands (swida@strw.leidenuniv.nl)
- 2: B. Foing, Leiden University, Netherlands (foing@strw.leidenuniv.nl)
- 3: C. Vleugels, Leiden University, Netherlands (vleugels@strw.leidenuniv.nl)
Aiming to unravel the astrobiology of Mars, the Perseverance mission arrived with many unknowns. With the surface knowledge already available, the High Resolution Imaging Science Experiment (HiRISE) can identify observation or experiment sites from images generated by the orbiter. Although that resolution is high, resolving objects about one metre in size, far more detail can be expected from ground-level observation.
The Mars Perseverance rover carries a pair of Mastcam-Z cameras arranged to mimic human binocular vision, enabling depth determination in image processing. The instruments capture stereo colour images at ground level, which can be used to build detailed, high-precision maps of the Martian surface scenery.
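As a rough sketch of the triangulation that underlies stereo depth determination, the distance to a feature follows from the disparity between the two camera views. The focal length and baseline values below are illustrative placeholders, not Mastcam-Z specifications:

```python
def depth_from_disparity(disparity_px, focal_px=4800.0, baseline_m=0.244):
    """Triangulate depth (metres) from stereo disparity (pixels).

    depth = focal_length * baseline / disparity; the two camera
    parameters here are assumed example values, not instrument specs.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A feature shifted by 12 px between the left and right frames:
print(depth_from_disparity(12.0))  # prints 97.6 (metres)
```

Applying this relation per pixel over a matched stereo pair yields the dense depth map from which the surface maps are built.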
Building and analysing these images manually on Earth can take days. Utilising machine learning tools and on-site computation could save the mission considerable time. The current model used on Mars Perseverance is AutoNav Mark 4, which handles many tasks, including spacecraft positioning, in-flight orbit determination, target tracking, and ephemeris calculations, all of which can be computationally expensive. Therefore, the aim of this research is to develop a simple algorithm for object and slope determination whose output feeds into an autonomous path-determination process. The data fed into the algorithm are panoramic images captured by the Mastcam-Z mounted on Mars Perseverance.
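A minimal sketch of one way the slope-determination step could work on an elevation grid derived from the stereo depth data. The grid, cell size, and function name are hypothetical illustrations under stated assumptions, not the mission's actual algorithm:

```python
import math

def slope_deg(elevation, cell_size_m=1.0):
    """Per-cell surface slope (degrees) from a gridded elevation map.

    Uses central differences where possible (one-sided at the edges),
    then slope = atan(|gradient|). Grid values are metres above a datum.
    """
    rows, cols = len(elevation), len(elevation[0])
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            i0, i1 = max(i - 1, 0), min(i + 1, rows - 1)
            j0, j1 = max(j - 1, 0), min(j + 1, cols - 1)
            dz_dy = (elevation[i1][j] - elevation[i0][j]) / ((i1 - i0) * cell_size_m)
            dz_dx = (elevation[i][j1] - elevation[i][j0]) / ((j1 - j0) * cell_size_m)
            out[i][j] = math.degrees(math.atan(math.hypot(dz_dx, dz_dy)))
    return out

# A uniform 1 m rise per 1 m cell gives a 45-degree slope everywhere:
ramp = [[0.0, 1.0, 2.0], [0.0, 1.0, 2.0], [0.0, 1.0, 2.0]]
print(slope_deg(ramp)[1][1])  # prints 45.0
```

Cells whose slope exceeds a traversability threshold could then be marked as obstacles for the path-determination step.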
How to cite: Swida, O. B., Foing, B., and Vleugels, C.: Mars Perseverance Panoramic Image for Self-Determination Mission Algorithm, EGU General Assembly 2023, Vienna, Austria, 24–28 Apr 2023, EGU23-16941, https://doi.org/10.5194/egusphere-egu23-16941, 2023.