EGU25-16985, updated on 25 Apr 2025
https://doi.org/10.5194/egusphere-egu25-16985
EGU General Assembly 2025
© Author(s) 2025. This work is distributed under
the Creative Commons Attribution 4.0 License.
Experimental Study of Image Matching Techniques for Co-Registration and Data Fusion of Orbital and Ground-Based Planetary Images
Antonia Schriever1, Klaus Gwinner1, Patrick Irmisch2, Thomas Kraft2, and Jörg Brauchle2
  • 1German Aerospace Center (DLR), Institute of Planetary Research, Planetary Geodesy, Rutherfordstr. 2, 12489 Berlin, Germany (Antonia.Schriever@dlr.de)
  • 2German Aerospace Center (DLR), Institute of Optical Sensor Systems, Rutherfordstr. 2, 12489 Berlin, Germany

This study investigates the potential and feasibility of multi-sensor image integration, combining images from orbital or airborne platforms with rover-based images in planetary exploration. We use images taken with the airborne Modular Aerial Camera System (MACS) and with a hand-held camera system, the Integrated Positioning System (IPS), on a steep hillside on Vulcano Island, Sicily. Typical use cases in planetary exploration include robotic navigation, localization of exploration targets, and improved target coverage by combining additional viewpoints.

MACS was operated on a Vertical Take-Off and Landing (VTOL) drone. Its images were used to simulate orbital images overflying the hillside, while the IPS images were used to simulate a ground-based platform (e.g. a rover). The aerial images were taken at a constant flight altitude, resulting in varying distances to the ground due to elevation changes in the terrain. The IPS images were taken with a 16 mm lens camera mounted diagonally upwards relative to the other sensors on the system. The hillside was captured along a path approximately parallel to the steep scarp, maintaining a distance of at least 100 meters from it. Images were taken at multiple stops, each time covering the whole hillside with a swivel movement. This simulates different stereo angles for later experimentation during 3D reconstruction. The resulting dataset comprises images from two different sensors with a high viewpoint discrepancy. We evaluate image-matching performance on this dataset with respect to, among other factors, varying scale, resolution, and wavelength. We also investigate terrain-specific influences on matching quality, as well as methods to overcome the viewpoint discrepancy between MACS and IPS images, which results in large image parallaxes and locally varying differences in image resolution. Lastly, we evaluate the quality of the resulting 3D reconstruction and of the estimated intrinsic and extrinsic camera parameters, as well as the possibility of improving them.

We compare the results to earlier tests based on orbital planetary images, using various matching methods. An evaluation focused on feature-based matching approaches was carried out on images from the High Resolution Stereo Camera (HRSC) on ESA's Mars Express mission [1, 2]. Analysis of multiple performance metrics showed that different methods excel under different criteria, and emphasized the need for a certain richness of features to ensure precise and accurate points. A majority of feature detectors reach subpixel accuracy, but only in feature-rich images; otherwise, cross-correlation points used as input to least-squares matching (LSM) [3] perform better.
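As an illustration of the cross-correlation-plus-LSM chain referred to above, the sketch below implements a toy version in NumPy: exhaustive normalized cross-correlation yields an integer-pixel match candidate, which a Gauss-Newton least-squares refinement then sharpens to subpixel accuracy. The function names (`ncc_match`, `lsm_refine`, `bilinear`) are ours, and the model is translation-only, without the affine geometric and radiometric parameters of a full LSM implementation such as [3].

```python
import numpy as np

def ncc_match(image, template):
    """Exhaustive normalized cross-correlation; returns the integer
    (row, col) position of the best match of `template` in `image`."""
    h, w = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t * t).sum())
    best_score, best_pos = -2.0, (0, 0)
    for i in range(image.shape[0] - h + 1):
        for j in range(image.shape[1] - w + 1):
            win = image[i:i + h, j:j + w]
            wz = win - win.mean()
            denom = np.sqrt((wz * wz).sum()) * t_norm
            if denom == 0.0:
                continue
            score = (wz * t).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (i, j)
    return best_pos

def bilinear(img, y, x):
    """Bilinear interpolation of `img` at a fractional (y, x) position."""
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    dy, dx = y - y0, x - x0
    return ((1 - dy) * (1 - dx) * img[y0, x0] +
            (1 - dy) * dx * img[y0, x0 + 1] +
            dy * (1 - dx) * img[y0 + 1, x0] +
            dy * dx * img[y0 + 1, x0 + 1])

def lsm_refine(template, search, y0, x0, iters=10):
    """Gauss-Newton least-squares refinement of an integer match
    (y0, x0) under a translation-only model; returns subpixel (y, x)."""
    h, w = template.shape
    dy = dx = 0.0
    for _ in range(iters):
        # Resample the search image at the current shift estimate.
        patch = np.array([[bilinear(search, y0 + dy + i, x0 + dx + j)
                           for j in range(w)] for i in range(h)])
        gy, gx = np.gradient(patch)            # image gradients = Jacobian
        A = np.stack([gy.ravel(), gx.ravel()], axis=1)
        r = (template - patch).ravel()         # intensity residuals
        step, *_ = np.linalg.lstsq(A, r, rcond=None)
        dy += step[0]
        dx += step[1]
        if np.hypot(step[0], step[1]) < 1e-4:  # converged
            break
    return y0 + dy, x0 + dx
```

In this sketch the NCC stage provides the pixel-level candidate that seeds the LSM iteration, mirroring the role of cross-correlation points as LSM input described above; a production matcher would additionally estimate affine shape and brightness/contrast terms per patch.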

[1] Schriever, A. and Gwinner, K., EPSC 2024, DOI: 10.5194/epsc2024-986

[2] Jaumann, R., et al. PSS 55 (7–8), 928–952 (2007) DOI: 10.1016/j.pss.2006.12.003.

[3] Gwinner, K., et al. PE&RS 75 (9), 1127–1142 (2009), DOI: 10.14358/PERS.75.9.1127

How to cite: Schriever, A., Gwinner, K., Irmisch, P., Kraft, T., and Brauchle, J.: Experimental Study of Image Matching Techniques for Co-Registration and Data Fusion of Orbital and Ground-Based Planetary Images, EGU General Assembly 2025, Vienna, Austria, 27 Apr–2 May 2025, EGU25-16985, https://doi.org/10.5194/egusphere-egu25-16985, 2025.