MT7

Planetary Robotics and Vision Processing
Co-Conveners: G. Paar, N. Schmitz, K. Willner
Oral Program / Thu, 06 Oct, 13:30–15:00 / Mars Room
Poster Program / Attendance Thu, 06 Oct, 17:30–19:00 / Poster Area

The theme for this session is: "Planetary Robotics and Vision Processing for Future Planetary Exploration". With the success of the NASA Mars Exploration Rovers and Phoenix Lander and several planetary exploration missions either being developed or planned for the future, this is an exciting and challenging time for Europe as it embarks upon its own plans and aspirations for planetary exploration.

A planetary robot can be regarded as an integral part of the 'planetary science apparatus', both as an instrument in its own right (e.g. using wheel motion and soil mechanics for physical investigations) and as a deployment device for instruments and for surface/sub-surface sample acquisition. A major key to science sample selection is vision processing. Planetary rover imaging instruments, such as panoramic and high-resolution cameras, and the successful on-board and ground-based processing of their data are essential if mission science targets are to be realised.

The session will focus on the advances required to address challenges such as: autonomous localisation and navigation; real-time characterisation of terrain and obstacles; autonomous monitoring of, and response to, system health and safety; robustness and the ability to function in the presence of faults or anomalous and unexpected conditions; autonomous and ground-based camera image processing; and a shift from a human directing the minute-to-minute surface operations activities to the planetary robot performing this direction autonomously.

Oral presentations and posters are solicited that present current and future research into planetary robotics and vision processing for planetary exploration. Papers will be especially welcome in the following areas:
- planetary robotics and vision processing mission experiences;
- novel sensors and imaging devices;
- in-flight and on-surface robotic and imaging instrument calibration methods;
- vision data fusion;
- planetary environment and terrain modelling techniques;
- simulation and image data visualisation methods;
- localisation and navigation;
- autonomous control and vision processing architectures;
- autonomous sample acquisition;
- novel locomotion methods, including aerobots and submersible and sub-surface robots;
- analogue field trials addressing the above-mentioned aspects.