EGU General Assembly 2021
© Author(s) 2021. This work is distributed under
the Creative Commons Attribution 4.0 License.

Call to action: Pushing scientific and technological innovation to develop an efficient AI flood mapper for operational SAR satellites

Guy J.-P. Schumann1,2, Laura Giustarini1, Moh Zare1, and Ben Gaffinet1
  • 1Research and Education Department (RED), RSS-Hydro, Dudelange, Luxembourg
  • 2School of Geographical Sciences, University of Bristol, Bristol, UK

There is no doubt that the devastating socio-economic impacts associated with floods have been increasing. According to the International Disaster Database (EM-DAT), floods are the most frequent weather-related disasters and the most impactful in terms of the number of people affected: nearly 1 billion people were affected by inundations in the last decade (2006–2015), while the overall economic damage is estimated at more than $300 billion. Despite this evidence, and despite awareness of the environmental role of rivers and their inundation, our capability to respond to and forecast floods remains relatively poor.

In this context, satellite sensors represent a highly valuable source of observation data that could fill many of the gaps, especially in remote areas and developing countries. In the last decade, with the proliferation of satellite data and the advent of ESA's operational Sentinel missions under the EC Copernicus open data programme, satellite images, in particular SAR, have been assisting flood disaster mitigation, response and recovery operations globally.

Although the number of state-of-the-art and innovative research studies in these areas is increasing, the full potential of remotely sensed data to enhance flood mapping has yet to be unlocked; in particular, the latency issue is not being sufficiently well addressed. Latency, i.e. the time from image acquisition to delivery of the flood map to the person who actually needs it, falls far short of disaster response requirements and is, to a large extent, responsible for the slow uptake of EO-based products, such as flood maps, into the operational timelines and disaster response protocols of potential user organizations, such as the UN World Food Programme.

We call for the development of a product prototype or concept. Specifically, a digital twin experiment should be developed first to generate a prototype AI-based algorithm that could be deployed onboard a SAR satellite to produce flood maps in real time. The mapping result, consisting of simple column/row (x/y) vector indices of flood edges in the form of a short “text message”, would be delivered to field response teams via satellite communication technology for use within minutes, rather than the many hours to days currently required.
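To make the proposed message format concrete, the following is a minimal sketch of how a binary flood mask produced onboard could be reduced to its edge pixels and serialized as row/col indices. The function name, the 4-neighbour edge definition, and the `"r,c;r,c;..."` serialization are our illustrative assumptions, not a specification from the abstract.

```python
import numpy as np

def flood_edge_message(mask: np.ndarray) -> str:
    """Encode the boundary pixels of a binary flood mask as a short text
    message of row/col indices (hypothetical format: "r,c;r,c;...")."""
    m = mask.astype(bool)
    # Pad with dry pixels so the scene border is handled uniformly.
    padded = np.pad(m, 1, constant_values=False)
    # A pixel is interior if all four 4-neighbours are also flooded.
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    # Edge pixels: flooded but not interior.
    rows, cols = np.nonzero(m & ~interior)
    return ";".join(f"{r},{c}" for r, c in zip(rows, cols))

# Example: a 4x4 scene with a 2x2 flooded block; all four pixels
# lie on the flood edge.
mask = np.zeros((4, 4), dtype=int)
mask[1:3, 1:3] = 1
msg = flood_edge_message(mask)  # "1,1;1,2;2,1;2,2"
```

Transmitting only edge indices rather than a full raster is what keeps the downlink payload small enough for a satellite-communication text message; a production format would likely add compression and georeferencing metadata.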

In this paper, we illustrate the concept of the proposed innovation, including the future possibility of in-orbit processing. This is in part a synthetic, proof-of-concept study; although the societal impact and value of the service prototype developed is clear, its economic value, market share and market value can only be established once the service has been successfully demonstrated. At this point in time, no service of this type exists. Onboard AI-based processing has been trialed for optical/hyperspectral sensors on CubeSats, but not for SAR, and not including a rapid flood map delivery service using AI. For instance, based on the results of an ESA-supported FDL Europe challenge, Mateo-Garcia et al. (2019) demonstrated the application of a fully convolutional neural network to prototype a GPU-based onboard flood segmentation system using degraded Sentinel-2 imagery (to mimic CubeSat capability).

How to cite: Schumann, G. J.-P., Giustarini, L., Zare, M., and Gaffinet, B.: Call to action: Pushing scientific and technological innovation to develop an efficient AI flood mapper for operational SAR satellites, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-5943, 2021.
