EGU23-16793, updated on 22 Nov 2023
https://doi.org/10.5194/egusphere-egu23-16793
EGU General Assembly 2023
© Author(s) 2023. This work is distributed under
the Creative Commons Attribution 4.0 License.

Flood rapid mapping for immediate response: from semi-automatized delineation to AI-derived estimations

Sébastien Delbour, Christophe Fatras, and Vera Gastal
  • CLS, Villeneuve-d'Ascq, France (sdelbour@groupcls.com)

Within the framework of the Copernicus Emergency Management Service - Rapid Mapping, reliable flood maps must be delivered to users within six hours of the availability of remote sensing data. This data can come from either optical or SAR sensors, which present different properties (wavelengths, band availability, resolution, etc.). Production is currently performed with semi-automatic methods and processes to avoid misclassifications and provide flood maps that are as accurate as possible. To improve service delivery performance, including coverage of very large areas, an accurate, automatically produced first guess is needed, which can then be refined manually if necessary. This is the main reason why the use of AI to learn and detect flooded areas is explored here for both optical and SAR data. The FloodML project combined a random forest approach with an in-house learning database to derive flood maps from both optical and SAR datasets. It showed good results and can automatically cover a 10,000 km² area in only a few minutes. The success of this first approach led to the FloodDAM and FloodDAM-DT projects. These follow-ons now focus on detecting water-level irregularities at local river gauges, producing flood maps when needed, and potentially modelling the evolution of the flood event through data assimilation.
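As a rough illustration of the random-forest approach mentioned above, the sketch below trains a pixel-wise classifier on per-pixel features and predicts a binary flood mask. The feature choices, variable names, and use of scikit-learn are assumptions for illustration; the abstract does not detail the actual FloodML pipeline or its learning database.

```python
# Minimal sketch of pixel-wise flood classification with a random forest.
# Assumptions (not from the abstract): scikit-learn is used, and pixels are
# described by per-band features (e.g. SAR backscatter, optical reflectances)
# with a labelled flood / non-flood training database.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def train_flood_classifier(features, labels):
    """features: (n_pixels, n_bands) array; labels: (n_pixels,) 0/1 flood mask."""
    clf = RandomForestClassifier(n_estimators=200, n_jobs=-1, random_state=0)
    clf.fit(features, labels)
    return clf

def predict_flood_mask(clf, image):
    """image: (height, width, n_bands) stack; returns (height, width) 0/1 mask."""
    h, w, b = image.shape
    mask = clf.predict(image.reshape(-1, b))
    return mask.reshape(h, w)

# Illustrative usage with synthetic data:
rng = np.random.default_rng(0)
X_train = rng.normal(size=(1000, 4))          # hypothetical 4-band features
y_train = (X_train[:, 0] < -0.5).astype(int)  # toy "flooded" label
clf = train_flood_classifier(X_train, y_train)
flood_mask = predict_flood_mask(clf, rng.normal(size=(64, 64, 4)))
```

Because prediction is independent per pixel, such a classifier parallelizes trivially over image tiles, which is consistent with covering large areas in minutes.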
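The gauge-monitoring step of the follow-on projects could, under assumptions, be sketched as a simple statistical test of recent water-level readings against a historical baseline; the actual FloodDAM trigger logic is not specified in the abstract, and the threshold, window, and function names below are hypothetical.

```python
# Hypothetical sketch of a water-level irregularity trigger: flag a gauge
# when recent readings exceed the historical mean by more than k standard
# deviations. All parameters are illustrative choices, not FloodDAM's.
import numpy as np

def gauge_alert(history, recent, k=3.0):
    """history: 1-D array of past water levels (m); recent: latest readings.
    Returns True when the recent mean is anomalously high."""
    mu, sigma = history.mean(), history.std()
    return recent.mean() > mu + k * sigma

rng = np.random.default_rng(1)
history = rng.normal(2.0, 0.3, size=365)  # a year of daily levels (toy data)
recent = np.array([3.4, 3.6, 3.8])        # sudden rise at the gauge
if gauge_alert(history, recent):
    print("Irregular water level detected: trigger flood mapping")
```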

How to cite: Delbour, S., Fatras, C., and Gastal, V.: Flood rapid mapping for immediate response: from semi-automatized delineation to AI-derived estimations, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16793, https://doi.org/10.5194/egusphere-egu23-16793, 2023.