EGU24-16528, updated on 09 Mar 2024
https://doi.org/10.5194/egusphere-egu24-16528
EGU General Assembly 2024
© Author(s) 2024. This work is distributed under
the Creative Commons Attribution 4.0 License.

Urban Flood Extent Extraction and Type Recognition Based on Multiple Convolutional Neural Networks

Ziming Wang1 and Ce Zhang1,2
  • 1School of Geographical Sciences, University of Bristol, Bristol BS8 1SS, UK
  • 2UK Centre for Ecology & Hydrology, Library Avenue, Lancaster LA1 4AP, UK

As global extreme weather events become more frequent and urbanization accelerates, the probability of urban flooding has increased significantly, posing a grave threat to both property and lives. Creating accurate flood maps is a critical component of effective emergency management for urban floods. However, current research focuses primarily on urban flood extent, with little consideration given to flood type. Different types of floods often carry different water components and sediments, so identifying the flood type during mapping is necessary to provide targeted relief. This paper proposes a method using multiple Convolutional Neural Networks (CNNs), combining U-Net and ResNet architectures, for urban flood extent extraction and type classification. The proposed method achieved 97.1% accuracy in flood extent extraction and 91% accuracy in flood type classification, demonstrating its effectiveness for urban flood mapping. Furthermore, the method was validated on a global dataset covering 20 countries across six continents, with samples of diverse dimensions and geographical features, showcasing the model's robustness and practicality across regions.
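The abstract describes a two-stage pipeline: a segmentation network (U-Net-style) extracts the flood extent, and a classification network (ResNet-style) assigns a flood type to the detected region. The sketch below illustrates that pipeline shape only; the placeholder functions, the thresholding and mean-intensity rules, and the `FLOOD_TYPES` label set are all hypothetical stand-ins for trained models, not the authors' implementation.

```python
# Minimal sketch of a two-stage urban flood mapping pipeline:
# stage 1 segments flood extent, stage 2 classifies flood type.
# All logic here is a hypothetical placeholder for trained CNNs.

def segment_flood_extent(image):
    """Stage 1: return a binary flood mask.
    A simple threshold stands in for a trained U-Net."""
    return [[1 if px > 0.5 else 0 for px in row] for row in image]

# Assumed label set for illustration; the abstract does not name the types.
FLOOD_TYPES = ["pluvial", "fluvial", "coastal"]

def classify_flood_type(image, mask):
    """Stage 2: return a flood-type label for the masked pixels.
    A mean-intensity rule stands in for a trained ResNet."""
    flooded = [px for i, row in enumerate(image)
               for j, px in enumerate(row) if mask[i][j]]
    if not flooded:
        return None  # no flood detected in stage 1
    mean = sum(flooded) / len(flooded)
    # Hypothetical decision rule: bin mean intensity into a type index.
    idx = min(int(mean * len(FLOOD_TYPES)), len(FLOOD_TYPES) - 1)
    return FLOOD_TYPES[idx]

def map_urban_flood(image):
    """Run both stages and return (extent mask, flood type)."""
    mask = segment_flood_extent(image)
    return mask, classify_flood_type(image, mask)
```

Chaining the two networks this way lets the classifier condition only on pixels the segmenter marks as flooded, which matches the extent-then-type order described above.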

Keywords: Urban flood mapping, Flood type, Deep learning, CNN, Classification

How to cite: Wang, Z. and Zhang, C.: Urban Flood Extent Extraction and Type Recognition Based on Multiple Convolutional Neural Networks, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-16528, https://doi.org/10.5194/egusphere-egu24-16528, 2024.