Generating multi-temporal landslide inventories through a general deep transfer learning strategy using HR EO data
- ¹University of Padova, Geosciences, Padova, Italy (kushanav.bhuyan@studenti.unipd.it)
- ²Centre for Disaster Resilience, Department of Applied Earth Sciences, Faculty of Geo-Information Science and Earth Observation (ITC), University of Twente, Enschede, The Netherlands
Mapping landslides in space has gained considerable attention over the past decade, with good results. However, current methods are primarily used to generate event inventories; multi-temporal (MT) inventories remain rare, even with manual landslide mapping. Here, we present an innovative deep learning strategy employing transfer learning, which allows our Attention Deep Supervision multi-scale U-Net model to be adapted to landslide detection tasks in new regions. The approach also provides the flexibility to retrain a pretrained model to detect both rainfall- and earthquake-induced landslides in new regions of interest. For mapping, archived Planet Labs imagery from 2009 to 2021, at spatial resolutions of 3–5 m, was used to systematically generate MT landslide inventories. Across all cases, our approach achieved an average F1 score of 0.8, indicating that it successfully captured the spatiotemporal occurrence of landslides. To examine the size distribution of the mapped landslides, we compared the frequency–size distribution of predicted co-seismic landslides with manually mapped products from the literature. The results showed good agreement between the fitted power-law exponents, with differences ranging from 0.04 to 0.21. Overall, this study demonstrates that the proposed approach can be applied over large areas to construct polygon-based MT landslide inventories.
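The transfer-learning step can be illustrated with a minimal PyTorch sketch. A generic U-Net from `segmentation_models_pytorch` stands in for the authors' Attention Deep Supervision multi-scale U-Net (whose code is not part of this abstract), and the checkpoint path, data loader, and hyperparameters are assumptions: load weights pretrained on a source region, freeze the encoder, and fine-tune the decoder on labelled patches from the target region.

```python
import torch
from torch import nn, optim
import segmentation_models_pytorch as smp  # generic stand-in for the custom U-Net

# Hypothetical setup: 4-band input (e.g. Planet RGB + NIR), binary landslide mask.
model = smp.Unet(encoder_name="resnet34", in_channels=4, classes=1)
model.load_state_dict(torch.load("pretrained_source_region.pt"))  # assumed checkpoint

# Freeze the encoder so only the decoder adapts to the new region.
for p in model.encoder.parameters():
    p.requires_grad = False

criterion = nn.BCEWithLogitsLoss()  # landslide vs. background
optimizer = optim.Adam((p for p in model.parameters() if p.requires_grad), lr=1e-4)

model.train()
for epoch in range(20):
    for images, masks in target_region_loader:  # assumed DataLoader of labelled patches
        optimizer.zero_grad()
        loss = criterion(model(images), masks)
        loss.backward()
        optimizer.step()
```

Freezing the encoder is one common fine-tuning choice; the abstract does not specify which layers the authors retrain.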
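The frequency–size comparison can likewise be sketched: fit a power law to the landslide-area distributions of the predicted and manually mapped inventories and compare the exponents. The `powerlaw` package (Clauset-style maximum-likelihood fitting) is one plausible tool, not necessarily the authors' procedure, and the file names are illustrative.

```python
import numpy as np
import powerlaw  # MLE power-law fitting; an assumed choice of tool

def powerlaw_exponent(areas_m2):
    """Fit a power law to landslide planimetric areas and return its exponent."""
    fit = powerlaw.Fit(areas_m2)  # lower cutoff (rollover point) estimated automatically
    return fit.power_law.alpha

# Hypothetical inputs: polygon areas from the predicted MT inventory and
# from a manually mapped co-seismic inventory in the literature.
predicted_areas = np.loadtxt("predicted_landslide_areas.txt")
manual_areas = np.loadtxt("manual_landslide_areas.txt")

diff = abs(powerlaw_exponent(predicted_areas) - powerlaw_exponent(manual_areas))
print(f"Power-law exponent difference: {diff:.2f}")  # abstract reports 0.04-0.21 across cases
```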
How to cite: Bhuyan, K., Tanyas, H., Nava, L., Puliero, S., Meena, S. R., Floris, M., Westen, C. V., and Catani, F.: Generating multi-temporal landslide inventories through a general deep transfer learning strategy using HR EO data, EGU General Assembly 2023, Vienna, Austria, 24–28 Apr 2023, EGU23-6718, https://doi.org/10.5194/egusphere-egu23-6718, 2023.