EGU2020-31
https://doi.org/10.5194/egusphere-egu2020-31
EGU General Assembly 2020
© Author(s) 2020. This work is distributed under
the Creative Commons Attribution 4.0 License.

Synergetic use of Planet data and high-resolution aerial images for windthrow detection based on Deep Learning

Melanie Brandmeier1, Wolfgang Deigele1,2, Zayd Hamdi1,2, and Christoph Straub3
  • 1Esri Deutschland GmbH, Science & Education, Kranzberg, Germany (melanie.brandmeier@gmx.de)
  • 2Technical University Munich
  • 3LWF Freising

Due to climate change, the number of storms and, thus, the extent of forest damage have increased over recent years. The state of the art in damage detection is manual digitization based on aerial images, which requires a great amount of work and time. There have been numerous attempts to automate this process in the past, such as change detection based on SAR and optical data or the comparison of Digital Surface Models (DSMs) to detect changes in mean forest height. By using Convolutional Neural Networks (CNNs) in conjunction with GIS, we aim to completely streamline the detection and mapping process.

We developed and tested different CNNs for rapid windthrow detection based on Planet data, which is available shortly after a storm event, and on airborne data to increase accuracy after this first assessment. The study area lies in Bavaria (ca. 165 square km), and the data were provided by the Bavarian forestry agency (LWF). A U-Net architecture was compared to other approaches using transfer learning (e.g. VGG32) to find the most performant architecture for the task on both datasets. U-Net was originally developed for medical image segmentation and has proven to be very powerful for other classification tasks.
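The U-Net approach described above can be sketched as a small encoder–decoder with a skip connection. The code below is a minimal illustrative sketch, not the authors' actual configuration: the use of PyTorch, the single down/up level, the layer widths, the 4-band input (e.g. a Planet RGB+NIR tile), and the single-logit output head ("damaged" vs. "intact") are all assumptions made for the example.

```python
import torch
import torch.nn as nn

class MiniUNet(nn.Module):
    """Minimal one-level U-Net sketch for binary windthrow segmentation.

    Illustrative only: channel counts and depth are assumptions, not the
    configuration used in the study.
    """
    def __init__(self, in_ch=4, base=16):
        super().__init__()
        # Encoder: two 3x3 convolutions at full resolution
        self.enc = nn.Sequential(
            nn.Conv2d(in_ch, base, 3, padding=1), nn.ReLU(),
            nn.Conv2d(base, base, 3, padding=1), nn.ReLU())
        self.pool = nn.MaxPool2d(2)
        # Bottleneck at half resolution
        self.bottom = nn.Sequential(
            nn.Conv2d(base, base * 2, 3, padding=1), nn.ReLU())
        # Learned upsampling back to full resolution
        self.up = nn.ConvTranspose2d(base * 2, base, 2, stride=2)
        # Decoder consumes the concatenated skip + upsampled features
        self.dec = nn.Sequential(
            nn.Conv2d(base * 2, base, 3, padding=1), nn.ReLU(),
            nn.Conv2d(base, 1, 1))  # one logit per pixel: damaged vs. intact

    def forward(self, x):
        e = self.enc(x)
        b = self.bottom(self.pool(e))
        u = self.up(b)
        # Skip connection: concatenate encoder features with upsampled ones
        return self.dec(torch.cat([e, u], dim=1))

model = MiniUNet()
logits = model(torch.randn(1, 4, 64, 64))  # one 4-band 64x64 tile
print(tuple(logits.shape))  # (1, 1, 64, 64): a per-pixel damage logit map
```

A full U-Net stacks several such down/up levels; the skip connections are what preserve the fine spatial detail needed to delineate damaged stands at pixel level.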

Preliminary results highlight the potential of Deep Learning algorithms to detect damaged areas, with accuracies of over 91% on airborne data and 92% on Planet data. The proposed workflow, with complete integration into ArcGIS, is well suited for a rapid first assessment after a storm event, allowing better planning of the flight campaign and initial management tasks, followed by detailed mapping in a second stage.

How to cite: Brandmeier, M., Deigele, W., Hamdi, Z., and Straub, C.: Synergetic use of Planet data and high-resolution aerial images for windthrow detection based on Deep Learning, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-31, https://doi.org/10.5194/egusphere-egu2020-31, 2019
