EGU2020-17917
https://doi.org/10.5194/egusphere-egu2020-17917
EGU General Assembly 2020
© Author(s) 2020. This work is distributed under
the Creative Commons Attribution 4.0 License.

Insect Damaged Tree Detection with Drone Data and Deep Learning Technique, Case Study: Abies Mariesii Forest, Zao Mountain, Japan

Nguyen Ha Trang1, Yago Diez2, and Larry Lopez1
  • 1Graduate School of Agricultural Science, Yamagata University, Tsuruoka, Japan
  • 2Faculty of Science, Yamagata University, Yamagata, Japan

An outbreak of fir bark beetles (Polygraphus proximus Blandford) in the natural Abies mariesii forest on Zao Mountain was reported in 2016. With the recent development of deep learning and drones, it has become possible to automatically detect trees in both man-made and natural forests, including damaged trees. However, several challenges remain in using deep learning and drones for sick-tree detection in mountainous areas, which we want to address: (i) mixed forest structure with overlapping canopies, (ii) heterogeneous distribution of species across sites, (iii) steep slopes and (iv) highly variable mountain climate conditions.

The current work can be summarized in three stages: data collection, data preparation and data processing. All data were collected with a DJI Mavic 2 Pro at a flying height of 60-70 m above the take-off point, with ground sampling distances (GSD) ranging from 1.23 cm to 2.54 cm depending on the slope of the site. To prepare the data for processing with a Convolutional Neural Network (CNN), all images were stitched together using Agisoft Metashape to create orthomosaics of the five study sites; each site has a different proportion of fir trees according to elevation. We then manually annotated all mosaics in GIMP, categorizing the forest cover into six classes: dead fir, sick fir, healthy fir, deciduous trees, grass and uncovered (pathways, buildings and soil). Our algorithm automatically divides the mosaics into small patches carrying the assigned categories, with a first-trial window size of 200 × 200 pixels, which we tentatively observe is large enough to cover a medium-sized fir tree. We will also try different window sizes and evaluate how this parameter affects the results. The resulting patches are finally used as input to a CNN architecture to detect the damaged trees.

The work is still ongoing, and we expect to achieve high classification accuracy, allowing us to build maps of the health status of all fir trees.
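For reference, the flying-height/GSD relation can be sketched as below. The camera constants are the published DJI Mavic 2 Pro (Hasselblad L1D-20c) specifications, which we assume here since the abstract does not state them; on a slope the camera-to-terrain distance differs from the height above the take-off point, which is why the effective GSD spreads across the reported 1.23-2.54 cm range.

```python
# Nadir ground sampling distance (GSD): footprint of one sensor pixel on the
# ground. Camera values are nominal DJI Mavic 2 Pro specifications (assumed).

SENSOR_WIDTH_MM = 13.2    # 1-inch CMOS sensor width
IMAGE_WIDTH_PX = 5472     # image width in pixels
FOCAL_LENGTH_MM = 10.26   # real (not 35 mm equivalent) focal length

def gsd_cm(camera_to_terrain_m: float) -> float:
    """GSD in cm/pixel at a given camera-to-terrain distance."""
    gsd_mm = (SENSOR_WIDTH_MM * camera_to_terrain_m * 1000.0) / (
        FOCAL_LENGTH_MM * IMAGE_WIDTH_PX
    )
    return gsd_mm / 10.0

# Terrain rising toward or falling away from the drone changes the distance,
# and with it the GSD, even at a fixed flying height above the take-off point:
for h in (50, 60, 70, 100):
    print(f"{h:>4} m -> {gsd_cm(h):.2f} cm/px")
```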
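The patch-division step could look roughly like the sketch below. This is our illustration only, assuming a single-channel label image exported from GIMP with pixel values 0-5 encoding the six classes; the majority-label rule and all names here are assumptions, not the authors' published algorithm.

```python
# Minimal sketch: tile an orthomosaic into fixed-size patches and assign each
# patch the most frequent annotated class inside its window (assumed rule).
import numpy as np
from PIL import Image

Image.MAX_IMAGE_PIXELS = None  # orthomosaics exceed Pillow's default size guard

CLASSES = ["dead_fir", "sick_fir", "healthy_fir", "deciduous", "grass", "uncovered"]

def extract_patches(mosaic_path: str, label_path: str, window: int = 200):
    mosaic = np.asarray(Image.open(mosaic_path))        # RGB orthomosaic
    labels = np.asarray(Image.open(label_path))         # same-size label image, values 0..5
    patches, targets = [], []
    height, width = labels.shape[:2]
    for y in range(0, height - window + 1, window):
        for x in range(0, width - window + 1, window):
            patches.append(mosaic[y:y + window, x:x + window])
            win_labels = labels[y:y + window, x:x + window]
            # majority vote over the annotated pixels in this window
            targets.append(np.bincount(win_labels.ravel(), minlength=6).argmax())
    return np.stack(patches), np.array(targets)
```

Making `window` a parameter matches the stated plan to evaluate how window size affects results.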
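The abstract does not name a CNN architecture, so the following is only a generic small classifier for 200 × 200 RGB patches and six cover classes, written in PyTorch as one plausible baseline rather than the authors' actual network.

```python
# A small CNN baseline for six-class patch classification (assumed design).
import torch
import torch.nn as nn

class PatchCNN(nn.Module):
    def __init__(self, n_classes: int = 6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 200 -> 100
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 100 -> 50
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2), # 50 -> 25
        )
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),      # pools any spatial size to 1x1
            nn.Flatten(),
            nn.Linear(128, n_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x))

model = PatchCNN()
logits = model(torch.randn(4, 3, 200, 200))  # a batch of four patches
print(logits.shape)                          # torch.Size([4, 6])
```

The global average pooling head keeps the network independent of the input window size, which is convenient when comparing the planned alternative window sizes without changing the architecture.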


Keywords: Deep learning, CNN, drones, UAVs, tree detection, sick trees, insect damaged trees, forest


How to cite: Ha Trang, N., Diez, Y., and Lopez, L.: Insect Damaged Tree Detection with Drone Data and Deep Learning Technique, Case Study: Abies Mariesii Forest, Zao Mountain, Japan, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-17917, https://doi.org/10.5194/egusphere-egu2020-17917, 2020