EGU26-19863, updated on 14 Mar 2026
https://doi.org/10.5194/egusphere-egu26-19863
EGU General Assembly 2026
© Author(s) 2026. This work is distributed under
the Creative Commons Attribution 4.0 License.
Oral | Thursday, 07 May, 16:40–16:50 (CEST)
 
Room N1
Deep learning-based tree mortality detection using drone imagery and canopy height models in Northern Lapland 
Katalin Waga1, Parvez Rana1, Mikko Kukkonen1, Timo Kumpula2, and Anton Kuzmin2
  • 1LUKE - Natural Resources Institute Finland, Finland (katalin.waga@luke.fi)
  • 2University of Eastern Finland, Joensuu, Finland

Dead trees are key indicators of biodiversity and forest health, recognized by both the UN Convention on Biological Diversity and the EU Biodiversity Strategy. Member states are required to monitor these indicators regularly, making accurate tree mortality detection essential. We utilized remote sensing data combined with AI-driven image analysis to improve dead tree detection and forest mortality mapping. In remote regions such as Lapland, where rapid changes caused by snow damage, windthrow, drought, or pest outbreaks occur, traditional inventories are often time-consuming and difficult.

Our study area of 208 ha is located next to Pallasjärvi in Northern Lapland, Finland. The area is dominated by Norway spruce (Picea abies), but Scots pine (Pinus sylvestris), silver birch (Betula pendula) and downy birch (Betula pubescens) are also present. In the training dataset we recorded 4380 tree segments, including 142 dead trees, and delineated their canopies by visual interpretation of Altum multispectral drone imagery. We evaluated the integration of LiDAR-derived Canopy Height Models (CHM) with multispectral imagery for classifying living and dead standing trees. The training dataset consisted of 411 image tiles (256×256 pixels) with a pixel size of 5.2 cm, captured by drone in July 2023 using an Altum sensor. The CHM was interpolated at 1 m resolution from the low-pulse-density LiDAR data available from the National Land Survey of Finland. The models' performance was assessed using 10-fold cross-validation.
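
For illustration only, the following minimal Python sketch (not the study's actual code) shows one way a CHM band could be stacked onto a multispectral tile as an extra input channel before training. The band count, file paths, scaling, and use of the rasterio library are assumptions, and the CHM is assumed to have been resampled already to the 5.2 cm tile grid.

import numpy as np
import rasterio

def load_tile_with_chm(tile_path, chm_path):
    # Read the multispectral tile (assumed 5 Altum bands, 256x256 pixels).
    with rasterio.open(tile_path) as src:
        ms = src.read().astype(np.float32)        # shape (5, 256, 256)
    # Read the CHM, assumed already resampled to the same 256x256 grid.
    with rasterio.open(chm_path) as src:
        chm = src.read(1).astype(np.float32)      # shape (256, 256), metres
    # Scale bands so reflectance and canopy height share a comparable range.
    ms /= ms.max(axis=(1, 2), keepdims=True) + 1e-6
    chm /= chm.max() + 1e-6
    # Stack the CHM as an additional input channel: (6, 256, 256).
    return np.concatenate([ms, chm[None, ...]], axis=0)

# Hypothetical file paths, used only to show the expected shapes.
x = load_tile_with_chm("tiles/tile_0001.tif", "tiles/tile_0001_chm.tif")
print(x.shape)  # (6, 256, 256)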

We applied pixel-level semantic segmentation using the U-Net deep learning architecture to classify each pixel of the images as living tree, dead standing tree, or background (e.g. fallen dead trees and non-trees). The Basic model, using only multispectral imagery, achieved F1-scores of 0.33–0.44 for dead trees in different areas and up to 0.83 for living trees. Incorporating the CHM improved dead tree detection by over 56%, yielding F1-scores of 0.56–0.71 for dead trees and 0.96 for living trees. Visual assessment confirmed that incorporating the CHM improved crown delineation by producing more precise crown edges and enhanced the classification of standing deadwood by reducing misclassification of fallen deadwood. The resulting three-class map provides valuable data for quantitative measurements of deadwood, including the total land area covered and the percentage of dead crown area.
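
To illustrate the reported metric, the short Python sketch below (not the authors' evaluation code) computes per-class F1-scores for a three-class segmentation map. The class coding (0 = background, 1 = living tree, 2 = dead standing tree) and the random label maps are assumptions made only for the example.

import numpy as np

def per_class_f1(y_true, y_pred, n_classes=3):
    # F1 = 2*TP / (2*TP + FP + FN), computed over all pixels for each class.
    scores = {}
    for c in range(n_classes):
        tp = np.sum((y_pred == c) & (y_true == c))
        fp = np.sum((y_pred == c) & (y_true != c))
        fn = np.sum((y_pred != c) & (y_true == c))
        denom = 2 * tp + fp + fn
        scores[c] = 2 * tp / denom if denom > 0 else 0.0
    return scores

# Hypothetical reference and predicted label maps for one 256x256 tile.
rng = np.random.default_rng(0)
reference = rng.integers(0, 3, size=(256, 256))
predicted = rng.integers(0, 3, size=(256, 256))
print(per_class_f1(reference, predicted))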

Our current workflow relies on drone imagery and LiDAR data; however, future scalability through satellite data could enable large-scale, cost-effective monitoring beyond the Arctic region. By incorporating readily available canopy height models as an additional input, we enhance tree classification accuracy and improve the detection of standing and fallen deadwood. Furthermore, multiclass classification enables more precise tracking of tree mortality than the binary classification used in most studies, as the classification could be extended to include, for example, dying trees in the future. This quantitative measurement supports forest conservation and biodiversity monitoring efforts and could provide a remote sensing-based estimate of deadwood volume for forest inventory.

How to cite: Waga, K., Rana, P., Kukkonen, M., Kumpula, T., and Kuzmin, A.: Deep learning-based tree mortality detection using drone imagery and canopy height models in Northern Lapland, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-19863, https://doi.org/10.5194/egusphere-egu26-19863, 2026.