- 1Chair for Sensor-based Geoinformatics (geosense), University of Freiburg, Freiburg, Germany
- 2Institute for Earth System Science and Remote Sensing, Leipzig University, Germany
- 3Department of Geosciences and Natural Resource Management, University of Copenhagen, Denmark
- 4Institute for Forest Protection, Julius Kühn Institute (JKI) - Federal Research Centre for Cultivated Plants, Germany
- 5School of Forest Sciences, University of Eastern Finland, Finland
- 6Department of Environmental Systems Sciences, ETH Zurich, Switzerland
Tree mortality rates are rising across many regions of the world. These increases are driven by a complex interplay of abiotic and biotic factors, including global warming, climate extremes, pests, pathogens, and other environmental stressors. Despite the urgency of understanding these dynamics, critical gaps remain in our ability to determine where trees are dying, why they are dying, and to predict future mortality hotspots. These knowledge gaps are primarily caused by missing data on tree mortality events. Ground-based observations, such as national forest inventories, are often sparse, inconsistent, and lack the spatial precision needed for comprehensive analysis. Satellite observations, by contrast, offer high temporal resolution, but their spatial resolution is often too coarse to identify individual trees. Earth observation approaches that combine drone and satellite imagery with machine learning therefore offer a promising avenue for mapping standing dead trees and uncovering the underlying drivers of tree mortality.
Here we introduce deadtrees.earth, an initiative for multi-scale remote sensing of tree mortality. At its core, deadtrees.earth curates the largest archive of centimeter-scale RGB aerial imagery of forests, with over 2,000 orthoimages representing diverse forest biomes and major forest types across continents. Using extensive annotations of dead canopies, we develop computer vision models capable of automated semantic and instance segmentation of dead tree canopies in RGB orthoimages. These model variants are robust across varying resolutions, biomes, and forest types, and can be applied to any orthoimagery submitted to the platform, enabling users to exploit these tools for their own analyses.
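To illustrate what instance segmentation of dead canopies ultimately yields, the sketch below is a deliberately simplified stand-in: it does not reproduce the platform's deep segmentation networks, but shows the kind of post-processing step that turns a per-pixel dead-canopy mask into individual crown instances by labeling connected components. The `label_crowns` helper and the toy mask are hypothetical.

```python
from collections import deque

def label_crowns(mask):
    """Label 4-connected components of a binary dead-canopy mask.

    mask: 2D list of 0/1 values (1 = pixel classified as dead canopy).
    Returns a 2D list of integer labels (0 = background, 1..n = crown
    instances) and the number of crowns found.
    """
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    n = 0
    for i in range(h):
        for j in range(w):
            if mask[i][j] == 1 and labels[i][j] == 0:
                # New crown: flood-fill its connected pixels via BFS.
                n += 1
                labels[i][j] = n
                queue = deque([(i, j)])
                while queue:
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny][nx] == 1
                                and labels[ny][nx] == 0):
                            labels[ny][nx] = n
                            queue.append((ny, nx))
    return labels, n

# Toy 4x5 mask containing two separate dead crowns.
mask = [
    [1, 1, 0, 0, 0],
    [1, 0, 0, 1, 1],
    [0, 0, 0, 1, 0],
    [0, 0, 0, 0, 0],
]
labels, n_crowns = label_crowns(mask)
print(n_crowns)  # → 2
```

In practice this grouping is done by the segmentation network itself (or by standard tools such as `scipy.ndimage.label`); the sketch only makes the mask-to-instances idea concrete.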
These local-scale predictions derived from drone and airplane imagery form the foundation for training satellite-based AI models to monitor tree mortality and forest cover on a global scale. We showcase recent advancements in spatiotemporal transformer models utilizing Sentinel-1 and Sentinel-2 data to produce global-scale, annual maps of forest cover and standing deadwood fractions at 10-meter resolution.
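The link between the two scales can be made concrete with a small sketch: a centimeter-scale binary dead-canopy mask (as produced from drone or airplane imagery) is block-averaged into fractional targets on a coarser satellite grid, which is the form of label a 10-meter Sentinel-based model would be trained against. This is a minimal illustration under stated assumptions (a mask already aligned to the coarse grid, dimensions divisible by the block size); the `deadwood_fraction` helper is hypothetical.

```python
def deadwood_fraction(mask, block):
    """Aggregate a high-resolution binary dead-canopy mask (1 = dead)
    into coarse-pixel deadwood fractions by block averaging.

    mask: 2D list of 0/1 values whose dimensions are multiples of `block`.
    block: number of fine pixels per coarse-pixel side, e.g. 100 when
           10 cm imagery is aggregated to a 10 m satellite grid.
    """
    h, w = len(mask), len(mask[0])
    fractions = []
    for i in range(0, h, block):
        row = []
        for j in range(0, w, block):
            # Fraction of fine pixels flagged as dead within this block.
            s = sum(mask[y][x]
                    for y in range(i, i + block)
                    for x in range(j, j + block))
            row.append(s / (block * block))
        fractions.append(row)
    return fractions

# 4x4 toy mask aggregated with block=2: top-left block is fully dead.
mask = [
    [1, 1, 0, 0],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
    [0, 0, 0, 0],
]
print(deadwood_fraction(mask, 2))  # → [[1.0, 0.25], [0.0, 0.0]]
```

The resulting fractional maps are continuous targets, which is why the satellite products are expressed as standing deadwood fractions rather than per-tree detections at 10-meter resolution.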
A recently added key feature of the deadtrees.earth platform is its set of web-based annotation tools, which allow users to contribute additional training data or provide feedback on existing predictions. This crowdsourcing functionality promotes community engagement, facilitates continuous improvement, and fosters trust in the aerial- and satellite-based models and products.
Future work includes expanding the coverage of aerial imagery, particularly in underrepresented regions such as Asia and Africa; this remains a central priority for ensuring the inclusivity and representativeness of the platform’s global-scale analyses. Moreover, we aim to apply the data products in a range of use cases, from attribution and forecasting of mortality to calibrating mortality in dynamic vegetation models. By bridging local and global scales, this work offers a critical tool for monitoring forest mortality trends, contributing to climate change impact assessments, and enhancing predictive capabilities for ecosystem resilience.
How to cite: Kattenborn, T., Mosig, C., Vajna-Jehle, J., Cheng, Y., Hartmann, H., Montero, D., Juntilla, S., Horion, S., Beloiu Schwenke, M., and Mahecha, M.: deadtrees.earth: tree mortality monitoring from local to global scales with AI and remote sensing, EGU General Assembly 2025, Vienna, Austria, 27 Apr–2 May 2025, EGU25-19718, https://doi.org/10.5194/egusphere-egu25-19718, 2025.