EGU25-16456, updated on 15 Mar 2025
https://doi.org/10.5194/egusphere-egu25-16456
EGU General Assembly 2025
© Author(s) 2025. This work is distributed under
the Creative Commons Attribution 4.0 License.
Poster | Monday, 28 Apr, 10:45–12:30 (CEST), Display time Monday, 28 Apr, 08:30–12:30
 
Hall X1, X1.24
Automatic Vegetation Mapping in Peatlands Using Drone Imagery and Ecologically Informed Machine Learning
Mario Trouillier1, Daniel L. Pönisch1, Timothy J. Husting1, Henriette Rossa2, Milan Bergheim1, John Couwenberg2, and Gerald Jurasinski2
  • 1Fraunhofer IGD Rostock, Machine Learning and Vision, Rostock, Germany
  • 2University of Greifswald, partner in the Greifswald Mire Centre, Institute of Botany and Landscape Ecology, Greifswald, Germany

Drained peatlands emit vast amounts of greenhouse gases (GHGs), contributing around 4 % of global GHG emissions. Rewetting peatlands therefore has the potential to lower global emissions significantly. Restoring drained peatlands to protect the peat body, re-establish species-rich peat-forming plant communities, and reduce GHG emissions is a long-term process. Monitoring the rewetting and restoration of peatlands to quantify (avoided) emissions typically requires either direct measurements via eddy covariance or closed-chamber methods, or vegetation maps, since vegetation can serve as a proxy for water-table depths and the GHG emissions of peatlands. Both monitoring approaches are expensive and time-consuming, which often makes them unsuitable for monitoring large, heterogeneous peatlands over many years. Therefore, new concepts for scalable and efficient monitoring of peatlands are needed.

This research is part of a project that aims to develop an efficient monitoring system for peatlands that scales to hundreds of hectares. Our aim is to identify plant species in (degraded) peatlands using high-resolution drone imagery and a machine learning framework informed by ecological principles. In Northern Germany we acquired RGB (1 cm/px) and multispectral (2 cm/px) imagery using a DJI Mavic 3M quadcopter and processed these data into orthomosaics using WebODM. Additionally, using the point cloud from the photogrammetry process, we derived raster maps of the digital surface model (DSM), standard deviation (a proxy for plant height), and skewness (a proxy for foliage height distribution). In addition to the raster inputs, we used temperature sums (instead of date) and cloud cover percentage as model inputs to account for plant phenology and diverse lighting conditions. Our selection of spectral bands, point-cloud-derived raster maps, and metadata such as temperature sums constitutes ecologically informed epistemic priors that aim to increase model accuracy. Ground truth data (vegetation maps) were generated by mapping the vegetation with an Emlid Reach RS3 Differential Global Positioning System. Since multiple plants can occur together in the same patch, this is a multi-class, multi-label problem from a machine learning perspective. Thus, we used one-hot encoding to create 3D labels (height × width × species ID).
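The 3D label construction described above can be sketched as follows. This is a minimal illustration, not the authors' actual pipeline: the species IDs, patch size, and per-pixel species lists are hypothetical, and only the multi-hot encoding idea (height × width × species channels, with several channels active per pixel) is taken from the abstract.

```python
import numpy as np

# Hypothetical setup: a 2x2-pixel patch where each pixel may contain
# several species at once (multi-label). Species IDs are illustrative,
# e.g. 0=Sphagnum, 1=Carex, 2=Phragmites, 3=Eriophorum.
n_species = 4
height, width = 2, 2

# Per-pixel lists of species present (from hypothetical ground-truth mapping)
patch = [
    [[0], [0, 1]],
    [[2], [1, 3]],
]

# Build the 3D label tensor: height x width x species ID
labels = np.zeros((height, width, n_species), dtype=np.uint8)
for y in range(height):
    for x in range(width):
        for sp in patch[y][x]:
            labels[y, x, sp] = 1  # multiple channels may be set per pixel

print(labels.shape)   # (2, 2, 4)
print(labels[0, 1])   # pixel containing two species -> [1 1 0 0]
```

Unlike single-label one-hot encoding, each pixel's channel vector may contain several ones, so a per-channel sigmoid (rather than a softmax over species) would be the natural output layer for a model trained on such labels.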

Our preliminary results show that the accuracy of machine learning models can be improved by providing the models with ecologically informed priors such as plant heights and temperature sums, but the ground sampling distance remains the limiting factor for classification accuracy. Next, we will fuse our automatically generated vegetation maps with hydrological information derived from water-level measurements to generate high-resolution maps of the GHG emissions of peatlands.

How to cite: Trouillier, M., Pönisch, D. L., Husting, T. J., Rossa, H., Bergheim, M., Couwenberg, J., and Jurasinski, G.: Automatic Vegetation Mapping in Peatlands Using Drone Imagery and Ecologically Informed Machine Learning, EGU General Assembly 2025, Vienna, Austria, 27 Apr–2 May 2025, EGU25-16456, https://doi.org/10.5194/egusphere-egu25-16456, 2025.