- 1Ghent University, Faculty of Bioscience Engineering, Plants and Crops, Belgium (ambroos.vanpoucke@ugent.be, wouter.maes@ugent.be)
- 2Ghent University, Faculty of Bioscience Engineering, Data Analysis and Mathematical Modelling, Belgium (jan.verwaeren@ugent.be)
Advancements in sensing technology and in machine and deep learning have expanded UAV remote sensing applications in agriculture. Most of these applications rely on supervised techniques, but generalization remains a critical and underexplored challenge. Agricultural datasets often exhibit variability across fields, sensors, crops and growth stages. While models such as convolutional neural networks (CNNs) perform well when trained on millions of samples, this approach is impractical with UAV-based agricultural data. This suggests that a location-specific, unsupervised approach might be more effective.
This study proposes a generally applicable method to map weed densities in row crops using high-resolution RGB UAV data. The workflow starts with vegetation masking based on the Excess Green (ExG) index, followed by a novel row detection model that separates intra-row from inter-row vegetation. Pseudo-labels generated from this step are used to train the CNN segmentation model DeepLabv3.
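To illustrate the masking and pseudo-labelling idea, the sketch below shows one possible implementation in Python; the function names, the fixed ExG threshold, and the assumption that row detection yields a binary row mask are illustrative choices, not the authors' actual code.

```python
# Minimal sketch of ExG-based vegetation masking and pseudo-labelling,
# assuming an RGB orthomosaic as a float array in [0, 1]. Names and
# threshold values are hypothetical.
import numpy as np

def excess_green(rgb):
    """Excess Green index on chromatic coordinates: ExG = 2g - r - b."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    total = r + g + b + 1e-8
    r, g, b = r / total, g / total, b / total
    return 2 * g - r - b

def vegetation_mask(rgb, threshold=0.1):
    """Binary vegetation mask from a (hypothetical) fixed ExG threshold."""
    return excess_green(rgb) > threshold

def pseudo_labels(veg_mask, row_mask):
    """Pseudo-labels: 0 = background, 1 = intra-row (crop), 2 = inter-row (weed)."""
    labels = np.zeros(veg_mask.shape, dtype=np.uint8)
    labels[veg_mask & row_mask] = 1    # vegetation on detected crop rows
    labels[veg_mask & ~row_mask] = 2   # vegetation between rows
    return labels
```

Pseudo-labels of this form could then be used to train an off-the-shelf semantic segmentation network, for example torchvision's deeplabv3_resnet50, in place of manual annotations.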
The method was applied to 12 maize datasets collected at multiple locations in Belgium, at different growth stages, and with three different UAV cameras, resulting in a range of ground sampling distances (GSD). To further validate the model, it was also applied to the public sugar beet dataset PhenoBench, covering three dates. Model performance was evaluated against manually annotated ground truth segmentation maps from each field (n = 50).
Semantic segmentation of crops achieved consistent mean Intersection over Union (IoU) values exceeding 0.7 (F1-score > 0.89). Weed detection performance was comparatively low at very early growth stages (IoU > 0.4, F1-score > 0.6) due to the small plant sizes, but improved as weeds grew, with IoU reaching 0.63 (F1-score = 0.83) at later stages. The model performed equally well on maize and on sugar beet.
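For reference, the reported metrics follow the standard per-class, pixel-level definitions (TP, FP, FN denote true positives, false positives and false negatives); mean values aggregated over classes and images need not obey a fixed relation between the two.

```latex
\mathrm{IoU} = \frac{TP}{TP + FP + FN}, \qquad
F_1 = \frac{2\,TP}{2\,TP + FP + FN}
```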
Despite these early-season limitations, the lower weed detection accuracy had minimal impact on field-level weed density maps, which are primarily used for relative density comparisons to guide site-specific herbicide applications. Regression analyses of predicted crop and weed areas against ground truth annotations showed strong linear relationships. Early-season datasets showed a slight underestimation of weed area, whereas later-season datasets showed a near-perfect 1:1 relationship (R² > 0.80). GSD proved to be inversely related to accuracy: accuracy was highest at GSDs below 1 mm/pixel and decreased rapidly above 3 mm/pixel.
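A minimal sketch of such an area regression is given below; the per-plot area arrays and the use of an ordinary least-squares fit are assumptions for illustration, not the authors' analysis code.

```python
# Regress predicted weed (or crop) area against manually annotated area
# per evaluation plot, reporting slope, intercept and R^2.
import numpy as np

def area_regression(pred_areas, truth_areas):
    pred = np.asarray(pred_areas, dtype=float)
    truth = np.asarray(truth_areas, dtype=float)
    slope, intercept = np.polyfit(truth, pred, 1)   # least-squares line
    fitted = slope * truth + intercept
    ss_res = np.sum((pred - fitted) ** 2)
    ss_tot = np.sum((pred - pred.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    return slope, intercept, r2
```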
Overall, the proposed approach effectively generates accurate field-level weed density maps, offering a robust tool for precision weed management in agriculture.
How to cite: Van Poucke, A., Verwaeren, J., and Maes, W.: Generally applicable method for unsupervised weed detection in row crops using UAV-based high-resolution RGB imagery, EGU General Assembly 2025, Vienna, Austria, 27 Apr–2 May 2025, EGU25-15760, https://doi.org/10.5194/egusphere-egu25-15760, 2025.