- 1 Tel-Hai Academic College, Department of Bioinformatics, Kiryat Shmona, Israel (assafc@migal.org.il)
- 2 Agro-Geo-Informatics Laboratory, MIGAL – Galilee Research Institute for Applied Science, Kiryat Shmona, Israel
- 3 Plant Pathology Laboratory, Northern R&D, MIGAL – Galilee Research Institute for Applied Science, Kiryat Shmona, Israel
Mango inflorescence malformation, caused by Fusarium mangiferae, is a major constraint on sustainable mango production worldwide, leading to severe yield losses, reduced fruit set, and long-term reinfection of orchards. Current mitigation strategies rely on labor-intensive sanitation and fungicide applications, which are costly, environmentally burdensome, and often only partially effective or poorly timed. There is therefore a critical need for scalable, data-driven tools that enable early, accurate, and spatially explicit detection of disease hotspots within orchards.
In this study, we develop and evaluate an automated detection framework that integrates high-resolution Earth observation data with deep learning to identify malformed mango inflorescences at the canopy and tree level. RGB imagery was collected across multiple seasons (2022–2025) using complementary sensing platforms, including UAVs and ground-based imaging, covering three commercially important cultivars (‘Keitt’, ‘Lilly’, and ‘Kent’) and multiple phenological stages. Semantic segmentation models based on an enhanced U-Net architecture with a ResNet decoder were trained to discriminate between healthy and malformed inflorescences at the pixel level, enabling fine-scale disease mapping under heterogeneous field conditions.
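To make the segmentation setup concrete, the following is a minimal sketch of how such a pixel-level model could be assembled and trained. The segmentation_models_pytorch library, the ResNet-34 variant placed as the encoder backbone, the three-class labelling scheme, and all hyperparameters are illustrative assumptions, not the authors' exact configuration.

```python
# Illustrative sketch only: a U-Net-style segmentation model for three classes
# (0 = background, 1 = healthy inflorescence, 2 = malformed inflorescence).
# Library choice, backbone, and hyperparameters are assumptions for illustration.
import torch
import segmentation_models_pytorch as smp

NUM_CLASSES = 3  # assumed labelling scheme

model = smp.Unet(
    encoder_name="resnet34",     # assumed ResNet variant
    encoder_weights="imagenet",  # assumed ImageNet pre-training
    in_channels=3,               # RGB imagery, as described above
    classes=NUM_CLASSES,
)

loss_fn = smp.losses.DiceLoss(mode="multiclass")
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(images: torch.Tensor, masks: torch.Tensor) -> float:
    """One optimisation step on RGB tiles (B, 3, H, W) and label masks (B, H, W)."""
    model.train()
    optimizer.zero_grad()
    logits = model(images)         # (B, NUM_CLASSES, H, W)
    loss = loss_fn(logits, masks)
    loss.backward()
    optimizer.step()
    return loss.item()
```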
Results from the 2025 season demonstrate that millimetric ground-based imagery (0.19–0.68 mm pixel size) enables highly accurate detection of malformation at peak flowering, with average precision exceeding 90% and F1-scores above 0.85 for the disease-sensitive ‘Keitt’ and ‘Lilly’ cultivars. Importantly, incorporating multi-year data and balancing validation datasets significantly improved model robustness and generalization. For the first time, meaningful detection performance from UAV imagery was achieved (up to 71% and 87% precision for malformed and healthy inflorescences, respectively), indicating strong potential for operational orchard-scale monitoring. Cross-cultivar evaluation further revealed partial generalization to ‘Kent’, a cultivar unseen during training, highlighting both the promise and current limits of model transferability.
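As an illustration of how scores of this kind can be computed from predicted and reference masks, the sketch below derives per-class pixel precision, recall, and F1. The class indices and the use of scikit-learn are assumptions and do not reproduce the study's exact evaluation protocol.

```python
# Illustrative sketch: per-class pixel-wise precision, recall, and F1
# from a predicted and a reference segmentation mask.
# Class indices (1 = healthy, 2 = malformed) are assumed.
import numpy as np
from sklearn.metrics import precision_recall_fscore_support

def pixel_scores(pred_mask: np.ndarray, true_mask: np.ndarray) -> dict:
    """Return {class_index: (precision, recall, f1)} over flattened pixel labels."""
    labels = [1, 2]  # healthy, malformed
    p, r, f1, _ = precision_recall_fscore_support(
        true_mask.ravel(), pred_mask.ravel(), labels=labels, zero_division=0
    )
    return {c: (p[i], r[i], f1[i]) for i, c in enumerate(labels)}
```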
Beyond detection accuracy, this work delivers key operational insights: disease recognition is highly sensitive to spatial resolution and phenological timing, and segmentation-based approaches provide a strong foundation for precision sanitation, infestation quantification, and decision support. Future work will focus on instance segmentation for whole-inflorescence detection, early-stage disease identification prior to peak bloom, improved cross-cultivar generalization, and integration with UAV- and robot-assisted sanitation workflows. Overall, the study demonstrates how AI-driven Earth observation can support sustainable agroecosystem management by directing sanitation efforts to affected orchard zones, verifying their effectiveness, and enabling disease monitoring during periods of limited field activity, ultimately reducing chemical inputs, labor demands, and pathogen spread.
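As a simple illustration of how segmentation output could feed precision sanitation and infestation quantification, the sketch below aggregates the malformed-pixel fraction per tree and flags trees above a threshold. The per-tree masks, class indices, and the 10% threshold are hypothetical and are not values from the study.

```python
# Illustrative sketch: turning per-tree segmentation masks into an infestation
# score used to prioritise sanitation. Threshold and cropping scheme are assumed.
import numpy as np

def infestation_ratio(mask: np.ndarray) -> float:
    """Malformed pixels as a fraction of all inflorescence pixels (classes 1 and 2)."""
    inflorescence = np.isin(mask, (1, 2)).sum()
    malformed = (mask == 2).sum()
    return float(malformed) / inflorescence if inflorescence else 0.0

def flag_trees(tree_masks: dict[str, np.ndarray], threshold: float = 0.10) -> list[str]:
    """Return tree IDs whose infestation ratio exceeds the (assumed) threshold."""
    return [tree_id for tree_id, m in tree_masks.items()
            if infestation_ratio(m) >= threshold]
```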
How to cite: Chen, A., Nagar, Y., and Dafny-Yelin, M.: Data-Driven Detection of Mango Inflorescence Malformation Using Remote Sensing and Deep Learning for Precision Agroecosystem Management, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-20230, https://doi.org/10.5194/egusphere-egu26-20230, 2026.