- 1Laboratory of Systems Engineering and Information Technology LISTI, National School of Applied Sciences, Ibnou Zohr University, Agadir, 80000, Morocco.
- 2Applied Geology and Geo-Environment Laboratory, Faculty of Sciences, Ibnou Zohr University, Agadir, 80035, Morocco.
- 3Faculty of Applied Sciences, Ibnou Zohr University, Ait Melloul, Morocco.
- 4Mohammed VI Polytechnic University, International Water Research Institute, Ben Guerir, 43150, Morocco.
- 5Laboratory of Territories, Environment, and Development, Faculty of Human and Social Sciences, Ibn Tofail University, Kenitra, 80035, Morocco.
- 6Mohammed VI Polytechnic University, Citinnov, Rabat, Morocco.
Arid and semi-arid regions are facing more frequent and severe droughts, with annual rainfall often below 200 mm. Large-scale, intensive irrigation further strains these limited water resources. Under these conditions, growers need practical tools to estimate yield and monitor tree health at high spatial detail so they can better manage irrigation and inputs. This work develops and tests an automated, data-driven pipeline for estimating citrus yield at the individual-tree level using UAV imagery and Deep Learning. The pipeline comprises three main components. First, individual trees and orchard rows are segmented using a lightweight Tiny U‑Net model. Second, a CNN-based model predicts tree-level yield from vegetation indices and field measurements. Third, these predictions are validated through detailed fruit sampling.
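The second pipeline component predicts yield from vegetation index maps. The abstract does not name the specific indices, so as an illustration here is a minimal numpy sketch of one common choice, NDVI, plus a per-tree summary over a crown mask; the band names and helper functions are assumptions, not the authors' code.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray, eps: float = 1e-9) -> np.ndarray:
    """Normalized Difference Vegetation Index from NIR and red reflectance bands.

    eps avoids division by zero over water or shadow pixels.
    """
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)

def mean_index_in_crown(index_map: np.ndarray, crown_mask: np.ndarray) -> float:
    """Average an index map over the pixels of one segmented tree crown."""
    return float(index_map[crown_mask].mean())
```

In a pipeline like the one described, `crown_mask` would come from the Tiny U-Net segmentation, and per-crown index statistics would feed the CNN yield model alongside the in-situ measurements.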
The study was conducted in a commercial citrus orchard in a semi-arid region under climate and water stress. High‑resolution UAV imagery was processed into orthomosaics and vegetation index maps, and the Tiny U‑Net was optimized for fast, near real‑time semantic segmentation, enabling precise tree crown delineation and accurate tree and row counts. For yield prediction, the CNN model exploited spatial features from vegetation indices combined with in‑situ data. Validation relied on direct comparison between UAV‑based yield estimates and yields obtained from field sampling and laboratory weighing. Both mean and median yields per tree were computed to capture tree‑level variability. The final dataset, consisting of 34 trees and approximately 340 fruit samples, provided a robust basis for assessing model performance. The Tiny U‑Net segmentation model reached high accuracy, with precision and recall of 94.74% and 94.88%, respectively, and an inference time of 12.55 ms per image tile, showing that the model is suitable for real‑time or on‑board use and can reliably map orchard structure at large scale. Tree and row counts derived from the segmentation achieved an R² greater than 0.99, confirming the robustness of the approach. For yield estimation, the CNN model outperformed other machine learning methods, achieving an R² of 0.88 at tree level. Field validation confirmed the practical usefulness of the pipeline: UAV‑predicted yields closely matched ground‑truth values, with both indicating an average yield of roughly 50 kg per tree. Most trees fell between 40 and 70 kg, and the model's output distribution (mean 50.9 kg, median 51.4 kg) aligned well with these field observations.
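The reported evaluation metrics (pixel-wise precision and recall for segmentation, R² for yield agreement) follow standard definitions and can be sketched in a few lines of numpy; the function names below are illustrative, not taken from the authors' implementation.

```python
import numpy as np

def precision_recall(pred: np.ndarray, truth: np.ndarray) -> tuple[float, float]:
    """Pixel-wise precision and recall for boolean segmentation masks."""
    tp = np.logical_and(pred, truth).sum()
    fp = np.logical_and(pred, ~truth).sum()
    fn = np.logical_and(~pred, truth).sum()
    return float(tp / (tp + fp)), float(tp / (tp + fn))

def r_squared(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Coefficient of determination between observed and predicted per-tree yields."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return float(1.0 - ss_res / ss_tot)
```

For a sketch like this, masks are assumed boolean and non-degenerate (at least one positive pixel in prediction and truth); production code would guard the zero-denominator cases.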
This robust agreement between model outputs and independent field validation data underscores the system's reliability and operational readiness for accurate, tree-level yield mapping. By integrating precise tree segmentation, high-resolution vegetation indices, and rigorously collected ground truth measurements, this study demonstrates that automated yield maps can be produced with sufficient accuracy to support operational decisions in orchards. This offers a cost-effective and scalable tool for precision agriculture, enabling optimized resource allocation, improved harvest planning, and adaptive management under increasing climate stress.
How to cite: Bakas, K., Saddik, A., Dliou, A., Hssaisoune, M., El Hachemy, S., Ait Ichou, H., Hmache, F., El Hafyani, M., Labbaci, A., and Bouchaou, L.: Real-Time UAV-Deep Learning System for Citrus Orchard Structure and Yield Assessment, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-2054, https://doi.org/10.5194/egusphere-egu26-2054, 2026.