EGU26-2432, updated on 13 Mar 2026
https://doi.org/10.5194/egusphere-egu26-2432
EGU General Assembly 2026
© Author(s) 2026. This work is distributed under
the Creative Commons Attribution 4.0 License.
Oral | Thursday, 07 May, 09:55–10:05 (CEST)
 
Room 2.23
An Integrated Multi-Sensor Framework for National-Scale Crop Mapping and Climate-Resilient Agricultural Monitoring
Tarin Paz-Kagan1, Lior Fine1,3, Adi Edri1,2, Avraham Atanelov1,2, Nechama Z. Brickner1, and Offer Rozenstein3
  • 1French Associates Institute for Agriculture and Biotechnology of Dryland, The Jacob Blaustein Institutes for Desert Research, Ben-Gurion University of the Negev, Sede Boqer Campus 8499000, Israel
  • 2Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer Sheva, Israel
  • 3Institute of Soil, Water and Environmental Sciences, Agricultural Research Organization, Volcani Institute, HaMaccabim Road 68, Rishon LeZion 75359, Israel

Accurate, timely, and spatially consistent crop maps are a cornerstone of sustainable agricultural management, climate adaptation, and evidence-based policy. Yet national-scale crop mapping remains challenging in heterogeneous agroecosystems due to fragmented field structures, dynamic land use, and the contrasting spatiotemporal characteristics of annual crops and perennial orchards. Addressing these challenges requires scalable, data-driven approaches that translate advances in Earth observation (EO), data science, and modelling into actionable tools for climate-resilient agroecosystem management. Here we present an integrated, end-to-end crop-mapping framework that synthesizes complementary methodological advances to enable robust, operational monitoring of agricultural systems across space, time, and crop types. Using Israel as a national-scale case study representative of heterogeneous, intensively and extensively managed agroecosystems, the framework links fine-scale field structure, crop phenology, and multi-year dynamics to support decision-making under climatic variability.

First, national cadastral parcel layers are refined into agronomically homogeneous field units using deep-learning-based semantic segmentation (U-Net, DeepLabV3, and SegFormer) and the Segment Anything Model (SAM) foundation model, addressing a critical limitation of registry-based agricultural databases. The U-Net architecture outperformed SegFormer and DeepLabV3, achieving a mean Intersection-over-Union (IoU) of 0.76 with balanced precision and recall. At the national scale, polygon correctness improved from 75.16% to 86.37%, yielding tens of thousands of fields segmented into homogeneous management units. This step substantially improves geometric consistency and the reliability of downstream crop classification and agroecosystem analysis.
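The segmentation comparison above is scored with mean Intersection-over-Union. As an illustration only (not the authors' evaluation code), a minimal NumPy sketch of mean IoU over integer-labelled segmentation maps:

```python
import numpy as np

def mean_iou(pred: np.ndarray, truth: np.ndarray, n_classes: int) -> float:
    """Mean Intersection-over-Union across classes for integer label maps."""
    ious = []
    for c in range(n_classes):
        p, t = pred == c, truth == c
        union = np.logical_or(p, t).sum()
        if union == 0:
            continue  # class absent from both maps; excluded from the mean
        ious.append(np.logical_and(p, t).sum() / union)
    return float(np.mean(ious))
```

For example, with `pred = [[0, 0], [1, 1]]` and `truth = [[0, 1], [1, 1]]`, class 0 scores 1/2 and class 1 scores 2/3, giving a mean IoU of about 0.583.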
Second, a hierarchical, multi-sensor classification strategy integrates Sentinel-1 SAR and Sentinel-2 multispectral time series with phenological metrics and expert-driven feature selection to map agricultural land use and dynamically classify annual field crops across multiple growing seasons. XGBoost achieved the highest land-cover accuracy (OA = 0.909), driven primarily by vegetation, moisture, and chlorophyll-sensitive indices (NDVI, MCARI, NDMI, PGHI). For detailed row-crop mapping, deep learning models outperformed traditional machine learning (TabM, OA = 0.861). Multi-satellite fusion ensured robustness and transferability, yielding an average leave-one-year-out accuracy of 0.833. This integration captures crop rotations, seasonal shifts, and climate-driven phenological gradients, enabling consistent multi-year monitoring in dryland and Mediterranean environments.

Third, perennial orchard systems, often underrepresented in national crop statistics, are mapped using a multimodal fusion approach that combines very-high-resolution (VHR) aerial imagery with multi-temporal Sentinel-1/2 data. Deep learning architectures jointly exploit fine-scale spatial structure and phenological dynamics, achieving the highest performance across all evaluation settings (same-year OA = 0.890 ± 0.009; cross-year OA = 0.881 ± 0.014), with particularly strong gains for early-stage and sparsely vegetated orchards.

Overall, the framework is designed for scalability, interpretability, and operational deployment, demonstrating how multi-modal remote sensing, deep learning, and hierarchical modelling can bridge scientific innovation and real-world agricultural decision-making under climate change.
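The transferability claim rests on leave-one-year-out evaluation: each growing season is held out in turn while the classifier trains on the remaining seasons. A minimal sketch of that protocol and of two of the named spectral indices, assuming NumPy arrays of per-field features, labels, and season years (the `fit`/`predict` callables are illustrative stand-ins, not the study's XGBoost/TabM pipeline):

```python
import numpy as np

def ndvi(nir, red):
    # Normalized Difference Vegetation Index (Sentinel-2: B8 vs. B4)
    return (nir - red) / (nir + red + 1e-9)

def ndmi(nir, swir):
    # Normalized Difference Moisture Index (Sentinel-2: B8 vs. B11)
    return (nir - swir) / (nir + swir + 1e-9)

def leave_one_year_out(features, labels, years, fit, predict):
    """Hold out each season in turn, train on the rest, and
    return per-year overall accuracy as {year: accuracy}."""
    accs = {}
    for y in np.unique(years):
        test = years == y
        model = fit(features[~test], labels[~test])
        accs[int(y)] = float(np.mean(predict(model, features[test]) == labels[test]))
    return accs
```

Any classifier with `fit`/`predict` semantics can be dropped in; averaging the per-year accuracies gives the single transferability score reported in the abstract.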

How to cite: Paz-Kagan, T., Fine, L., Edri, A., Atanelov, A., Brickner, N. Z., and Rozenstein, O.: An Integrated Multi-Sensor Framework for National-Scale Crop Mapping and Climate-Resilient Agricultural Monitoring, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-2432, https://doi.org/10.5194/egusphere-egu26-2432, 2026.