EGU26-5216, updated on 13 Mar 2026
https://doi.org/10.5194/egusphere-egu26-5216
EGU General Assembly 2026
© Author(s) 2026. This work is distributed under
the Creative Commons Attribution 4.0 License.
Oral | Thursday, 07 May, 10:05–10:15 (CEST)
 
Room 2.23
MS²-Net: A deep learning framework for high-throughput assessment of wheat emergence-stage plant density using multi-altitude multispectral UAV imagery
Mengshuai Wang1, Linjia Yao2, Bin Chen1, Zhiming Xia1, Bo Pang1, Zhijian He1, Yingnan Wei1, Genghong Wu1, Qiang Yu1, and Gang Zhao1
  • 1State Key Laboratory of Soil and Water Conservation and Desertification Control, College of Soil and Water Conservation Science and Engineering, Northwest A&F University, Yangling, Shaanxi 712100, China
  • 2College of Natural Resources and Environment, Northwest A&F University, Yangling, Shaanxi 712100, China

Plant density at the wheat emergence stage is a fundamental structural attribute of agroecosystems, exerting strong control over early competition, resource-use efficiency, and yield formation. While UAV-based counting approaches have been widely explored for visually distinct crops such as maize and cotton, accurate and scalable estimation of wheat seedling density remains challenging because of the plants' small size, high spatial density, and spectral similarity to soil and residue backgrounds. Moreover, existing RGB-based UAV and ground imaging approaches face an inherent trade-off among spatial resolution, spectral sensitivity, and operational efficiency.

Here, we propose MS²-Net (Multi-altitude, Multispectral Seedling Network), a high-throughput Earth-observation framework that integrates multi-altitude multispectral UAV observations with deep learning to enable robust estimation of wheat plant density at the emergence stage. Field experiments were conducted across three major wheat-growing regions in China (Henan, Hebei, and Shaanxi), covering approximately 1,500 plots spanning large variability in sowing density, genotype, and early growth conditions. Multispectral UAV imagery (blue, green, red, red-edge, and near-infrared) was acquired at four flight altitudes (12, 15, 20, and 40 m), enabling systematic evaluation of the trade-off between spatial detail and mapping efficiency. High-resolution smartphone images collected synchronously at plot level provided accurate reference plant counts for model training and validation.
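The altitude-dependent trade-off between spatial detail and mapping efficiency follows directly from the standard ground-sampling-distance (GSD) relation for a nadir-looking camera. As a hedged illustration only, the sketch below uses placeholder sensor parameters (pixel pitch and focal length are assumptions, not the study's actual multispectral camera):

```python
def gsd_cm(altitude_m: float,
           pixel_pitch_um: float = 3.75,
           focal_length_mm: float = 5.5) -> float:
    """Ground sampling distance in cm/pixel for a nadir camera.

    GSD = altitude * pixel_pitch / focal_length.
    Sensor values here are illustrative placeholders, not the
    parameters of the camera used in the study.
    """
    return altitude_m * (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3) * 100.0

# GSD grows linearly with altitude: lower flights resolve small
# seedlings better, higher flights cover more area per image.
for alt in (12, 15, 20, 40):
    print(f"{alt:>2} m -> {gsd_cm(alt):.2f} cm/px")
```

Whatever the true sensor constants, the ratio is fixed by geometry: flying at 40 m coarsens the pixel footprint by a factor of 40/12 ≈ 3.3 relative to 12 m, which is the resolution-versus-efficiency axis the four-altitude design probes.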

All UAV data were radiometrically calibrated to surface reflectance and used to derive conventional vegetation indices (NDVI, GNDVI, NDRE, OSAVI, and a red-edge chlorophyll index) for spectral interpretability. Wheat plant density was estimated with a deep regression framework built on an EfficientNet-B6 backbone and enhanced with spectral-aware adaptation, spatial attention, and scale-consistent feature learning, allowing MS²-Net to exploit both multispectral information and multi-scale spatial patterns. In five-fold cross-validation, MS²-Net achieved robust density estimation (R² = 0.86, RMSE = 37.20 plants m⁻², averaged across regions and flight altitudes), with the red-edge and near-infrared bands contributing substantially to model stability across observation scales.
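The vegetation indices listed above follow standard band-ratio definitions, and the reported accuracy figures are the usual R² and RMSE. A minimal NumPy sketch of both, assuming per-pixel surface-reflectance arrays and OSAVI's common soil-adjustment constant L = 0.16 (the study's exact index formulations may differ):

```python
import numpy as np

def vegetation_indices(blue, green, red, red_edge, nir):
    """Standard band-ratio indices from surface-reflectance arrays.

    Definitions follow common usage; `blue` is accepted for interface
    completeness but not used by these particular indices.
    """
    eps = 1e-9  # guard against division by zero over dark pixels
    return {
        "NDVI":  (nir - red) / (nir + red + eps),
        "GNDVI": (nir - green) / (nir + green + eps),
        "NDRE":  (nir - red_edge) / (nir + red_edge + eps),
        "OSAVI": (nir - red) / (nir + red + 0.16),   # soil-adjusted, L = 0.16
        "CIre":  nir / (red_edge + eps) - 1.0,       # red-edge chlorophyll index
    }

def r2_rmse(y_true, y_pred):
    """Coefficient of determination and root-mean-square error."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot, float(np.sqrt(np.mean((y_true - y_pred) ** 2)))
```

In a workflow like the one described, `r2_rmse` would be applied to reference counts from the plot-level smartphone imagery versus model predictions, pooled across the five cross-validation folds.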

Results demonstrate that multi-altitude multispectral UAV observations strike a practical balance among spatial resolution, spectral sensitivity, and survey efficiency, outperforming both ground-based imaging and RGB-only UAV approaches for early wheat stand assessment. By enabling rapid, spectrally informed plant-density mapping at field scale, MS²-Net offers a scalable pathway toward operational agroecosystem monitoring, high-throughput phenotyping, and precision crop management under real field conditions.

How to cite: Wang, M., Yao, L., Chen, B., Xia, Z., Pang, B., He, Z., Wei, Y., Wu, G., Yu, Q., and Zhao, G.: MS²-Net: A deep learning framework for high-throughput assessment of wheat emergence-stage plant density using multi-altitude multispectral UAV imagery, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-5216, https://doi.org/10.5194/egusphere-egu26-5216, 2026.