EGU25-4898, updated on 14 Mar 2025
https://doi.org/10.5194/egusphere-egu25-4898
EGU General Assembly 2025
© Author(s) 2025. This work is distributed under
the Creative Commons Attribution 4.0 License.
Oral | Thursday, 01 May, 10:45–10:55 (CEST)
 
Room 1.14
Synchronous monitoring of maize phenology stages and leaf area index by integrating Vision Transformer and ResNetV2_34d
Haoze Zhang1, Wenzhi Zeng1, Tao Ma1, Jing Huang1, Yi Liu1, Zhipeng Ren2, and Chang Ao3
  • 1Hohai University, College of Agricultural Science and Engineering, Nanjing, China
  • 2Heilongjiang Academy of Land Reclamation Sciences, Harbin, China
  • 3Wuhan University, School of Water Resources and Hydropower Engineering, Wuhan, China

Abstract

Crop growth phenology and leaf area index (LAI) are essential monitoring indicators in precision agriculture, playing a key role in crop management, yield prediction, and assessing responses to environmental change. Traditional agricultural monitoring methods are constrained by low temporal and spatial resolution. This study proposes a synchronous monitoring model (SMM) for maize phenological stages and LAI using Unmanned Aerial Vehicle (UAV) imagery, leveraging deep learning to improve inversion accuracy. Specifically, the input variables include multispectral images, thermal infrared images, solar radiation (SRT), evapotranspiration (ETP), and effective accumulated temperature (Tsum). To extract image features, the Vision Transformer (ViT) and ResNetV2_34d models were employed. These deep learning models exploit the spatial information within the images, significantly improving the prediction accuracy of crop phenology (BBCH) and LAI. The results demonstrate that the SMM outperforms traditional methods, achieving substantial improvements in BBCH and LAI inversion accuracy. By combining a deep convolutional neural network (CNN) with self-attention mechanisms, the ViT captures long-range dependencies in remote sensing images, while ResNetV2_34d enhances the model's ability to extract fine-grained features. Furthermore, the SMM exhibits superior robustness in spatial information extraction and feature fusion. This study presents a deep learning framework for crop growth monitoring that integrates remote sensing data and climatic factors, enabling more precise agricultural production management and contributing to sustainable agricultural development.
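The abstract describes fusing image-derived features (from ViT and ResNetV2_34d) with climate covariates (SRT, ETP, Tsum) and predicting two targets at once: a BBCH phenology stage (classification) and LAI (regression). A minimal NumPy sketch of that late-fusion, two-head idea is given below; all dimensions, weights, and the concatenation scheme are illustrative assumptions, not the authors' actual architecture, and the image-feature vector stands in for pooled ViT/ResNet embeddings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not taken from the abstract)
D_IMG = 8    # pooled image-feature size (stand-in for ViT / ResNetV2_34d embeddings)
D_CLIM = 3   # climate covariates: SRT, ETP, Tsum
N_BBCH = 5   # number of phenology classes in this toy example
H = 16       # shared hidden-layer width

def fuse_and_predict(img_feat, clim, W_shared, W_bbch, W_lai):
    """Late fusion: concatenate image features with climate covariates,
    then branch into a BBCH classification head and an LAI regression head."""
    x = np.concatenate([img_feat, clim])   # (D_IMG + D_CLIM,)
    h = np.tanh(W_shared @ x)              # shared hidden representation
    logits = W_bbch @ h                    # BBCH class scores
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                   # softmax over phenology stages
    lai = float(W_lai @ h)                 # scalar LAI estimate
    return probs, lai

# Random toy weights and inputs, just to show the shapes flowing through
W_shared = rng.normal(size=(H, D_IMG + D_CLIM))
W_bbch = rng.normal(size=(N_BBCH, H))
W_lai = rng.normal(size=(H,))

img_feat = rng.normal(size=D_IMG)    # would come from the image backbones
clim = np.array([18.5, 4.2, 650.0])  # illustrative SRT, ETP, Tsum values

probs, lai = fuse_and_predict(img_feat, clim, W_shared, W_bbch, W_lai)
```

In practice the shared layer and both heads would be trained jointly (e.g. cross-entropy loss for BBCH plus a regression loss for LAI), which is what lets the two tasks be monitored synchronously from one set of fused features.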


Key words: Crop growth phenology; Leaf area index; Deep learning; Vision Transformer; Precision agriculture

How to cite: Zhang, H., Zeng, W., Ma, T., Huang, J., Liu, Y., Ren, Z., and Ao, C.: Synchronous monitoring of maize phenology stages and leaf area index by integrating Vision Transformer and ResNetV2_34d, EGU General Assembly 2025, Vienna, Austria, 27 Apr–2 May 2025, EGU25-4898, https://doi.org/10.5194/egusphere-egu25-4898, 2025.