A hybrid framework for improved crop mapping over a large scale by combining pixel-based and object-based approaches
- ¹Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences, Beijing, China (weboffice@igsnrr.ac.cn)
- ²Faculty of Science and Engineering, University of Nottingham Ningbo China, China
- ³Faculty of Engineering, University of Nottingham, United Kingdom
Remote sensing technology offers unique possibilities for monitoring agricultural systems, providing accurate information such as crop type distribution, planting area, and crop rotation. However, previous efforts have generally extracted crop information from remote sensing imagery using pixel-based classification strategies, without considering the spatial context of objects. Incorporating object-based image analysis into crop type mapping could improve accuracy and reduce the noise and uncertainty inherent in pixel-based methods. Here we combine the advantages of pixel-based and object-based approaches to further improve crop type mapping over Northeast China, using Sentinel-2 imagery, simple non-iterative clustering (SNIC), a random forest classifier, and the Google Earth Engine platform. The results showed that, over the majority of cropland, object-based mapping achieved higher accuracies and reduced obvious errors at the parcel level; in the Sanjiang Plain, overall accuracy improved by 0.5% and the Kappa coefficient by 9%. However, object-based methods can overlook soybean and maize intercropping in small parcels when clustering objects. Therefore, we adopted an integration of pixel-based and object-based approaches, adapted to different landscapes and patch areas, to generate a crop type map of unprecedented accuracy for Northeast China.
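For illustration, a minimal sketch of how such a hybrid pixel/object workflow could look in the Earth Engine Python API is given below. The training-sample asset ID, band list, date range, SNIC parameters, and parcel-size threshold are all hypothetical placeholders, not the settings used in this study.

```python
import ee

ee.Initialize()

# Region of interest: a placeholder rectangle in the Sanjiang Plain.
roi = ee.Geometry.Rectangle([132.0, 46.5, 134.0, 47.5])

bands = ['B2', 'B3', 'B4', 'B8', 'B11', 'B12']  # assumed Sentinel-2 bands
s2 = (ee.ImageCollection('COPERNICUS/S2_SR_HARMONIZED')
      .filterBounds(roi)
      .filterDate('2023-05-01', '2023-10-01')
      .median()
      .select(bands))

# Labeled crop samples with a 'crop' class property (hypothetical asset).
points = ee.FeatureCollection('users/example/crop_samples')

# --- Pixel-based branch: random forest on per-pixel spectra. ---
pixel_samples = s2.sampleRegions(collection=points, properties=['crop'],
                                 scale=10)
pixel_rf = ee.Classifier.smileRandomForest(numberOfTrees=100).train(
    features=pixel_samples, classProperty='crop', inputProperties=bands)
pixel_map = s2.classify(pixel_rf)

# --- Object-based branch: SNIC segmentation, then RF on object means. ---
snic = ee.Algorithms.Image.Segmentation.SNIC(
    image=s2, size=10, compactness=1, connectivity=8)
obj_features = snic.select([b + '_mean' for b in bands])
obj_samples = obj_features.sampleRegions(collection=points,
                                         properties=['crop'], scale=10)
obj_rf = ee.Classifier.smileRandomForest(numberOfTrees=100).train(
    features=obj_samples, classProperty='crop',
    inputProperties=obj_features.bandNames())
object_map = obj_features.classify(obj_rf)

# --- Hybrid merge: fall back to the pixel-based label in small parcels,
# where segmentation can miss intercropping; use the object map elsewhere.
# Parcel-size proxy: pixels per SNIC cluster (threshold is illustrative).
cluster_size = snic.select('clusters').connectedPixelCount(maxSize=256)
hybrid_map = object_map.where(cluster_size.lt(25), pixel_map)
```

The final `where` step mirrors the rationale above: object-level labels dominate over homogeneous cropland, while pixel-level labels are retained in small clusters where intercropped soybean and maize would otherwise be absorbed into a single object.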
How to cite: Di, Y., Dong, J., Fu, P., and Marsh, S.: A hybrid framework for improved crop mapping over a large scale by combining pixel-based and object-based approaches, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-9951, https://doi.org/10.5194/egusphere-egu24-9951, 2024.