EGU24-19017, updated on 11 Mar 2024
https://doi.org/10.5194/egusphere-egu24-19017
EGU General Assembly 2024
© Author(s) 2024. This work is distributed under
the Creative Commons Attribution 4.0 License.

Comprehensive Hybrid Approach for LULC classification using GEE-API and ML

Vinay Shivamurthy1, Satyam Pawale2, Yash Chandrashekar Shetty3, and Sudarshan T Bhat3
  • 1Visvesvaraya Technological University, Alvas Institute of Engineering and Technology, Civil Engineering, Moodbidri, India
  • 2Visvesvaraya Technological University, Alvas Institute of Engineering and Technology, Artificial Intelligence and Machine Learning, Moodbidri, India
  • 3Visvesvaraya Technological University, Alvas Institute of Engineering and Technology, Computer Science and Engineering, Moodbidri, India
This research offers a new hybrid strategy for Land Use and Land Cover (LULC) classification that overcomes the constraints of k-Nearest Neighbors (KNN) through the use of Support Vector Machines (SVMs). Our study, which uses the Google Earth Engine (GEE) API in conjunction with Colab, focuses on custom preprocessing that enhances data richness and context. To enrich the data obtained from satellite imagery, this preprocessing includes feature scaling and the fusion of spatial, temporal, spectral, and radiometric information: spatial and temporal compositing, spectral fusion, and radiometric calibration. Our workflow uses Rasterio for satellite imagery and Shapely for vector data. GeoPandas enables smooth management of geographic data in the GeoJSON format, strengthening compatibility with Google Earth Engine, while Digital Elevation Models (DEMs) and Landgren software enrich the LULC analysis.
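The preprocessing steps described above — fusing spectral, temporal, and elevation layers into one feature table and scaling the features — can be sketched as follows. This is a minimal illustration using NumPy and scikit-learn with synthetic arrays standing in for the satellite bands, temporal composite, and DEM; the array shapes and value ranges are illustrative assumptions, not taken from the abstract.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

# Synthetic stand-ins for three data sources (illustrative values only):
# spectral bands, a temporal composite layer, and a DEM tile.
rng = np.random.default_rng(0)
h, w = 64, 64
spectral = rng.uniform(0, 10000, size=(4, h, w))   # 4 spectral bands (reflectance)
temporal = rng.uniform(-1, 1, size=(1, h, w))      # temporal composite (e.g. an index)
dem = rng.uniform(0, 2000, size=(1, h, w))         # elevation in metres

# Data fusion: stack all layers into one feature cube, then flatten
# to a (pixels x features) table ready for a classifier.
fused = np.concatenate([spectral, temporal, dem], axis=0)
X = fused.reshape(fused.shape[0], -1).T            # shape (4096, 6)

# Feature scaling: zero mean, unit variance per feature, so that layers
# with very different ranges (reflectance vs. metres) are comparable.
X_scaled = StandardScaler().fit_transform(X)
print(X_scaled.shape)
```

In practice the layers would be read with Rasterio and the vector training polygons with GeoPandas/Shapely, but the fusion-then-scale pattern is the same.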
The hybrid approach addresses k-Nearest Neighbours (KNN) inefficiencies by incorporating Support Vector Machines (SVMs). The drawbacks of KNN, including computational intensity, sensitivity to irrelevant features, susceptibility to noise, and the need for careful hyperparameter selection, are mitigated by leveraging SVM's strengths. SVM, valued for its computational efficiency, robustness to noise and outliers, and relevance-driven decision-boundary learning, is an effective complement to KNN. The combined approach encompasses preprocessing with SVM to enhance data quality, learning the decision boundary with SVM, and selectively applying KNN in localized regions of interest. Continual refinement of the hybrid model through validation enables a balanced use of SVM's robustness and KNN's flexibility. The proposed hybrid technique is a promising option for improving the efficiency and performance of LULC classification tasks, catering to the specific characteristics of the dataset and the analysis goals.
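One plausible reading of "learning the decision boundary with SVM, and selectively applying KNN in localized regions of interest" is sketched below with scikit-learn: the SVM classifies every pixel, and KNN re-classifies only the pixels where the SVM's top-class probability is low. The synthetic data, the 0.6 uncertainty threshold, and the choice of k=5 are all illustrative assumptions, not details given in the abstract.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

# Synthetic pixel features standing in for the fused LULC feature table.
X, y = make_classification(n_samples=2000, n_features=6, n_informative=4,
                           n_classes=4, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Step 1: SVM learns the global decision boundary.
svm = SVC(kernel="rbf", probability=True, random_state=0).fit(X_tr, y_tr)
proba = svm.predict_proba(X_te)
pred = svm.classes_[proba.argmax(axis=1)]

# Step 2: apply KNN only in localized regions of interest -- interpreted
# here as test pixels where the SVM is uncertain (low top-class probability).
uncertain = proba.max(axis=1) < 0.6  # illustrative threshold
knn = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
if uncertain.any():
    pred[uncertain] = knn.predict(X_te[uncertain])

accuracy = (pred == y_te).mean()
print(f"hybrid accuracy: {accuracy:.3f}, KNN applied to {uncertain.sum()} pixels")
```

This keeps the expensive neighbour search confined to a small fraction of pixels, which is the efficiency argument the hybrid design relies on.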
Python libraries (Folium, Matplotlib, Seaborn) enable integrated visualization, allowing users to produce distinctive views tailored to the requirements of specific remote-sensing applications. Folium produces interactive geographical maps, Matplotlib delivers configurable static plots, and Seaborn focuses on statistical data visualization. This combination supports thorough exploration of complex satellite image collections through a variety of visualization approaches.
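As a small example of the static-plot side of this toolchain, the sketch below renders a classified map and a per-class pixel count with Matplotlib. The class legend and the random classified raster are hypothetical placeholders for a real LULC result; Folium and Seaborn would be layered on top in the same way for interactive maps and statistical plots.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend; no display required
import matplotlib.pyplot as plt

# Hypothetical classified raster: one integer class code per pixel.
rng = np.random.default_rng(0)
classes = ["water", "forest", "cropland", "built-up"]  # illustrative legend
lulc = rng.integers(0, len(classes), size=(64, 64))

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))

# Static class map: one colour per LULC class.
im = ax1.imshow(lulc, cmap="tab10", vmin=0, vmax=len(classes) - 1)
ax1.set_title("Classified LULC map")
fig.colorbar(im, ax=ax1, ticks=range(len(classes)))

# Class-area summary: pixel counts per class.
counts = np.bincount(lulc.ravel(), minlength=len(classes))
ax2.bar(classes, counts)
ax2.set_title("Pixels per class")

fig.tight_layout()
fig.savefig("lulc_summary.png")
print(counts.sum())
```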
Overall, this hybrid method, aided by improved preprocessing, data fusion, and visualization tools, presents a promising strategy for improving the efficiency and effectiveness of LULC classification while adapting to the particular characteristics of the dataset and the analysis objectives.
 
Keywords: LULC, Hybrid classification, SVM, KNN, Data fusion, Geospatial analysis, Visualization

How to cite: Shivamurthy, V., Pawale, S., Chandrashekar Shetty, Y., and Bhat, S. T.: Comprehensive Hybrid Approach for LULC classification using GEE-API and ML, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-19017, https://doi.org/10.5194/egusphere-egu24-19017, 2024.