Domain adaptation of deep learning segmentation model for agricultural burn area detection using Hi-Resolution Sentinel-2 observations: A case study in Punjab, India
- 1 University of Tokyo, Atmosphere and Ocean Research Institute, Japan (anandanamika.bhu@gmail.com)
- 2 University of Tokyo, Atmosphere and Ocean Research Institute, Japan (imasu@aori.u-tokyo.ac.jp)
- 3 Nara Women's University, Japan
- 4 Japan Agency for Marine-Earth Science and Technology (JAMSTEC), Japan
In India, agricultural burning, also known as crop residue burning, is a significant source of air pollution, but quantifying PM2.5 emissions from these open burns has been a persistent challenge for researchers. Global fire databases rely heavily on MODIS (Moderate Resolution Imaging Spectroradiometer) for burn area information; although MODIS provides decades of fire activity data, its coarse spatial resolution (500 m–1 km), together with the smog and haze prevalent over Punjab, leads to inaccurate detection of fires and burn areas. In our previous work, we developed and implemented a deep learning-based segmentation model combined with Sentinel-2 imagery for accurate fire estimates from open burns; here we discuss our progress on region-specific domain adaptation of the model over Punjab.
Initially, the model was trained on Sentinel-2 data from Portugal, using burn area records from the ICNF (Portuguese Institute for Nature Conservation and Forests) for 2016–2017 as reference data, and was subsequently tested on Punjab through a transfer learning approach. Minimizing false detections by the pre-trained model now requires region-specific adaptation; however, inadequate monitoring and the lack of reference data in Punjab pose major challenges for fine-tuning. We therefore use ground-level geolocated images collected during our field observation campaign in Punjab, together with Sentinel-2 and Google Earth data, as inputs to the pre-trained model. For the year 2020, over 1,400 sites were identified as potential burn areas and digitized as vector polygons, with water bodies and urban areas carefully excluded to prevent false detections. The refined model processes this annotated data along with Sentinel-2 spectral bands B03 (green), B8A (near infrared), and B11 (shortwave infrared). Another significant challenge for conventional fire detection methods is identifying small burn areas. To address this, the training sample deliberately included several small burned areas (up to 0.0256 hectares) to enhance the detection of small burns that traditional methods often miss. For validation, we intend to use on-site geolocated images taken during the campaign at different time periods. Model performance is evaluated using the Dice and IoU (intersection over union) scores. Currently, the pre-trained model achieves a Dice score of 0.62 and an IoU of 0.59; through fine-tuning we aim to raise these scores to 0.85–0.90.
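For readers unfamiliar with these metrics, the sketch below shows how Dice and IoU scores can be computed for binary burn masks. It is a minimal illustration, not our evaluation code, and it assumes the model output and the reference polygons have already been rasterized to binary arrays of matching shape; the array names and the 0.5 threshold in the usage comment are illustrative assumptions.

```python
import numpy as np

def dice_and_iou(pred_mask: np.ndarray, ref_mask: np.ndarray, eps: float = 1e-7):
    """Compute Dice and IoU scores for binary burn/no-burn masks.

    pred_mask, ref_mask: boolean (or 0/1) arrays of identical shape, where
    True/1 marks burned pixels. eps avoids division by zero when both masks
    are empty.
    """
    pred = pred_mask.astype(bool)
    ref = ref_mask.astype(bool)

    intersection = np.logical_and(pred, ref).sum()
    union = np.logical_or(pred, ref).sum()

    dice = (2.0 * intersection + eps) / (pred.sum() + ref.sum() + eps)
    iou = (intersection + eps) / (union + eps)
    return dice, iou

# Illustrative usage (hypothetical names): threshold a model probability map
# at 0.5 and compare it against a rasterized reference polygon mask.
# prob_map = model.predict(sentinel2_chip)   # shape (H, W), values in [0, 1]
# pred_mask = prob_map > 0.5
# dice, iou = dice_and_iou(pred_mask, reference_mask)
# print(f"Dice = {dice:.2f}, IoU = {iou:.2f}")
```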
As we await the outcomes of our ongoing analysis, we are keen to see how deep learning combined with high-resolution Sentinel-2 multispectral imagery can be used to fill gaps in burn area assessments. Our multi-year fire estimates and spatial burn area patterns for Punjab will help quantify the emission contribution from burning at local and regional scales and improve our understanding of how emissions from the fields affect air quality in nearby areas. This enhanced methodology aims to set the stage for a high-resolution fire emission inventory dedicated to crop residue burning. The findings from this research will contribute significantly to understanding the impact of agricultural burning on air quality and may inform future studies and policy decisions.
How to cite: Anand, A., Imasu, R., Muramatsu, K., and Patra, P.: Domain adaptation of deep learning segmentation model for agricultural burn area detection using Hi-Resolution Sentinel-2 observations: A case study in Punjab, India, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-16351, https://doi.org/10.5194/egusphere-egu24-16351, 2024.