EGU25-16843, updated on 15 Mar 2025
https://doi.org/10.5194/egusphere-egu25-16843
EGU General Assembly 2025
© Author(s) 2025. This work is distributed under
the Creative Commons Attribution 4.0 License.
Oral | Wednesday, 30 Apr, 09:45–09:55 (CEST)
 
Room -2.21
Explainable Machine Learning for Forest Fire Detection with Remote Sensing for Effective Rescue Planning
Octavian Dumitru, Chandrabali Karmakar, and Shivam Goyal
  • German Aerospace Center, Remote Sensing Technology Institute, EO Data Science Department, Germany

In the present decade, forest fires have become more common than ever [1]. Efficient strategies for coping with fire situations and for damage assessment require an efficient automatic forest fire detection model. In this research, we propose an unsupervised eXplainable machine learning model to assess the severity of forest fires with remote sensing data. The model, Latent Dirichlet Allocation (LDA), is a Bayesian generative model capable of generating interpretable visualizations; its predictions are uncertainty-quantifiable and explainable [2]. We do not need labelled data to train the model. A further advantage of the model is that it is simple to combine any kind of input data (for example, UAV images or wind speed information). In the scope of this contribution, we use Sentinel-2 spectral bands to compute indices indicating the severity of fire [1]. The uncertainty of each prediction of the model is computed to ascertain its robustness. As a use case, we have chosen the recent forest fire incident in Los Angeles, USA [6].
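As a rough illustration of this idea (a sketch, not the authors' implementation), the Python snippet below quantizes one Sentinel-2 band into discrete "words", treats each image patch as a bag-of-words document, fits an LDA model, and uses the entropy of each patch's topic mixture as a simple per-prediction uncertainty score. The band array, patch size, and vocabulary size are illustrative assumptions.

import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

def patches_to_bow(band, patch=16, n_words=64):
    # Quantize the band into n_words intensity bins ("words") and count
    # word occurrences per non-overlapping patch ("document").
    q = np.digitize(band, np.linspace(band.min(), band.max(), n_words - 1))
    h, w = q.shape
    docs = []
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            docs.append(np.bincount(q[i:i + patch, j:j + patch].ravel(),
                                    minlength=n_words))
    return np.asarray(docs)

rng = np.random.default_rng(0)
band = rng.random((256, 256))               # stand-in for one Sentinel-2 band
X = patches_to_bow(band)

lda = LatentDirichletAllocation(n_components=5, random_state=0)
theta = lda.fit_transform(X)                # per-patch topic (semantic class) mixture
classes = theta.argmax(axis=1)              # hard semantic label per patch
uncertainty = -(theta * np.log(theta + 1e-12)).sum(axis=1)  # topic entropy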

The methodological approach is as follows:

1) Acquire pre-fire and post-fire Sentinel-2 images.
2) Compute three indices based on state-of-the-art literature: the Normalized Difference Vegetation Index (NDVI), the Normalized Burn Ratio (NBR), and the Burned Area Index for Sentinel-2 (BAIS2), and generate index maps.
3) Compute the difference between the pre-fire and post-fire index maps (a sketch of steps 2 and 3 appears after this list).
4) Apply the unsupervised xAI LDA model to retrieve semantic classes in the pre-fire and post-fire Sentinel-2 band images, generate the corresponding classification maps, and plot a binary class-to-class change map.
5) Analyze the maps with a visual tool to find the most affected semantic classes (e.g., dense vegetation, urban areas) and produce a data-driven estimation of per-class changes due to fire [7].
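A minimal sketch of steps 2 and 3, assuming resampled Sentinel-2 reflectance bands as NumPy arrays (random stand-ins here). NDVI and NBR follow their standard definitions; the BAIS2 formula follows Filipponi (2018).

import numpy as np

def ndvi(b04, b08):
    # (NIR - Red) / (NIR + Red)
    return (b08 - b04) / (b08 + b04 + 1e-12)

def nbr(b08, b12):
    # (NIR - SWIR) / (NIR + SWIR)
    return (b08 - b12) / (b08 + b12 + 1e-12)

def bais2(b04, b06, b07, b8a, b12):
    # Burned Area Index for Sentinel-2 (Filipponi, 2018)
    return (1 - np.sqrt((b06 * b07 * b8a) / (b04 + 1e-12))) * \
           ((b12 - b8a) / np.sqrt(b12 + b8a + 1e-12) + 1)

rng = np.random.default_rng(1)
names = ("b04", "b06", "b07", "b08", "b8a", "b12")
pre = {b: rng.random((512, 512)) for b in names}    # pre-fire scene
post = {b: rng.random((512, 512)) for b in names}   # post-fire scene

# Step 3: index difference maps; dNBR is a common burn-severity indicator,
# and BAIS2 increases over burned areas, hence the post-minus-pre sign.
d_nbr = nbr(pre["b08"], pre["b12"]) - nbr(post["b08"], post["b12"])
d_ndvi = ndvi(pre["b04"], pre["b08"]) - ndvi(post["b04"], post["b08"])
d_bais2 = bais2(post["b04"], post["b06"], post["b07"], post["b8a"], post["b12"]) - \
          bais2(pre["b04"], pre["b06"], pre["b07"], pre["b8a"], pre["b12"])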

In the future, we plan to fuse additional data sources (e.g., wind speed information [5]) to support practical applications.

References:

[1] R. Lasaponara, A. M. Proto, A. Aromando, G. Cardettini, V. Varela and M. Danese, "On the Mapping of Burned Areas and Burn Severity Using Self Organizing Map and Sentinel-2 Data," in IEEE Geoscience and Remote Sensing Letters, vol. 17, no. 5, pp. 854-858, May 2020, doi: 10.1109/LGRS.2019.2934503.

[2] C. Karmakar, O. Dumitru, G. Schwarz and M. Datcu, "Feature-Free Explainable Data Mining in SAR Images Using Latent Dirichlet Allocation," in IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 14, pp. 676-689, 2021, doi: 10.1109/JSTARS.2020.3039012.

[3] The New York Times, "California Wildfires Live Updates: 24 Dead in L.A. as Dangerous Winds Threaten Fire Growth."

[4] Sentinel-2 mission. Available online: https://sentinel.esa.int/web/sentinel/missions/sentinel-2

[5] Global Wind Atlas. Available online: https://globalwindatlas.info/en/about/dataset

[6] ESA news based on Sentinel-2. Available online: https://www.esa.int/ESA_Multimedia/Missions/Sentinel-2/(offset)/100/(sortBy)/published/(result_type)/images

[7] C. Karmakar, O. Dumitru, N. Hughes and M. Datcu, "A Visualization Framework for Unsupervised Analysis of Latent Structures in SAR Image Time Series," in IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 16, pp. 5355-5373, 2023.

How to cite: Dumitru, O., Karmakar, C., and Goyal, S.: Explainable Machine Learning for Forest Fire Detection with Remote Sensing for Effective Rescue Planning, EGU General Assembly 2025, Vienna, Austria, 27 Apr–2 May 2025, EGU25-16843, https://doi.org/10.5194/egusphere-egu25-16843, 2025.