EGU24-19330, updated on 11 Mar 2024
https://doi.org/10.5194/egusphere-egu24-19330
EGU General Assembly 2024
© Author(s) 2024. This work is distributed under
the Creative Commons Attribution 4.0 License.

Burned Area Mapping with Sentinel-2 based on reflectance modelling and deep learning – preliminary global calibration and validation

Marc Padilla1, Ruben Ramo1, Sergio Sierra1, Bernardo Mota2, Roselyne Lacaze3, and Kevin Tansey4
  • 1COMPLUTIG, Alcalá de Henares, Spain (marc.padilla@complutig.com)
  • 2National Physical Laboratory, UK
  • 3HYGEOS
  • 4University of Leicester

Current global burned area products are available only at coarse spatial resolutions (300-500 m), which leads to large errors and hinders an accurate understanding of fire-related processes. This study proposes a global calibration method for a sensor-independent burned area algorithm, previously applied to 300 m Sentinel-3 Synergy data and here implemented with 20 m Sentinel-2 MSI imagery. A binomial model that combines reflectance-based burned area predictions, constrained by spatio-temporal densities derived from VIIRS active fires, is calibrated using a reference dataset generated from Landsat imagery at a sample of 34 units across the globe. Preliminary leave-one-out cross-validation analyses show promisingly high accuracies (Dice coefficient of 84.8%, commission error ratio of 13.2%, omission error ratio of 17.1% and relative bias of -4.5%), especially given the mismatch of acquisition dates between the reference and the algorithm input data, which introduces apparent errors into the validation results.
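For readers unfamiliar with the reported validation metrics, the sketch below shows how the Dice coefficient, commission and omission error ratios, and relative bias are conventionally computed from the pixel counts of a binary burned/unburned confusion matrix. This is an illustrative sketch with made-up counts, not the authors' code; the function name and example values are hypothetical.

```python
def burned_area_metrics(tp: float, fp: float, fn: float) -> dict:
    """Standard burned-area accuracy metrics from confusion-matrix counts.

    tp: pixels burned in both the map and the reference (true positives)
    fp: pixels mapped as burned but unburned in the reference (commission)
    fn: pixels burned in the reference but missed by the map (omission)
    """
    detected = tp + fp   # total area mapped as burned
    reference = tp + fn  # total area burned according to the reference
    return {
        "dice": 2 * tp / (2 * tp + fp + fn),        # Dice coefficient
        "commission_error": fp / detected,          # Ce = FP / (TP + FP)
        "omission_error": fn / reference,           # Oe = FN / (TP + FN)
        "relative_bias": (detected - reference) / reference,
    }

# Hypothetical pixel counts, chosen only to illustrate the formulas:
m = burned_area_metrics(tp=800, fp=120, fn=160)
```

A negative relative bias, as reported in the abstract, indicates underestimation: the mapped burned area is smaller than the reference burned area.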

How to cite: Padilla, M., Ramo, R., Sierra, S., Mota, B., Lacaze, R., and Tansey, K.: Burned Area Mapping with Sentinel-2 based on reflectance modelling and deep learning – preliminary global calibration and validation, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-19330, https://doi.org/10.5194/egusphere-egu24-19330, 2024.