EGU26-13146, updated on 14 Mar 2026
https://doi.org/10.5194/egusphere-egu26-13146
EGU General Assembly 2026
© Author(s) 2026. This work is distributed under
the Creative Commons Attribution 4.0 License.
Oral | Tuesday, 05 May, 14:00–14:10 (CEST)
 
Room N1
Global near-real time burned area mapping with Sentinel-2 based on reflectance modelling and deep learning
Marc Padilla1, Ruben Ramo2, Jose Luis Gomez-Dans3, Sergio Sierra4, Bernardo Mota5, Roselyne Lacaze6, and Kevin Tansey7
  • 1COMPLUTIG, Alcalá de Henares, Spain (marc.padilla@complutig.com)
  • 2Indra Espacio, Alcobendas, Spain (rramo@indra.es)
  • 3King's College London, London, UK (jose.gomez-dans@kcl.ac.uk)
  • 4COMPLUTIG, Alcalá de Henares, Spain & Universidad de Cantabria, Santander, Spain (sergio.sierra@unican.es)
  • 5National Physical Laboratory, Teddington, UK (bernardo.mota@npl.co.uk)
  • 6HYGEOS, Lille, France (rl@hygeos.com)
  • 7University of Leicester, Leicester, UK (kjt7@leicester.ac.uk)

Global burned area (BA) products are commonly available on a Non-Time Critical (NTC) basis, several months or even years after acquisition; i.e. they are unavailable for Near-Real Time (NRT) applications. The Copernicus Land Monitoring Service (CLMS) has recently begun delivering the only global NRT BA product, with very high accuracy, comparable to the most accurate non-CLMS NTC product (FIRECCIS311). However, global BA products are generated from coarse (>= 300 m) reflectance observations. Despite the Sentinel-2 mission having been in operation since 2017, providing decametric-resolution (10–50 m) reflectance data every ~5 days, and despite the well-known benefits of decametric data for estimating BA, no global Sentinel-2 NRT BA algorithm exists. The purpose of this study is to adapt and apply the latest developments in NRT detection, as implemented in the CLMS, to Sentinel-2 L2A imagery. The mapping method uses a neural network (NN) with 2D convolutional layers followed by a Long Short-Term Memory (LSTM) layer. The NN processes the time series of reflectance images on a per-pixel basis, with the convolutional layers applied along the spectral and temporal dimensions. The time series of fractional BA maps predicted by the NN is combined with a time series of the spatio-temporal density of VIIRS active fire detections. This combination consists of a logistic model and reduces false positives (such as cloud shadows). The NN is trained on a sample dataset automatically generated from time series of reflectance observations (Sentinel-2 data in this case), extracted at locations of VIIRS active fire detections across the globe for the year 2020, together with corresponding estimates of fractional BA derived from physically based radiative transfer modelling.
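The per-pixel architecture described above can be sketched as follows. This is a minimal illustration in PyTorch, not the authors' implementation: the layer sizes, band count, and the logistic coefficients in `combine_with_active_fires` are all placeholders chosen for the example.

```python
import torch
import torch.nn as nn

class BurnedAreaNet(nn.Module):
    """Per-pixel fractional BA model: 2D convolutions over the
    (time, band) plane, then an LSTM along the temporal axis.
    Layer sizes are illustrative only."""
    def __init__(self, n_bands=10, hidden=64):
        super().__init__()
        # Treat each pixel's time series as a one-channel 2D array of
        # shape (time, bands); convolve along both dimensions.
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.lstm = nn.LSTM(32 * n_bands, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)  # fractional BA in [0, 1]

    def forward(self, x):                  # x: (batch, time, bands)
        b, t, _ = x.shape
        f = self.conv(x.unsqueeze(1))      # (b, 32, time, bands)
        f = f.permute(0, 2, 1, 3).reshape(b, t, -1)
        h, _ = self.lstm(f)                # (b, time, hidden)
        return torch.sigmoid(self.head(h)).squeeze(-1)  # (b, time)

def combine_with_active_fires(fba, af_density, b0=-4.0, b1=6.0, b2=2.5):
    """Logistic fusion of NN fractional BA with VIIRS active-fire
    spatio-temporal density; coefficients b0..b2 are placeholders,
    not fitted values from the study."""
    return torch.sigmoid(b0 + b1 * fba + b2 * af_density)

# 4 pixels, 20 acquisitions, 10 spectral bands
x = torch.rand(4, 20, 10)
fba = BurnedAreaNet()(x)
p_burned = combine_with_active_fires(fba, torch.zeros(4, 20))
```

The fusion step mirrors the abstract's intent: a pixel with a high fractional-BA prediction but no nearby active fires receives a lower burned probability, suppressing cloud-shadow false positives.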
The mapping method generates one BA map for each new Sentinel-2 image available (referred to as BAS2nrt0), which is updated with images from the following 5 days (BAS2nrt5) and the following 10 days (BAS2nrt10). The additional images available after the mapping day are expected to reduce false positives due to cloud shadows. The method also generates an NTC BA map for each calendar month (BAS2ntc), using images within a 45-day buffer around the month. The algorithm results are validated against an independent global reference dataset for the year 2019, which includes long time series of Landsat-derived BA maps covering 105 sampling units distributed across the globe. The 2019 validation results show that the accuracy of the proposed Sentinel-2 products is high regardless of estimation timeliness. As expected, (1) the accuracy of the NTC product (Dice coefficient, DC, of 87.2%) is higher than that of the NRT products (DC 82.7–85.4%), and (2) the accuracy of the NRT product increases with each update. These accuracy levels are remarkably high: the accuracy of the NRT estimates is comparable to that of a previous global non-CLMS NTC Sentinel-2 BA mapping (DC 81.8%).
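The Dice coefficient reported above measures the overlap between a predicted burned-area mask and a reference mask; a minimal illustration (the toy masks are invented for the example):

```python
import numpy as np

def dice_coefficient(pred, ref):
    """Dice coefficient between two binary burned/unburned masks:
    DC = 2*TP / (2*TP + FP + FN), i.e. the harmonic mean of
    commission and omission accuracy."""
    pred, ref = pred.astype(bool), ref.astype(bool)
    tp = np.logical_and(pred, ref).sum()
    fp = np.logical_and(pred, ~ref).sum()
    fn = np.logical_and(~pred, ref).sum()
    return 2 * tp / (2 * tp + fp + fn)

# Toy 2x3 masks: 2 pixels agree burned, 1 false positive, 1 omission
pred = np.array([[1, 1, 0], [0, 1, 0]])
ref  = np.array([[1, 0, 0], [0, 1, 1]])
print(dice_coefficient(pred, ref))  # 2*2 / (2*2 + 1 + 1) ≈ 0.667
```

A DC of 100% would mean perfect agreement with the Landsat-derived reference maps; the reported 82.7–87.2% values indicate substantial but imperfect overlap.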

How to cite: Padilla, M., Ramo, R., Gomez-Dans, J. L., Sierra, S., Mota, B., Lacaze, R., and Tansey, K.: Global near-real time burned area mapping with Sentinel-2 based on reflectance modelling and deep learning, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-13146, https://doi.org/10.5194/egusphere-egu26-13146, 2026.