EGU26-7860, updated on 14 Mar 2026
https://doi.org/10.5194/egusphere-egu26-7860
EGU General Assembly 2026
© Author(s) 2026. This work is distributed under
the Creative Commons Attribution 4.0 License.
Poster | Friday, 08 May, 14:00–15:45 (CEST), Display time Friday, 08 May, 14:00–18:00
 
Hall X4, X4.135
Densification and forecasting of Sentinel-2 time series from multimodal SAR and optical satellite data using deep generative models
Véronique Defonte1, Dawa Derksen2, Alexandre Constantin2, and Bastien Nespoulous1
  • 1Thales Service Numeriques, Toulouse, France (veronique.defonte@thalesgroup.com)
  • 2Centre National d'Etudes Spatiales, Toulouse, France

Sentinel-2 optical image time series are a key source of information for many Earth observation applications, including climate monitoring, agriculture, ecosystem dynamics, and land surface change analysis. Dense and regular observations are essential to accurately capture seasonal patterns, abrupt events, and long-term trends. However, in practice, Sentinel-2 time series are often sparse and irregular due to cloud cover and varying acquisition conditions. These limitations significantly complicate continuous monitoring and the analysis of surface dynamics. Moreover, beyond time series densification, there is a growing need to anticipate future optical observations to support scenario analysis, early warning systems, and predictive environmental monitoring.

To address these challenges, we propose a deep learning–based framework for densifying Sentinel-2 time series by generating plausible optical images at arbitrary past or future dates. The approach relies on multimodal satellite observations, jointly exploiting optical Sentinel-2 and radar Sentinel-1 data. Indeed, SAR measurements are insensitive to cloud cover and provide complementary structural and temporal information. This multimodal setting enables the reconstruction of missing observations and the prediction of future optical states while preserving realistic spatio-temporal dynamics.

From a methodological perspective, the model is explicitly designed to handle sparse, incomplete, and temporally misaligned multimodal time series. It operates on temporal sets of Sentinel-2 and Sentinel-1 images acquired at irregular dates around a target time. A cross-attention mechanism is used to explicitly model interactions across time and modalities, allowing the network to identify and weight the most relevant observations for generating a Sentinel-2 image at a given target date.
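The abstract does not specify the exact architecture, but the core idea, scaled dot-product cross-attention in which a target-date query attends over an irregular set of Sentinel-1/Sentinel-2 acquisition embeddings, can be sketched minimally as follows (a single-head, NumPy-only illustration; all names and dimensions are assumptions):

```python
import numpy as np

def cross_attention(query, keys, values):
    """Single-head scaled dot-product cross-attention.

    query:  (d,)    embedding of the target date
    keys:   (n, d)  embeddings of the n available S1/S2 acquisitions
    values: (n, d)  feature vectors of those acquisitions
    Returns the attended feature vector (d,) and the attention weights (n,),
    which indicate how strongly each acquisition contributes to the output.
    """
    d = query.shape[-1]
    scores = keys @ query / np.sqrt(d)      # relevance of each acquisition
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                # softmax over the temporal set
    return weights @ values, weights

# Toy example: 4 irregularly dated acquisitions with 8-dim embeddings
rng = np.random.default_rng(0)
q = rng.normal(size=8)
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out, w = cross_attention(q, K, V)
```

Because the softmax is taken over whatever acquisitions happen to be available, the mechanism naturally accommodates sparse and temporally misaligned inputs: the set size `n` can vary from sample to sample without changing the model.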

In addition, the proposed framework incorporates a probabilistic decoder that estimates not only the predicted Sentinel-2 image but also an associated uncertainty map. This uncertainty estimation provides valuable insight into the confidence of the generated pixels, which is particularly important for downstream applications such as anomaly detection, risk assessment, and decision-making support.
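A common way to obtain such a per-pixel uncertainty map (the abstract does not state the authors' exact formulation) is to have the decoder predict a mean and a log-variance per pixel and train it with a Gaussian negative log-likelihood. A minimal sketch under that assumption:

```python
import numpy as np

def gaussian_nll(y, mu, log_var):
    """Per-pixel negative log-likelihood of y under N(mu, exp(log_var)),
    up to an additive constant.

    Minimising this trains the decoder to output both a predicted image
    (mu) and a pixel-wise uncertainty map (exp(log_var)): large errors are
    penalised less where the model declares high variance, at the cost of
    the log_var term, so uncertainty is learned rather than hand-tuned.
    """
    return 0.5 * (np.exp(-log_var) * (y - mu) ** 2 + log_var)

# Toy 2x2 "image": one wrong pixel, unit variance everywhere
y  = np.array([[0.4, 0.6], [0.5, 0.5]])
mu = np.array([[0.4, 0.9], [0.5, 0.5]])
lv = np.zeros_like(y)
nll = gaussian_nll(y, mu, lv)
```

In this setup, `exp(log_var)` is the uncertainty map handed to downstream consumers such as anomaly detection or risk assessment.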

The model is evaluated across multiple geographical regions and land-cover types, demonstrating strong performance in both densification and forecasting tasks. Results show that the proposed approach successfully preserves the temporal dynamics of the scenes, notably by accurately reproducing vegetation phenology as reflected in NDVI time series. Forecasting experiments further highlight the importance of radar information: Sentinel-1 observations close to the target date allow the model to detect surface changes occurring after the last available optical image, thereby improving future predictions. Overall, the proposed method represents a step towards the densification and forecasting of Sentinel-2 time series, offering a promising direction for future methodologies aimed at continuous Earth surface monitoring and predictive analysis.
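The NDVI series used above to assess vegetation phenology is computed from Sentinel-2's red (B4) and near-infrared (B8) reflectances; a small epsilon in the denominator, an implementation convenience not part of the index's definition, guards against division by zero:

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """NDVI = (NIR - Red) / (NIR + Red).

    For Sentinel-2, NIR is band B8 and Red is band B4 (surface
    reflectance in [0, 1]). Values near 1 indicate dense vegetation;
    values near 0 indicate bare soil or senescent cover.
    """
    return (nir - red) / (nir + red + eps)

# Three pixels: dense vegetation, sparse vegetation, bare soil
nir = np.array([0.45, 0.30, 0.08])
red = np.array([0.05, 0.10, 0.07])
v = ndvi(nir, red)
```

Tracking `v` per pixel across the densified time series yields the seasonal phenology curves the evaluation refers to.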

How to cite: Defonte, V., Derksen, D., Constantin, A., and Nespoulous, B.: Densification and forecasting of Sentinel-2 time series from multimodal SAR and optical satellite data using deep generative models, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-7860, https://doi.org/10.5194/egusphere-egu26-7860, 2026.