- 1 KU Eichstätt-Ingolstadt, Mathematical Institute for Machine Learning and Data Science, Faculty of Mathematics and Geography, Ingolstadt, Germany (tijana.janjic@ku.de)
- 2 Institute of Informatics, LMU Munich
Quantifying the evolution of uncertainty is critical to both probabilistic forecasting and data assimilation (DA) in numerical weather prediction (NWP). In this study, we investigate the applicability of conformal prediction (CP), a recent machine learning method, to quantify uncertainty in a controlled, idealized setting. We use a toy model designed to mimic convective processes. CP provides a set of possible outcomes at a chosen confidence level. Here, we compare and evaluate the average empirical coverage, the average interval length, and the average interval score loss (AISL) for three variants of CP: (a) standard CP, (b) normalized CP, and (c) conformalized quantile regression. We also combine DA with the CP estimates of uncertainty and quantify the statistical significance of the resulting improvement. Our results highlight the strengths and limitations of each approach, providing insights into how effectively CP can complement ensemble-based uncertainty quantification in simplified atmospheric models.
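To make the evaluation concrete, the sketch below illustrates standard (split) conformal prediction for a regression task, together with the three metrics named above: empirical coverage, average interval length, and the average interval score. The synthetic data and the point predictor are hypothetical stand-ins, not the toy convective model used in the study; the normalized CP and CQR variants are not shown.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic regression data (stand-in for the toy model)
def make_data(n):
    x = rng.uniform(0.0, 10.0, n)
    y = np.sin(x) + 0.3 * rng.standard_normal(n)
    return x, y

x_cal, y_cal = make_data(500)    # calibration set
x_test, y_test = make_data(500)  # test set

predict = np.sin   # an already-fitted point predictor (here: the true mean)
alpha = 0.1        # target miscoverage -> 90% prediction intervals

# Standard (split) CP: nonconformity scores are absolute residuals
# on the calibration set; take the finite-sample-corrected quantile.
scores = np.abs(y_cal - predict(x_cal))
n = len(scores)
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Fixed-width prediction intervals on the test set
lo = predict(x_test) - q
hi = predict(x_test) + q

# Empirical coverage and average interval length
coverage = np.mean((y_test >= lo) & (y_test <= hi))
avg_len = np.mean(hi - lo)

# Interval score: interval length plus a 2/alpha penalty for each miss;
# its average over the test set is the AISL-style summary.
interval_score = (
    (hi - lo)
    + (2.0 / alpha) * (lo - y_test) * (y_test < lo)
    + (2.0 / alpha) * (y_test - hi) * (y_test > hi)
)
aisl = np.mean(interval_score)
```

By the standard split-CP guarantee, the expected coverage of these intervals is at least 1 − alpha, so the empirical coverage on a reasonably sized test set should sit near 90% here; normalized CP and CQR replace the fixed-width interval with locally adaptive ones, which is what the length and AISL comparisons in the study probe.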
How to cite: George, C., Janjic, T., Javanmardi, A., and Hüllermeier, E.: Uncertainty quantification via conformal prediction in data assimilation, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-7371, https://doi.org/10.5194/egusphere-egu26-7371, 2026.