EGU25-7205, updated on 14 Mar 2025
https://doi.org/10.5194/egusphere-egu25-7205
EGU General Assembly 2025
© Author(s) 2025. This work is distributed under
the Creative Commons Attribution 4.0 License.
Oral | Wednesday, 30 Apr, 15:09–15:19 (CEST) | Room B
Evaluating uncertainty in probabilistic deep learning models using Information Theory
Manuel Alvarez Chaves1, Hoshin Gupta2, Uwe Ehret3, and Anneli Guthke1
  • 1Stuttgart Center for Simulation Science, University of Stuttgart, Stuttgart, Germany
  • 2Hydrology and Atmospheric Sciences, The University of Arizona, Tucson, USA
  • 3Institute of Water Resources and River Basin Management, Karlsruhe Institute of Technology, Karlsruhe, Germany

Deep learning methods in hydrology have traditionally focused on deterministic models, limiting their ability to quantify prediction uncertainty. Recent advances in generative modeling have opened new possibilities for probabilistic modeling in various applied fields, including hydrological forecasting (Jahangir & Quilty, 2024). These models learn to represent underlying probability distributions using neural networks, enabling uncertainty quantification through sampling within a highly flexible framework.

In this submission, we introduce vLSTM, a variational extension of the traditional long short-term memory (LSTM) architecture that quantifies predictive uncertainty by perturbing the model’s hidden state with noise sampled from a learned multivariate Gaussian distribution. The vLSTM preserves the traditional LSTM’s state-space dynamics while introducing a probabilistic component that enables uncertainty quantification through sampling. Unlike mixture density networks (MDNs), which directly model the distribution of the target variable, vLSTM obtains its uncertainty through perturbations of the hidden state, providing a novel approach to probabilistic prediction. In rainfall-runoff modeling, vLSTM thus offers a mechanism for uncertainty quantification different from the well-established MDN models (Klotz et al., 2022). This approach enriches the existing toolkit of uncertainty methods in deep learning while retaining the simplicity of sampling for probabilistic predictions.
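The perturbation mechanism can be sketched as follows. This is a minimal, scalar-state illustration under assumed names and untrained random weights; the actual vLSTM operates on a high-dimensional hidden state with a learned multivariate Gaussian, and the readout and parameter names here are purely illustrative:

```python
import math
import random
import statistics

def lstm_cell(x, h, c, W):
    """One step of a standard LSTM cell (scalar state, for brevity)."""
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))
    i = sig(W["wi"] * x + W["ui"] * h + W["bi"])        # input gate
    f = sig(W["wf"] * x + W["uf"] * h + W["bf"])        # forget gate
    o = sig(W["wo"] * x + W["uo"] * h + W["bo"])        # output gate
    g = math.tanh(W["wg"] * x + W["ug"] * h + W["bg"])  # candidate state
    c_new = f * c + i * g
    h_new = o * math.tanh(c_new)
    return h_new, c_new

def vlstm_predict(xs, W, mu, logvar, n_samples=200):
    """Run the deterministic LSTM over the input sequence, then draw an
    ensemble of predictions by perturbing the final hidden state with
    noise from a learned Gaussian N(mu, exp(logvar)). The ensemble
    spread is the model's predictive uncertainty."""
    h, c = 0.0, 0.0
    for x in xs:
        h, c = lstm_cell(x, h, c, W)
    preds = []
    for _ in range(n_samples):
        eps = random.gauss(0.0, 1.0)
        h_tilde = h + mu + math.exp(0.5 * logvar) * eps  # reparameterization
        preds.append(W["wy"] * h_tilde + W["by"])        # linear readout
    return preds

# Illustrative (untrained) parameters; in practice mu and logvar are learned.
random.seed(42)
names = ["wi", "ui", "bi", "wf", "uf", "bf",
         "wo", "uo", "bo", "wg", "ug", "bg", "wy", "by"]
W = {name: random.gauss(0.0, 0.5) for name in names}
preds = vlstm_predict([0.1, 0.4, -0.2, 0.3], W, mu=0.0, logvar=-2.0)
print(f"ensemble mean = {statistics.mean(preds):.3f}, "
      f"spread = {statistics.pstdev(preds):.3f}")
```

Because the noise enters through the hidden state rather than through an explicit output distribution (as in an MDN), the spread of the readout ensemble inherits the LSTM's state-space dynamics.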

To rigorously evaluate probabilistic predictions across different model architectures, we develop new information-theoretic metrics that capture key aspects of how a particular model handles uncertainty. These include the average prediction entropy H(X), which quantifies model confidence, and the average relative entropy DKL(p‖q), which measures how far a model’s predictive distribution diverges from a target distribution, among others. The proposed metrics take advantage of non-parametric estimators of information-theoretic quantities, implemented in the easy-to-use UNITE toolbox (https://github.com/manuel-alvarez-chaves/unite_toolbox). Expressing all metrics in the compatible units of bits (or nats) enables direct comparisons between different uncertainty measures. We apply these metrics to the newly introduced vLSTM and to existing MDN models to show the strengths and weaknesses of each approach. This information-theoretic framework provides a unified language for analyzing and understanding predictive uncertainty in probabilistic models.
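A toy illustration of how such metrics can be estimated from samples and expressed in common units of bits; the simple histogram (plug-in) estimator and all names below are assumptions for illustration only, not the non-parametric estimators actually implemented in the UNITE toolbox:

```python
import bisect
import math
import random

def histogram_probs(samples, edges):
    """Bin samples into probability masses over fixed bin edges
    (samples outside the edges are dropped)."""
    counts = [0] * (len(edges) - 1)
    for x in samples:
        i = bisect.bisect_right(edges, x) - 1
        if 0 <= i < len(counts):
            counts[i] += 1
    return [c / len(samples) for c in counts]

def entropy_bits(p):
    """Discrete Shannon entropy H(X) in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_bits(p, q):
    """Plug-in estimate of the relative entropy D_KL(p||q) in bits;
    bins where q is empty are skipped as a crude finite-sample guard."""
    return sum(pi * math.log2(pi / qi)
               for pi, qi in zip(p, q) if pi > 0 and qi > 0)

random.seed(0)
# Stand-ins for samples from a model's predictive distribution (p)
# and from the target distribution (q).
pred = [random.gauss(0.0, 1.0) for _ in range(10_000)]
target = [random.gauss(0.5, 1.0) for _ in range(10_000)]

edges = [-5.0 + 0.5 * i for i in range(21)]  # 20 bins of width 0.5 on [-5, 5]
p = histogram_probs(pred, edges)
q = histogram_probs(target, edges)

# Both metrics land in the same units (bits), so a model's confidence
# H(p) and its misalignment with the target DKL(p||q) are comparable.
print(f"H(p) = {entropy_bits(p):.2f} bits, DKL(p||q) = {kl_bits(p, q):.2f} bits")
```

A confident model has low H(X); a well-calibrated one has low DKL(p‖q); the two can be traded off, which is why reporting both in the same units is informative.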

References

  • Jahangir, M. S., & Quilty, J. (2024). Generative deep learning for probabilistic streamflow forecasting: Conditional variational auto-encoder. Journal of Hydrology, 629, 130498. https://doi.org/10.1016/j.jhydrol.2023.130498
  • Klotz, D., Kratzert, F., Gauch, M., Keefe Sampson, A., Brandstetter, J., Klambauer, G., Hochreiter, S., & Nearing, G. (2022). Uncertainty estimation with deep learning for rainfall–runoff modeling. Hydrology and Earth System Sciences, 26(6), 1673–1693. https://doi.org/10.5194/hess-26-1673-2022

How to cite: Alvarez Chaves, M., Gupta, H., Ehret, U., and Guthke, A.: Evaluating uncertainty in probabilistic deep learning models using Information Theory, EGU General Assembly 2025, Vienna, Austria, 27 Apr–2 May 2025, EGU25-7205, https://doi.org/10.5194/egusphere-egu25-7205, 2025.