- 1Department of Geodesy and Geoinformation, Technische Universität Wien, Vienna, Austria (alexander.gruber@geo.tuwien.ac.at)
- 2Department of Meteorology, University of Reading, Reading, UK
- 3National Centre for Earth Observation (NCEO), University of Reading, Reading, UK
- 4European Centre for Medium-Range Weather Forecasts (ECMWF), Reading, UK
- 5School of Physics and Astronomy, University of Leicester, Leicester, UK
- 6NCEO, University of Leicester, Leicester, UK
It is well known that scientific data have uncertainties and that it is crucial to take these uncertainties into account in any decision-making process. Nevertheless, despite data producers' best efforts to provide complete and rigorous uncertainty estimates alongside their data, users commonly struggle to make sense of uncertainty information. This is because uncertainties are usually expressed as the statistical spread in the observations (for example, as a random error standard deviation), which does not relate to the intended use of the data.
Put simply, data and their uncertainty are usually expressed as something like "x plus/minus y", which does not answer the really important question: how much can I trust "x", or any use of or decision based upon "x"? Consequently, uncertainties are often either ignored altogether and the data taken at face value, or interpreted heuristically by experts (or non-experts) to arrive at rather subjective, qualitative judgements of the confidence they can have in the data.
In line with existing practices (e.g., the communication of uncertainties in the IPCC reports), we conjecture that the key to enabling users to make sense of uncertainties is to represent them as the confidence one can have in whatever event one is interested in, given the available data and their uncertainty.
To that end, we propose a novel, generic framework that transforms common uncertainty representations (i.e., estimates of stochastic data properties, such as "the state of this variable is x plus/minus y") into more meaningful, actionable information that actually relates to the intended use of the data (i.e., statements such as "the data and their uncertainties suggest that we can be z % confident that…"). This is done by first formulating a meaningful question that links the available data to some events of interest, and then deriving quantitative estimates for the confidence in the occurrence of these events using Bayes' theorem.
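As an illustration of the idea (not the authors' implementation), the simplest instance of such a transformation arises when the observation error is Gaussian and a flat prior is assumed: the posterior for the true state is then N(x, y²), and the confidence that the true state lies below some event threshold t is the Gaussian cumulative probability Φ((t − x)/y). A minimal sketch, with a hypothetical soil-moisture drought threshold chosen purely for illustration:

```python
import math

def confidence_below(obs, err_std, threshold):
    """Confidence (posterior probability) that the true value lies below
    `threshold`, given an observation `obs` with Gaussian random error of
    standard deviation `err_std` and a flat prior. The posterior is then
    N(obs, err_std**2), so the answer is the Gaussian CDF at `threshold`."""
    z = (threshold - obs) / err_std
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical example: a soil-moisture retrieval of 0.15 m3/m3 with an
# uncertainty of 0.04 m3/m3, and a drought threshold of 0.20 m3/m3
# (all numbers invented for illustration).
conf = confidence_below(0.15, 0.04, 0.20)
print(f"Confidence of drought conditions: {conf:.0%}")
```

Here the "x plus/minus y" statement (0.15 ± 0.04) becomes an actionable statement of roughly 89 % confidence that soil moisture is below the threshold; informative priors or non-Gaussian errors would require the full Bayes' theorem machinery rather than this closed-form shortcut.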
We demonstrate this framework using two case examples: (i) using satellite soil moisture retrievals and their uncertainty to determine how confident one can be in the presence and severity of a drought; and (ii) using ocean temperature analyses and their uncertainty to determine how confident one can be that prevailing conditions are likely to cause coral bleaching.
How to cite: Gruber, A., Bulgin, C., Dorigo, W., Emburry, O., Formanek, M., Merchant, C., Mittaz, J., Muñoz-Sabater, J., Pöppl, F., Povey, A., and Wagner, W.: Making Sense of Uncertainties: Ask the Right Question, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-7755, https://doi.org/10.5194/egusphere-egu26-7755, 2026.