EGU General Assembly 2022
© Author(s) 2022. This work is distributed under
the Creative Commons Attribution 4.0 License.

climpred: weather and climate forecast verification in python

Aaron Spring
  • Max Planck Institute for Meteorology, Hamburg, Germany

Predicting subseasonal to seasonal weather and climate yields numerous benefits for economic and environmental decision-making.
Forecasters verify the forecast quality of models by initializing large sets of retrospective forecasts (hindcasts) that predict past variations and phenomena.

Quantifying prediction skill for multi-dimensional geospatial model output is computationally expensive and a difficult coding challenge. The large datasets require parallel and out-of-memory computing to be analyzed efficiently. Furthermore, aligning the many forecast initializations with differing observational products is a straightforward but exhausting and error-prone exercise for researchers.

To simplify and standardize forecast verification across scales from hourly weather to decadal climate forecasts, we built climpred: a python package for computationally efficient and methodologically consistent verification of ensemble prediction models. We rely on the python software ecosystem developed by the open Pangeo geoscience community. We leverage NetCDF metadata using xarray and out-of-core computation parallelized with dask to scale analyses from a laptop to a supercomputer.
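To illustrate the out-of-core workflow, here is a minimal sketch with synthetic data (the sst variable name and array sizes are made up for illustration; the init/lead/member dimension names follow climpred's convention):

```python
import numpy as np
import xarray as xr

# hypothetical hindcast with climpred's init/lead/member dimensions
hind = xr.Dataset(
    {"sst": (("init", "lead", "member"), np.random.rand(12, 10, 4))},
    coords={"init": np.arange(12), "lead": np.arange(10), "member": np.arange(4)},
)

# chunking converts the in-memory arrays into lazy dask arrays,
# so downstream verification runs in parallel and out-of-core
hind = hind.chunk({"init": 4})
print(hind["sst"].chunks)
```

The same chunking applies when reading real NetCDF files, via the chunks argument of xr.open_dataset.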

With climpred, researchers can assess forecast quality with a large set of metrics (including crps, rps, rank_histogram, reliability, contingency, bias, rmse, acc, ...) in just a few lines of code:

import climpred
import xarray as xr

hind = xr.open_dataset('')  # hindcast with init, lead and member dimensions
obs = xr.open_dataset('')   # observational product
he = climpred.HindcastEnsemble(hind).add_observations(obs)
# optional bias correction:
# he = he.remove_bias(how='basic_quantile',
#                     train_test_split='unfair',
#                     alignment='same_verifs')
skill = he.verify(metric='rmse', comparison='e2o', dim='init',
                  alignment='same_verifs',
                  reference=['persistence', 'climatology'])
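Conceptually, a deterministic score such as rmse reduces the initialization dimension, leaving one value per lead time. A minimal sketch with synthetic xarray data (not climpred's internal implementation):

```python
import numpy as np
import xarray as xr

rng = np.random.default_rng(0)
# synthetic ensemble-mean forecast and verification data
fcst = xr.DataArray(rng.normal(size=(20, 5)), dims=("init", "lead"))
verif = xr.DataArray(rng.normal(size=(20, 5)), dims=("init", "lead"))

# root-mean-square error over initializations: one score per lead
rmse = np.sqrt(((fcst - verif) ** 2).mean("init"))
print(rmse.dims)  # ('lead',)
```

climpred performs this reduction lazily on dask-backed arrays and handles the init/verification alignment automatically.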

This simplified and standardized process frees up resources to tackle the large process-based unknowns in predictability research. Here, we perform a live and interactive multi-model comparison of NMME project hindcasts, removing bias with different methodologies and comparing against persistence and climatology reference forecasts.
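The persistence reference forecast mentioned above simply carries the last observed value forward to each lead time; a skillful model must beat this baseline. A self-contained sketch with a synthetic series (not climpred's internal implementation):

```python
import numpy as np

rng = np.random.default_rng(1)
obs = np.cumsum(rng.normal(size=100))  # synthetic autocorrelated series

# persistence reference: the forecast at lead k equals the value at init
lead = 3
persistence = obs[:-lead]
target = obs[lead:]
rmse_persistence = np.sqrt(np.mean((persistence - target) ** 2))
print(rmse_persistence > 0)  # True
```

In climpred, passing reference=['persistence', 'climatology'] to verify computes such baselines alongside the model skill.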



Reference paper: Brady, Riley X. and Aaron Spring (2021). "climpred: Verification of Weather and Climate Forecasts". Journal of Open Source Software 6(59), 2781.

How to cite: Spring, A.: climpred: weather and climate forecast verification in python, EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-8200, 2022.
