EGU26-18986, updated on 14 Mar 2026
https://doi.org/10.5194/egusphere-egu26-18986
EGU General Assembly 2026
© Author(s) 2026. This work is distributed under
the Creative Commons Attribution 4.0 License.
Poster | Tuesday, 05 May, 14:00–15:45 (CEST), Display time Tuesday, 05 May, 14:00–18:00
Hall X5, X5.141
Beyond the Training Data: Confidence-Guided Mixing of Parameterizations in a Hybrid AI-Climate Model
Helge Heuer1, Tom Beucler2,3, Mierk Schwabe1, Julien Savre1, Manuel Schlund1, and Veronika Eyring1,4
  • 1German Aerospace Center (DLR), Institute for Atmospheric Physics, Weßling, Germany (helge.heuer@dlr.de)
  • 2Faculty of Geosciences and Environment, University of Lausanne, Lausanne, Switzerland
  • 3Expertise Center for Climate Extremes, University of Lausanne, Lausanne, Switzerland
  • 4University of Bremen, Institute of Environmental Physics (IUP), Bremen, Germany

Persistent systematic errors in Earth system models (ESMs) arise from difficulties in representing the full diversity of subgrid, multiscale atmospheric convection and turbulence. Machine learning (ML) parameterizations trained on short high-resolution simulations show strong potential to reduce these errors. However, stable long-term atmospheric simulations with hybrid (physics + ML) ESMs remain difficult, as neural networks (NNs) trained offline often destabilize online runs. Training convection parameterizations directly on coarse-grained data is challenging, notably because scales cannot be cleanly separated. This issue is mitigated using data from superparameterized simulations, which provide clearer scale separation. Yet, transferring a parameterization from one ESM to another remains difficult due to distribution shifts that induce large inference errors. Here, we present a proof-of-concept where a ClimSim-trained, physics-informed NN convection parameterization is successfully transferred to ICON-A. The scheme is (a) trained on adjusted ClimSim data with subtracted radiative tendencies, and (b) integrated into ICON-A. The NN parameterization predicts its own error, enabling mixing with a conventional convection scheme when confidence is low, thus making the hybrid AI-physics model tunable with respect to observations and reanalysis through mixing parameters. This improves process understanding by constraining convective tendencies across column water vapor, lower-tropospheric stability, and geographical conditions, yielding interpretable regime behavior. In AMIP-style setups, several hybrid configurations outperform the default convection scheme (e.g., improved precipitation statistics). With additive input noise during training, both hybrid and pure-ML schemes lead to stable simulations and remain physically consistent for at least 20 years, demonstrating inter-ESM transferability and advancing long-term integrability.
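The confidence-guided mixing described above can be illustrated with a minimal sketch: the NN predicts a convective tendency together with its own error estimate, and the blend weight falls back toward the conventional scheme as that estimate grows. The weighting function, variable names, and the tunable `error_scale` parameter below are illustrative assumptions, not the authors' actual scheme.

```python
import numpy as np

def mix_tendencies(nn_tend, nn_pred_error, conv_tend, error_scale=1.0):
    """Blend NN and conventional convective tendencies per column/level.

    Illustrative sketch: confidence decays toward 0 as the NN's
    self-predicted error grows, so low-confidence columns fall back
    to the conventional parameterization. `error_scale` stands in for
    the tunable mixing parameters mentioned in the abstract.
    """
    confidence = np.exp(-np.abs(nn_pred_error) / error_scale)
    return confidence * nn_tend + (1.0 - confidence) * conv_tend

# Where the NN reports zero error, its tendency is used unchanged;
# where the reported error is large, the conventional tendency dominates.
high_conf = mix_tendencies(np.array([2.0]), np.array([0.0]), np.array([0.5]))
low_conf = mix_tendencies(np.array([2.0]), np.array([100.0]), np.array([0.5]))
```

Because the blend weight is a smooth function of the predicted error, the mixing parameters can be tuned against observations and reanalysis without retraining the network itself.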

How to cite: Heuer, H., Beucler, T., Schwabe, M., Savre, J., Schlund, M., and Eyring, V.: Beyond the Training Data: Confidence-Guided Mixing of Parameterizations in a Hybrid AI-Climate Model, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-18986, https://doi.org/10.5194/egusphere-egu26-18986, 2026.