EGU2020-11881
https://doi.org/10.5194/egusphere-egu2020-11881
EGU General Assembly 2020
© Author(s) 2020. This work is distributed under
the Creative Commons Attribution 4.0 License.

Centennial and millennial variability in models and data

Gerrit Lohmann, Martin Butzin, Nina Eissner, Xiaoxu Shi, and Christian Stepanek
  • Alfred Wegener Institute, Helmholtz Center for Polar & Marine Research, Climate System, Bremerhaven, Germany (gerrit.lohmann@awi.de)

The Earth’s climate is characterized by many modes of variability. On millennial timescales, decaying Northern Hemisphere ice sheets during the last deglaciation affect the high-latitude hydrological balance in the North Atlantic and thereby the ocean circulation after the Last Glacial Maximum. Global sea-level reconstructions indicate marked abrupt changes within several hundred years. Using a multi-scale climate model with high resolution near the coast, we find a strong sensitivity of the ocean circulation to where the deglacial meltwater is injected. Meltwater injections via the Mississippi and near Labrador hardly affect the AMOC. The reduced sensitivity of the overturning circulation to freshwater perturbations following the Mississippi route provides a consistent representation of the deglacial climate evolution. A freshening of the subpolar North Atlantic, mimicking the transport of water by icebergs, yields, on the other hand, a quasi-shutdown. We conclude that millennial climate variability depends on the spatio-temporal structure of the meltwater forcing and on its representation in models.
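The quasi-shutdown behaviour itself can be illustrated conceptually, independently of the multi-scale model used here, with the classical Stommel two-box model of the thermohaline circulation. The sketch below (Python, with purely illustrative nondimensional parameters) ramps a freshwater forcing slowly up and back down and exposes the abrupt collapse and the hysteresis of the overturning; it is a minimal conceptual aid, not the experimental setup of this study.

```python
import numpy as np

def stommel_overturning(mu_series, y0=0.25, dt=0.01):
    """Nondimensional Stommel two-box model: dy/dt = mu - |1 - y| * y, where y is
    the scaled salinity contrast, mu the freshwater forcing, and q = 1 - y the
    scaled overturning strength (q > 0: thermally driven 'on' state)."""
    y = y0
    q = np.empty_like(mu_series)
    for i, mu in enumerate(mu_series):
        y += dt * (mu - abs(1.0 - y) * y)
        q[i] = 1.0 - y
    return q

# Ramp the freshwater forcing slowly up past the bifurcation (mu = 0.25 in this
# scaling) and back down, to expose the quasi-shutdown and the hysteresis.
n = 200_000
mu_up = np.linspace(0.0, 0.35, n)
mu_updown = np.r_[mu_up, mu_up[::-1]]
q = stommel_overturning(mu_updown)

i_up = np.searchsorted(mu_up, 0.20)          # same forcing value on both branches
i_down = 2 * n - 1 - i_up
print(f"overturning at mu = 0.20, ramping up:   {q[i_up]:+.2f}")
print(f"overturning at mu = 0.20, ramping down: {q[i_down]:+.2f}")
```

In this toy model the overturning stays in the thermally driven state on the way up but remains collapsed at the same forcing on the way down, which is the qualitative hysteresis behaviour invoked for freshwater perturbations of the AMOC.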

Millennial DO-like variability is seen in a handful of model simulations, including even some pre-industrial simulations. In the underlying mechanism, the subsurface is warmed by the downward mixing of the warmer overlying water during an AMOC weak state, until the surface becomes denser than the water at mid-depth and deep ventilation is initiated. In recent model developments, the large oscillations in Labrador Sea mixing were reduced. However, centennial-to-millennial oscillations may be required to explain climate variability as expressed, e.g., by the Little Ice Age and the Medieval Warm Period during the last 1000 years. It could be that a de-tuning of the models is necessary for them to exhibit irregular oscillations on centennial-to-millennial timescales. Although a systematic analysis of the mechanisms leading to centennial-to-millennial variability remains open, numerical experiments suggest that, at least in the Labrador Sea and other sensitive areas, high resolution can play an important role in realistically simulating the variability of the mixed-layer depth that affects the AMOC.

One can question the regularities found in the occurrence of DO events and statistically analyze the distribution of inter-event waiting times. To estimate the statistical significance of detected event patterns, we construct a simple stochastic process in which events are triggered each time a threshold criterion is fulfilled. Within a given time interval, each event therefore occurs stochastically independently of the others, meaning that the probability of one abrupt warming does not affect the probability distribution of any other warming event in that interval. Additionally, novel periodicities of ∼900 and ∼1150 yr are reported in the NGRIP record besides the prominent 1500 yr cycle, but we demonstrate that, although a pronounced periodicity, reflected in a high Rayleigh R, can be found in the data, it remains indistinguishable from a simple stationary random Poisson process. These are quite interesting findings, as the ∼1500 and ∼900 yr periods are visible throughout the Holocene. Understanding such low-frequency variability is crucial for separating anthropogenic signals from natural variability.
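As a minimal sketch of the kind of significance test described above, the following Python snippet generates surrogate event series from a threshold-triggered stochastic process (AR(1) noise crossing a fixed threshold, whose waiting times are effectively those of a stationary Poisson process) and compares the Rayleigh R of a placeholder event record at a candidate period against this null. All numbers here (80 kyr span, ~20 events, a 1470 yr test period, the AR coefficient and threshold) are illustrative assumptions, not the parameters or event timings of the actual analysis.

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(0)

def threshold_events(n_steps, dt, threshold, ar=0.95, refractory=50):
    """Null-model sketch: an AR(1) noise series triggers an 'event' whenever it
    crosses a fixed threshold from below (with a short refractory period so
    events stay separated); waiting times are then effectively memoryless."""
    sigma = np.sqrt(1.0 - ar ** 2)             # keeps the AR(1) variance near 1
    noise = sigma * rng.standard_normal(n_steps)
    x = lfilter([1.0], [1.0, -ar], noise)      # x[n] = ar*x[n-1] + noise[n]
    up = np.flatnonzero((x[1:] > threshold) & (x[:-1] <= threshold)) + 1
    times, last = [], -np.inf
    for i in up:                               # enforce the refractory period
        if i - last > refractory:
            times.append(i * dt)
            last = i
    return np.asarray(times)

def rayleigh_r(event_times, period):
    """Rayleigh R: mean resultant length of the event phases at a test period.
    R near 1 means the events cluster at a preferred phase (periodicity)."""
    phases = 2.0 * np.pi * np.asarray(event_times) / period
    return np.abs(np.mean(np.exp(1j * phases)))

# Placeholder "observed" events: ~20 abrupt warmings over 80 kyr, the order of
# magnitude of DO events in NGRIP (not the actual event timings).
observed = np.sort(rng.uniform(0.0, 80_000.0, 20))
period = 1470.0                                # candidate cycle (years)
r_obs = rayleigh_r(observed, period)

# Monte Carlo significance: how often does the stochastic null model (tuned to
# a comparable event count) reach an equally high Rayleigh R at this period?
r_null = []
while len(r_null) < 2000:
    ev = threshold_events(n_steps=80_000, dt=1.0, threshold=2.6)
    if ev.size > 2:
        r_null.append(rayleigh_r(ev, period))
p_value = np.mean(np.asarray(r_null) >= r_obs)
print(f"Rayleigh R = {r_obs:.3f}, p = {p_value:.3f} against the stochastic null")
```

A high Rayleigh R alone is therefore not evidence of a true cycle; it has to beat the distribution of R values that the memoryless null process produces at the same test period.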

How to cite: Lohmann, G., Butzin, M., Eissner, N., Shi, X., and Stepanek, C.: Centennial and millennial variability in models and data, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-11881, https://doi.org/10.5194/egusphere-egu2020-11881, 2020.
