Displays

CL3.1

One of the big challenges in Earth system science is to provide reliable climate predictions on sub-seasonal, seasonal, decadal and longer timescales. The resulting data have the potential to be translated into climate information, leading to a better assessment of multi-scale global and regional climate-related risks.
The latest developments and progress in climate forecasting on subseasonal-to-decadal timescales will be discussed and evaluated in this session. This will include presentations and discussions of predictions for a time horizon of up to ten years from dynamical ensemble and statistical/empirical forecast systems, as well as the aspects required for their application: forecast quality assessment, multi-model combination, bias adjustment, downscaling, etc.
Following the new WCRP strategic plan for 2019-2029, prediction enhancements are solicited from contributions embracing climate forecasting from an Earth system science perspective. This includes the study of coupled processes, impacts of coupling and feedbacks, and analysis/verification of coupled atmosphere-ocean, atmosphere-land, atmosphere-hydrology, atmosphere-chemistry & aerosols, atmosphere-ice, ocean-hydrology, ocean-ice, ocean-chemistry and climate-biosphere interactions (including the human component). Contributions are also sought on initialization methods that optimally use observations from different Earth system components, on assessing and mitigating the impacts of model errors on skill, and on ensemble methods.
We also encourage contributions on the use of climate predictions for climate impact assessment, demonstrations of end-user value for climate risk applications and climate-change adaptation and the development of early warning systems.

A special focus will be put on the use of operational climate predictions (C3S, NMME, S2S), results from the CMIP5-CMIP6 decadal prediction experiments, and climate-prediction research and application projects (e.g. EUCP, APPLICATE, PREFACE, MIKLIP, MEDSCOPE, SECLI-FIRM, S2S4E).
An increasingly important aspect of climate forecast applications is the use of the most appropriate downscaling methods, based on dynamical or statistical approaches or their combination, to generate time series and fields with an appropriate spatial or temporal resolution. This is extensively considered in the session, which therefore brings together scientists from all geoscientific disciplines working on prediction and application problems.

Co-organized by NP5/OS4
Convener: Andrea Alessandri | Co-conveners: Louis-Philippe Caron, Marlis Hofer, June-Yi Lee, Xiaosong Yang
Displays | Tue, 05 May, 14:00–15:45 (CEST)


Chat time: Tuesday, 5 May 2020, 14:00–15:45

Chairperson: A. Alessandri, L-P. Caron, M. Hofer, X. Yang, J. Eden, Y. Chikamoto
D3418 |
EGU2020-6304
Enda Zhu and Xing Yuan

Terrestrial water storage (TWS), including surface water, soil water and groundwater storage, is critical for the global hydrological cycle and freshwater resources. Reliable decadal prediction of TWS can provide valuable information for the sustainable management of water resources and infrastructure in the face of climate change. Hydrological predictability mainly comes from two sources: initial conditions and boundary conditions. To date, the dependence of TWS forecast skill on the accuracy of initial hydrological conditions and decadal climate forecasts is not clear, and the benchmark skill remains unknown. In this work, we use decadal climate hindcasts from CMIP and perform hydrological ensemble simulations to estimate a baseline decadal forecast skill for TWS over major global river basins that accounts for both predictability sources, using an elasticity framework that considers the varying skill of initial conditions and climate forecasts. With the incorporation of decadal climate forecasts, our benchmark skill for TWS is significantly higher than the initial-conditions-based forecast skill over 25% and 31% of basins at leads of 1–4 and 3–6 years, respectively, especially over mid- and high latitudes. Although the decadal precipitation forecast skill of individual models is limited, ensemble forecasts from multiple climate models outperform the individual models. In addition, the predictability and forecast skill of the standardized precipitation index (SPI) from the latest CMIP6 decadal hindcast data are being investigated. Preliminary results suggest that SPI predictability and forecast skill are positively correlated in general, and that predictability is higher than forecast skill, indicating room for improving hydro-climate forecasts.
Our findings provide a new benchmark for verifying the success of decadal TWS forecasts and suggest that decadal hydrological forecasts can be improved by using dynamical climate prediction information, which itself still has room for improvement.
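As an illustrative sketch only (not the authors' code), the standardized precipitation index (SPI) mentioned above is commonly computed by fitting a gamma distribution to precipitation accumulations and mapping the resulting cumulative probabilities to a standard normal variate; the synthetic data below stand in for real monthly totals.

```python
# Illustrative SPI sketch: gamma fit of precipitation totals,
# then transformation of the fitted quantiles to standard normal.
import numpy as np
from scipy import stats

def spi(precip):
    """Gamma-based SPI for a 1-D array of precipitation accumulations."""
    precip = np.asarray(precip, dtype=float)
    # Fit a gamma distribution to the (positive) accumulation totals.
    shape, loc, scale = stats.gamma.fit(precip, floc=0)
    # Cumulative probability of each total under the fitted distribution...
    cdf = stats.gamma.cdf(precip, shape, loc=loc, scale=scale)
    # ...mapped to a standard normal deviate: the SPI value.
    return stats.norm.ppf(cdf)

rng = np.random.default_rng(0)
sample = rng.gamma(shape=2.0, scale=30.0, size=240)  # synthetic monthly totals
index = spi(sample)
```

By construction the index is approximately standard normal, so values below about -1.5 flag unusually dry periods regardless of the local precipitation climatology.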

D3419 |
EGU2020-20238
Francois Counillon, Noel Keenlyside, Mao-Lin Shen, Shunya Koseki, Marion Devilliers, Alok Gupta, and Gregory Duane

We present the first results from a supermodel constructed using three state-of-the-art Earth system models: NorESM, CESM and MPIESM. A supermodel is an interactive ensemble in which models are optimally combined so that the systematic errors of the individual models compensate, yielding a model with superior performance. In the supermodel, the individual models are synchronized every month using data assimilation to handle the discrepancies in grid, resolution and variable representativity between the models. In particular, we assimilate a pseudo sea surface temperature (SST) that is computed as a weighted combination of the SSTs of the individual models. The synchronization of the models distinguishes this approach from the standard multi-model ensemble approach, in which model outputs are combined a posteriori. The data assimilation method used is the Ensemble Optimal Interpolation (EnOI) scheme, for which the covariance matrices are constructed from preindustrial control simulations of the individual models. The performance of a first version of the supermodel based on equal weights is compared to that of the individual models for the period 1980 to 2010. Synchronisation of the surface ocean is achieved in most places, and dynamical regimes such as ENSO occur in phase. The biases of each model are reduced and the pathway of the Gulf Stream is improved. The variability of the supermodel is not larger than that of the super-ensemble mean, but an idealized model shows that this deflation is caused by a misconstruction of the pseudo observations and can be counteracted by perturbing them. Perspectives for performing predictions and climate change experiments with the supermodel method are presented and discussed.

D3420 |
EGU2020-198
Feba Francis, Ashok Karumuri, and Matthew Collins


Decadal prediction is the prediction of climate for the next 5–20 years. It has gained great importance as it tries to bridge the gap between seasonal and centennial (50-100 year) predictions, balancing the roles of initial conditions and boundary conditions. We analysed the model output from the CMIP5 decadal runs of nine models. Our results show that two of the decadal hindcasts have significant prediction skill for the Indian Ocean Dipole for up to a decade. The Indian Ocean Dipole is one of the leading modes of climate variability in the tropics and affects global climate. As already established, the models also show year-long lead predictability of the El Niño-Southern Oscillation. We found no significant skill for the Indian Summer Monsoon. We are presently investigating the source of the lead predictability of the Indian Ocean Dipole, which appears to arise from links to the Southern Ocean. Decadal prediction skill and predictability for a climate driver like the Indian Ocean Dipole are of immense value to climate science and society in general.

D3421 |
EGU2020-18283
Franco Catalano, Andrea Alessandri, Kristian Nielsen, Irene Cionni, and Matteo De Felice

Multi-model ensembles (MMEs) are powerful tools in dynamical climate prediction as they account for the overconfidence of, and the uncertainties in, single-model ensembles. The potential benefit that can be expected from an MME grows with the independence of the contributing Seasonal Prediction Systems. To this aim, a novel methodology has been developed to assess the relative independence of the prediction systems in the probabilistic information they provide.

We analysed the Copernicus C3S seasonal forecast products, focusing on the one-month-lead retrospective seasonal predictions for the boreal summer and winter seasons (1st May and 1st November start dates, i.e. June-July-August, JJA, and December-January-February, DJF). We evaluated the seasonal hindcasts in terms of deterministic and probabilistic scores, with a particular focus on continental areas, since little evaluation has been performed so far over land, which is where most applications of seasonal forecasts are based. The most relevant target variables for energy users have been considered, and skill differences between the prediction systems have been analysed together with related possible sources of predictability. The analysis evidenced the importance of snow-albedo processes for temperature predictions in DJF and the effect of atmospheric dynamics, through moisture convergence, on the prediction of surface solar radiation in JJA. A new metric, the Brier Score Covariance, designed to quantify the probabilistic independence among the models, has been developed and applied to optimize model selection and combination strategies, with a particular focus on the most relevant variables for energy applications.

D3422 |
EGU2020-16129
Jonathan Eden and Bastien Dieppois

While there is a discernible global warming fingerprint in the observed increase in daily temperature extremes, there is far greater uncertainty about the role played by anthropogenic climate change in extreme precipitation. A logical progression of thought is that an increase in extreme precipitation results from the 7% increase in atmospheric moisture per 1°C of global temperature increase predicted by the Clausius-Clapeyron (CC) relation. While this is supported by observations on the global scale, rates of extreme precipitation at smaller spatial and temporal scales are influenced to a far greater extent by atmospheric circulation and vertical stability, in addition to local moisture availability. Many of these processes and other features of extreme precipitation events are not sufficiently represented in general circulation model (GCM) simulations. Meanwhile, limited observational networks mean that many short-term convective events are not accurately represented in the observational data.

Errors and biases are common to all global and regional climate models, and many users of climate information require some form of statistical correction to improve the usefulness of model output. As so-called bias correction has become commonplace in climate impact research, its development has been hastened by a sustained debate regarding model correction in general leading to techniques that merge statistical correction and downscaling, represent random variability using stochasticity and are explicitly applicable to extremes. To date, attribution of extreme precipitation has not fully utilised the tools available from recent advances in bias correction, stochastic postprocessing and statistical downscaling. In the same way that GCMs are the most important tool in making climate change projections, understanding the degree to which the nature of a particular weather event has changed due to global warming requires long-term simulations of global climate from the pre-industrial era to the present day.  The lack of a correction and/or downscaling step in almost all precipitation event attribution methodologies is therefore surprising. 

Here, we present a multi-scale attribution analysis of a sample of extreme precipitation events across Europe using a blend of observation- and model-based data. Attribution information generated using the raw output of global and regional climate model ensembles will be compared to that generated using the same set of models following a statistical postprocessing and downscaling step. Our conclusions will make recommendations for the value and wider application of downscaling methodologies in attribution science.

D3423 |
EGU2020-8869
| Highlight
Mattia Callegari, Valentina Cavedon, Alice Crespi, Felix Greifeneder, Marcello Petitta, Marc Zebisch, and Claudia Notarnicola

The prediction of seasonal water availability is a key element for effective water storage management and hydropower production optimization. Here we propose a machine learning model for monthly water discharge prediction, based on statistical relationships between time series of a target, i.e. monthly water discharge, and predictors. The considered predictors can be divided into two classes: initial catchment state variables and seasonal forecast variables. Snow plays a crucial role as a water storage component in alpine catchments; thus, snow water equivalent is the predictor employed to describe the initial state of the catchment. To ensure the scalability of the method, snow water equivalent is represented here by ERA-5 climate reanalysis data (0.25° x 0.25° resolution). Depending on the prediction season, seasonal forecasts of temperature can drive snowmelt or evapotranspiration, while precipitation provides a natural contribution to the total water availability. To describe these prediction variables, we employed a downscaled and bias-corrected version of the ECMWF seasonal forecasting system (SEAS5) for temperature and precipitation. More specifically, the seasonal forecast fields were bilinearly downscaled from the original 1° x 1° resolution to the target ERA-5 grid and statistically corrected for bias with respect to ERA-5 data by means of a quantile mapping procedure. ERA-5 reanalysis data were used as the reference for the bias correction in order to allow the approach to be easily applied over different areas.
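As a hedged sketch of the kind of quantile mapping described above (not the authors' implementation; variable names and synthetic data are illustrative), each forecast value can be mapped from the model's climatological quantiles to the corresponding reference (e.g. ERA-5) quantiles:

```python
# Empirical quantile mapping: map forecast values through paired
# model-climatology and reference-climatology quantiles.
import numpy as np

def quantile_map(forecast, model_clim, ref_clim, n_quantiles=99):
    """Correct `forecast` using model->reference quantile pairs."""
    probs = np.linspace(0.01, 0.99, n_quantiles)
    model_q = np.quantile(model_clim, probs)  # model climatology quantiles
    ref_q = np.quantile(ref_clim, probs)      # reference climatology quantiles
    # Piecewise-linear transfer function between the two quantile sets;
    # values outside the range are clipped to the outermost pair.
    return np.interp(forecast, model_q, ref_q)

rng = np.random.default_rng(1)
model = rng.normal(2.0, 1.5, 1000)  # biased, over-dispersive model climatology
ref = rng.normal(0.0, 1.0, 1000)    # reference climatology (stand-in for ERA-5)
corrected = quantile_map(model, model, ref)
```

After correction, the mapped values approximately reproduce the reference distribution, which is the property that makes the scheme transferable across regions when a consistent reference such as ERA-5 is used.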

We tested the proposed method over an alpine catchment in Ulten Valley, South Tyrol, Italy, which is managed through three artificial reservoirs for hydropower production. For this catchment, a time series of measured daily water discharge from 1992 to 2017 is available. The water discharge prediction performance of the proposed method is compared with that obtained from the monthly water discharge climatology.

D3424 |
EGU2020-13841
| Highlight
Silvia Terzago, Filippo Calì Quaglia, Giulio Bongiovanni, Elisa Palazzi, and Jost von Hardenberg

The development of seasonal projections of the state of snow resources in the Alps is of particular interest for the management of water resources and tourism. We present the progress in the development of a modelling chain based on the seasonal forecast variables produced by seasonal prediction systems of the Copernicus Climate Change Service (C3S).

Seasonal forecast variables of precipitation, near-surface air temperature, radiative fluxes, wind and humidity are downscaled at three selected instrumented sites, close to five Alpine glaciers in the North-Western Italian Alps, bias-corrected where necessary, and finally used as input for a physically based multi-layer snowpack model (Snowpack; Lehning et al. 2012). A stochastic downscaling procedure is used for the precipitation data in order to allow an estimate of the uncertainties linked to small-scale variability in the forcing.

We evaluate the uncertainties affecting the skill of the modelling chain in predicting the evolution of the winter snowpack in hindcast simulations, comparing against historical snow depth and snow water equivalent data recorded by automatic stations in the study areas.

The chain is tested considering seasonal forecast starting dates of November 1st, which are relevant for the snowpack processes. The sensitivity of the snow model to the accuracy of the input variables is discussed.

D3425 |
EGU2020-13455
June-Yi Lee, Kyung-Sook Yun, Arjun Babu, Young-Min Yang, Eui-Seok Chung, Hyo-Eun Oh, Axel Timmermann, and Kyung-Ja Ha

The Coupled Model Intercomparison Project Phase 5 (CMIP5) models have shown substantial inter-model spread in the estimated annual global-mean precipitation change per degree of greenhouse-gas-induced warming (precipitation sensitivity), ranging from -4.5 to 4.2% per °C in the Representative Concentration Pathway (RCP) 2.6, the lowest emission scenario, and from 0.2 to 4.0% per °C in RCP 8.5, the highest emission scenario. Observation-based estimates of the global-mean land precipitation sensitivity over the last few decades show an even larger spread, owing to considerable natural interdecadal variability, the role of anthropogenic aerosol forcing, and observational uncertainties. This study aims to better quantify and constrain global land precipitation change in response to global warming by analyzing the new range of Shared Socio-economic Pathway (SSP) scenarios in the Coupled Model Intercomparison Project Phase 6 (CMIP6) in comparison with the RCP scenarios in CMIP5. We show that the range of projected change in annual global-mean land (ocean) precipitation by the end of the 21st century relative to the recent past (1995-2014) in the 23 CMIP6 models is over 50% (20%) larger than that in the corresponding scenarios of the 40 CMIP5 models. The estimated ranges of precipitation sensitivity in the four Tier-1 SSPs are also larger than those in the corresponding CMIP5 RCPs. The large increase in projected precipitation change in the highest quartile over the ocean is mainly due to the increased number of high equilibrium climate sensitivity (ECS) models in CMIP6 compared to CMIP5; this does not hold over land, owing to the different responses of thermodynamic moisture convergence and dynamic processes to global warming. We further discuss key challenges in constraining future precipitation change and sources of uncertainty in land precipitation change.
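To make the "precipitation sensitivity" metric concrete, here is a toy calculation (all numbers invented for illustration, not taken from the abstract): percent precipitation change between a recent-past and an end-of-century period, divided by the corresponding global-mean warming.

```python
# Toy precipitation-sensitivity calculation, % change per degree of warming.
p_ref, p_fut = 2.70, 2.86  # global-mean precipitation, mm/day (made up)
t_ref, t_fut = 14.0, 17.2  # global-mean temperature, degC (made up)

# Percent precipitation change per degree of warming.
sensitivity = 100.0 * (p_fut - p_ref) / p_ref / (t_fut - t_ref)
```

With these invented numbers the sensitivity is about 1.85% per °C, comfortably inside the CMIP ranges quoted above; the inter-model spread in the abstract comes from models disagreeing on both the numerator and the denominator of this ratio.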

D3426 |
EGU2020-13784
Raffaele Bernardello, Valentina Sicardi, Pablo Ortega, and Francisco Doblas-Reyes

With the world population rapidly increasing and the related spectre of a global food crisis, the need to improve our ability to manage the world's fisheries has never been more pressing. One important step in this direction is the improvement of near-term (i.e. seasonal to decadal) predictions of Net Primary Production (NPP). NPP is the rate of production of phytoplankton biomass, the primary source of food for marine animal life and thus a fundamental environmental variable to be taken into account in fishery management strategies. Here, we present results from a suite of simulations carried out with the Earth System Model EC-Earth3. These simulations include reconstructions of the biogeochemical state of the ocean for the period 1958 to present and a set of near-term predictions covering the period from 1979 to present. The simulations are designed to test the ability of two different initialization techniques to provide predictive skill. The first technique is based on data assimilation of physical fields only, while the second attempts to partially reconstruct 3D nutrient fields by combining climatological nutrient fields with reconstructed water-mass variability. This combination is meant to exploit the ability of mode and intermediate water masses to propagate a signal in the nutrient distribution on interannual timescales, providing a source of predictability for nutrients and thus for NPP. Skill scores are used to validate the retrospective predictions derived from both techniques in order to obtain a complete evaluation of the predictive capability of the modelling system.

D3427 |
EGU2020-7599
Nuria Perez-Zanon, Louis-Philippe Caron, M. Carmen Alvarez-Castro, Lauriane Batté, Susanna Corti, Marta Dominguez, Federico Fabiano, Silvio Gualdi, Jost von Hardenberg, Llorenç Lledó, Nicolau Manubens, Paola Marson, Stefano Materia, Eroteida Sánchez, Bert Van Schaeybroeck, Verónica Torralba, Silvia Terzago, Deborah Verfaillie, and Danila Volpi

The availability of climate data has never been greater, as evidenced by the development of the Copernicus Climate Change Service. However, availability of climate data does not automatically translate into usability, and sophisticated post-processing is often required to turn these climate data into user-relevant climate information that allows users to develop and implement strategies for adapting to climate variability and to support their decisions.

Developed under the umbrella of the ERA4CS Medscope project by multiple European partners, here we present an R package, currently in development, which aims to provide tools to exploit dynamical seasonal forecasts so as to provide information relevant to public and private stakeholders at the seasonal timescale. This toolbox, called CSTools (short for Climate Service Tools), contains process-based methods for forecast calibration, bias correction, statistical and stochastic downscaling, optimal forecast combination and multivariate verification, as well as basic and advanced tools to obtain tailored products.

In addition to presenting some of the tools contained in the package, we also give a short overview of the development strategy adopted for this toolbox. The latter relies on a version control system set up to allow scientists and developers to work within a common framework, using a platform where they can exchange with other developers, test the various functionalities and discuss issues arising from the work, among other things. Furthermore, we will also present some vignettes, which are one of the mechanisms that allow users to understand and visualize the capabilities of CSTools. For instance, CSTools contains a step-by-step vignette showing how to use and visualize the output of MultivarRMSE, which gives an indication of the forecast performance (RMSE) for multiple variables simultaneously.

While the extensive community of R users offers the opportunity of bringing climate forecasting experts together with end users, CSTools can also be used by other communities, such as Python users, through the rpy interface. Finally, the publication of this package on CRAN (the Comprehensive R Archive Network) makes it easily accessible to interested users and ensures its proper functioning on different operating systems.

D3428 |
EGU2020-20721
Xiaosong Yang, Thomas Delworth, Fanrong Zeng, William Cooke, Liping Zhang, and Gilbert Compo

Initializing climate models for decadal prediction is a major challenge, in part due to the lack of long-term subsurface ocean observations and the changing nature of observing systems. To overcome these limitations, we have developed a novel method for initializing a climate model for decadal prediction. Using GFDL's next-generation prediction system, we developed a coupled ensemble data assimilation system that assimilates only surface pressure observations, because surface pressure measurements extend back to the late 1800s. Physically, by assimilating high-frequency surface pressure observations we constrain the model to experience a sequence of winds and storms, and thus surface fluxes, that is very similar to what is observed. The hypothesis is that by having the ocean component of the coupled model experience a sequence of surface fluxes very similar to observations, it will gradually reproduce the same variations as the observed system.

We assimilated the observed surface pressure station data used in the latest 20th Century Reanalysis. A coupled simulation covering 1960 to 2016 has been completed. In this talk, we will review how well the observed decadal climate variations (e.g., the PDO and AMO) can be reproduced solely from surface pressure observations. In addition, we will explore the multi-decadal variations of the Atlantic meridional overturning circulation (AMOC) and its connection with North Atlantic sea surface temperature. The feasibility of using this method to initialize coupled climate models for realistic decadal predictions will also be discussed.

D3429 |
EGU2020-5439
Young-Min Yang

Observational analysis shows that there is a predominant global-scale multidecadal variability (GMV) of sea surface temperature (SST). Its horizontal pattern resembles that of the Interdecadal Pacific Oscillation (IPO) in the Pacific and the Atlantic multidecadal oscillation (AMO) in the Atlantic Ocean, and it can affect precipitation and temperature across the globe. Here, we demonstrate that the GMV could be driven by the AMO through atmospheric teleconnections and atmosphere-ocean coupling processes. Observations reveal a strong negative correlation when the AMO leads the GMV by approximately 4–8 years. Pacemaker experiments using a climate model driven by observed AMO signals reveal that the tropical Atlantic warm SST anomalies of the AMO initiate anomalous cooling in the equatorial central-eastern Pacific through atmospheric teleconnections. Anticyclonic anomalies in the North and South Pacific induce equatorward winds along the coasts of North and South America, contributing to further cooling. Upper-ocean dynamics play a minor role in GMV formation but contribute to a delayed response of the IPO to the AMO forcing. The possible impact of the GMV on the AMO was also tested by prescribing only Pacific SST in the model; however, the model could not reproduce the observed phase relationship between the AMO and the GMV. These results support the hypothesis that the Atlantic Ocean plays a key role in the multidecadal variability of global SST.
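The lead-lag relationship described above can be illustrated with a simple lagged-correlation diagnostic. The sketch below uses synthetic index time series (not the actual AMO/GMV data) in which the follower is built to lag the leader by six steps with a negative sign:

```python
# Lead-lag correlation diagnostic on synthetic leader/follower indices.
import numpy as np

def lag_correlation(leader, follower, lag):
    """Correlation when `leader` leads `follower` by `lag` steps."""
    if lag == 0:
        return np.corrcoef(leader, follower)[0, 1]
    return np.corrcoef(leader[:-lag], follower[lag:])[0, 1]

rng = np.random.default_rng(4)
amo = rng.normal(size=200)  # synthetic leading index
# Follower index: negatively related to the leader six steps earlier,
# plus noise (mimicking an AMO-leads-GMV relationship).
gmv = np.concatenate([rng.normal(size=6), -0.9 * amo[:-6]]) \
      + 0.3 * rng.normal(size=200)

corrs = [lag_correlation(amo, gmv, lag) for lag in range(13)]
best_lag = int(np.argmin(corrs))  # lag with the most negative correlation
```

Scanning the lags and locating the most negative correlation recovers the built-in six-step lead, the same logic used to identify the observed 4-8 year AMO lead.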

D3430 |
EGU2020-19803
Andrea Alessandri, Franco Catalano, Matteo De Felice, Kristian Nielsen, Alberto Troccoli, Marco Formenton, and Gaia Piccioni

A key objective of the Added Value of Seasonal Climate Forecasts for Integrated Risk Management Decisions (SECLI-FIRM, www.secli-firm.eu) project is the optimisation of the performance of seasonal climate forecasts provided by multiple producing centres, in a Grand Multi-Model approach, for predictands relevant to the specific case studies considered in SECLI-FIRM.

The Grand Multi-Model Ensemble (MME) consists of the five Seasonal Prediction Systems (SPSs) provided by the European Copernicus C3S and a selection of five other SPSs independently developed by centres outside Europe: four from the North American Multi-Model Ensemble (NMME) plus the SPS of the Japan Meteorological Agency (JMA).

All the possible multi-model combinations have been evaluated, showing that, in general, only a limited number of SPSs is required to obtain the maximum attainable performance. Although the selection of best-performing models usually differs depending on the region or phenomenon under consideration, it is shown that the performance of the Grand-MME seasonal predictions improves as the independence of the contributing SPSs increases, i.e. by mixing European SPSs with those from the NMME and JMA.

Starting from the definition of the Brier score, a novel metric has been developed, named the Brier score covariance (BScov), which estimates the relative independence of the prediction systems. BScov is used to quantify independence among the SPSs and, together with probabilistic skill metrics, to develop a strategy for identifying the combinations that optimize the probabilistic performance of seasonal predictions for the case studies.
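As a rough illustration of the idea behind such a metric (the exact BScov formulation in the abstract may differ), one can compute per-event Brier score terms for two probabilistic systems and then take the covariance of those terms: a high covariance means the systems tend to fail on the same events, i.e. low independence. The data below are synthetic.

```python
# Brier score terms for two synthetic probabilistic systems, plus
# the covariance of their per-event scores as an independence proxy.
import numpy as np

def brier_score_terms(prob, obs):
    """Per-event squared probability errors; their mean is the Brier score."""
    return (np.asarray(prob) - np.asarray(obs)) ** 2

rng = np.random.default_rng(2)
obs = rng.integers(0, 2, 500)                        # binary outcomes
prob_a = np.clip(obs * 0.7 + 0.15 + rng.normal(0, 0.2, 500), 0, 1)
prob_b = np.clip(obs * 0.7 + 0.15 + rng.normal(0, 0.2, 500), 0, 1)

bs_a = brier_score_terms(prob_a, obs)  # system A per-event scores
bs_b = brier_score_terms(prob_b, obs)  # system B per-event scores
# Covariance of the per-event scores: higher -> errors co-occur,
# so adding system B to system A brings less new information.
bscov = float(np.cov(bs_a, bs_b)[0, 1])
```

Comparing pairwise covariances across all candidate systems then suggests which combinations add genuinely independent probabilistic information to the multi-model ensemble.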

D3431 |
EGU2020-4572
Pierluigi Calanca

Stochastic weather generators are still widely used for downscaling climate change scenarios, in particular in the context of agricultural and hydrological impact assessments. Their performance is in many respects satisfactory, except that they fail to represent climatic variability adequately. This has implications for the representation of extreme values and their statistics. Concerning precipitation, different approaches for remedying this situation have been proposed in the past, including using more sophisticated models to better simulate the persistence of wet and dry spells, conditioning rainfall-generating parameters on indices of the large-scale atmospheric circulation, or employing autoregressive models to represent year-to-year variations in annual precipitation amounts. With regard to (minimum and maximum) temperature, efforts to address the question of why weather generators underestimate total variability have been less systematic. Based on results obtained with a well-known weather generator (LARS-WG), this contribution aims to discuss which modes of variability are missing and why, elaborate on the implications of underrepresenting temperature variance for the simulation of temperature extremes in downscaled climate change scenarios, and suggest options to tackle the problem and improve model performance.

D3432 |
EGU2020-10109
Alice Crespi, Mattia Callegari, Felix Greifeneder, Claudia Notarnicola, Marcello Petitta, and Marc Zebisch

Interest in trustworthy and accurate information about climate and its variability at the local scale is currently increasing, not only within the scientific community but also among local stakeholders, political administrators and private companies. Clear, operational climate information tailored to users' needs is a relevant support tool for a wide range of decision-making policies, including vulnerability assessment, risk management and energy production.

Seasonal forecasts, in particular, provide predictions of the climate up to several months ahead and could therefore be a precious source of information for a wide range of activities, such as optimization in the renewable energy sector. However, specific approaches are needed to deal with the probabilistic nature of seasonal forecasts, and post-processing methods are required to adapt their coarse spatial resolution to the local scales of specific applications. This is particularly true for orographically complex areas, such as Alpine regions, where coarse-resolution data can lead to considerable under- or overestimation of the predicted variables.

In this framework, we present a downscaled and bias-corrected version of the seasonal forecasts provided by the ECMWF seasonal forecasting system (SEAS5) for temperature, precipitation and wind speed over the Alpine area, spanning the period 1983–2018. The approach is based on the bilinear interpolation of the 1° x 1° original fields onto the target 0.25° x 0.25° resolution and on a quantile-mapping procedure using ERA-5 reanalysis data for the calibration. The ERA-5 reanalysis dataset is chosen as the reference in order to allow the application of the implemented scheme over different areas. The accuracy and skill of the post-processed seasonal forecast fields are evaluated, also in comparison with observations and with the performance of alternative downscaling schemes.

The presented study supports the activities of the H2020 European project SECLI-FIRM on the improvement of the seasonal forecast applicability for energy production, management and assessment.

D3433 |
EGU2020-12271
Jialin Wang, Jing Yang, Hongli Ren, Jinxiao Li, Qing Bao, and Miaoni Gao

The seasonal prediction of summer rainfall is crucial for regional disaster reduction, but prediction skill is currently low. This study developed a machine learning (ML)-based dynamical (MLD) seasonal prediction method for summer rainfall in China based on suitable circulation fields from the operational dynamical prediction model CAS FGOALS-f2. By choosing optimal hyperparameters for three ML methods to achieve the best fit with the least overfitting, gradient boosting regression trees eventually exhibited the highest prediction skill, obtaining average skill values of 0.33 in the reference training period (1981-2010) and 0.19 in eight individual years (2011-2018) of independent prediction, which improves the previous dynamical prediction skill by more than 300%. Further study suggests that both reducing overfitting and using the best dynamical prediction are imperative for MLD application prospects, which warrants further investigation.
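A minimal sketch of the MLD idea, under the assumption (not stated in the abstract) that scikit-learn's gradient boosting is a reasonable stand-in for the authors' gradient boosting regression trees: fit the trees on circulation-like predictors and score the held-out predictions with a correlation, using purely synthetic data.

```python
# Gradient boosting regression trees on synthetic "circulation" predictors.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X = rng.normal(size=(300, 8))  # stand-in for dynamical circulation predictors
# Synthetic "rainfall" target: a linear signal in two predictors plus noise.
y = 0.8 * X[:, 0] - 0.5 * X[:, 3] + rng.normal(0, 0.3, 300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
# Shallow trees and a small learning rate limit overfitting -- the main
# concern the abstract highlights when tuning hyperparameters.
model = GradientBoostingRegressor(n_estimators=200, max_depth=2,
                                  learning_rate=0.05, random_state=0)
model.fit(X_tr, y_tr)
# Correlation between prediction and truth on independent data,
# analogous to an independent-period skill score.
skill = float(np.corrcoef(model.predict(X_te), y_te)[0, 1])
```

The train/test split mirrors the abstract's distinction between the reference training period and the independent prediction years: skill quoted on held-out data is the honest measure of an MLD scheme.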

D3434 |
EGU2020-21094
Nikos Alexandris, Matteo Piccardo, Vasileios Syrris, Alessandro Cescatti, and Gregory Duveiller

The frequency of extreme heat-related events is rising, placing the ever-growing number of urban dwellers at higher risk. Quantifying these phenomena is important for the development and monitoring of climate change adaptation and mitigation policies. In this context, Earth observations offer increasing opportunities to assess these phenomena with an unprecedented level of accuracy and spatial reach. Satellite thermal imaging systems acquire Land Surface Temperature (LST), which is fundamental for running models that study, for example, hotspots and heatwaves in urban environments.

Current instruments include TIRS on board Landsat 8 and MODIS on board the Terra satellite. These provide LST products on a monthly basis at 100 m and twice per day at 1 km, respectively. Other sensors on board geostationary satellites, such as MSG and GOES-R, produce sub-hourly thermal images; the SEVIRI instrument on board MSG, for example, captures images every 15 minutes, but at an even coarser spatial resolution of 3 to 5 km. In short, none of the existing systems can capture LST with both fine spatial resolution and high temporal frequency, which is a prerequisite for monitoring heat stress in urban environments.

Combining LST time series of high temporal resolution (e.g. sub-daily MODIS- or SEVIRI-derived data) with products of fine spatial resolution (e.g. Landsat 8 products), and potentially other related variables (e.g. reflectance, spectral indices, land cover information, terrain parameters and local climatic variables), facilitates the downscaling of LST estimates. Nonetheless, considering the complexity of how distinct surfaces within a city heat up differently during the course of a day, such downscaling is only meaningful for practically synchronous observations (e.g. Landsat 8 and MODIS Terra's morning observations).

The recently launched ECOSTRESS mission provides high-spatial-resolution (70 m) thermal imagery multiple times per day, albeit revisiting the same locations only every few days at varying times. We explore the associations between ECOSTRESS and Landsat 8 thermal data, based on the incoming radiation load and distinct surface properties characterised from other datasets. In our approach, we first upscale ECOSTRESS data to simulate Landsat 8 images at moments that coincide with the acquisition times of other sensors' products. In a second step, using the simulated Landsat 8 images, we downscale LST products acquired at later times, such as MODIS Aqua (ca. 13:30) or even the hourly MSG data. This composite downscaling procedure enables an enhanced LST estimation that opens the way to better diagnostics of heat stress in urban landscapes.

In this study we discuss in detail the concepts of our approach and present preliminary results produced with the JEODPP, JRC's high throughput computing platform.

D3435 |
EGU2020-3776
Keith Dixon, Dennis Adams-Smith, and John Lanzante

We examine several springtime plant phenology indices calculated from a set of statistically downscaled daily minimum and maximum temperature projections. Multiple statistical downscaling methods are used to refine daily temperature projections from multiple global climate models (GCMs) run with multiple radiative forcing scenarios. Focusing on the northeastern United States, the statistically downscaled temperature projections are input to a commonly used Extended Spring Indices (SI-x) model, yielding yearly estimates of phenological indices such as First Leaf Date (an early spring indicator), First Bloom Date (a late spring indicator), and the occurrence of Late False Springs (a year in which a hard freeze occurs after first bloom, when plants are vulnerable to damage from freezing conditions). The matrix of results allows one to analyze how projected spring phenological index differences arising from the choice of statistical downscaling method (i.e., the statistical downscaling uncertainty) compare with the magnitudes of variations across the different GCMs (climate model uncertainty) and radiative forcing pathways (scenario uncertainty). As expected, the onset of spring in the late 21st century projections, as measured by First Leaf and First Bloom Dates, typically shifts multiple weeks earlier in the year compared with the historical period. Those two start-of-spring indices can be thought of as being largely, but not entirely, dependent on an accumulation of heat since 1 January. In contrast, a Late False Spring occurs in large part due to a short-term weather event: any single day after the First Bloom Date having a minimum temperature below -2.2 °C.
Accordingly, spring phenological indices calculated from statistically downscaled climate projections can be influenced by how well the GCM's historical simulation represents temperature variations on different time scales (diurnal temperature range, synoptic-scale temperature variability, inter-annual temperature variations), as well as by how a particular statistical refinement method (e.g., a delta change factor method, a quantile-based bias correction method, or a constructed analog method) combines information gleaned from both the GCM time series and the observation-based training data to generate the statistically refined daily maximum and minimum temperature time series. Though this study is limited in scope (the northeastern United States, a finite set of statistical downscaling methods and GCMs), we believe the general findings are likely illustrative of, and applicable to, a wider range of mid-latitude locations where plant responses in spring are driven mostly by temperature and day length.
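The Late False Spring criterion lends itself to a compact check. This sketch assumes only the simple definition quoted above (a hard freeze below -2.2 °C on any day after First Bloom), not the full SI-x model:

```python
import numpy as np

FREEZE_THRESHOLD_C = -2.2   # hard-freeze threshold used in the abstract

def late_false_spring(tmin_daily, first_bloom_doy):
    """True if any day after First Bloom has a minimum temperature below
    -2.2 C. tmin_daily: daily minimum temperature (deg C), index 0 = 1 Jan;
    first_bloom_doy: 1-based day of year of First Bloom."""
    after_bloom = np.asarray(tmin_daily)[first_bloom_doy:]  # days after bloom
    return bool(np.any(after_bloom < FREEZE_THRESHOLD_C))

# toy year: mild spring except for a single hard freeze on day-of-year 121
tmin = np.full(182, 5.0)
tmin[120] = -4.0
```

Because the index hinges on a single day's minimum, it is sensitive to how each downscaling method reproduces day-to-day variability, which is the point made above.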

D3436 |
EGU2020-8533
Yan Ji, Xiefei Zhi, Ye Tian, Ting Peng, Ziqiang Huo, and Luying Ji

High-spatial-resolution weather forecasts that capture regional-scale dynamics are important for natural hazard prevention, especially in regions featuring large topographic variety and distinct local climates. While deep convolutional neural networks have made great progress in single-image super-resolution (SR), which learns the mapping between low- and high-resolution images, limited effort has been made to explore the potential of downscaling in this way. In this study, three advanced SR deep learning frameworks, the Super-Resolution Convolutional Neural Network (SRCNN), Super-Resolution Generative Adversarial Networks (SRGAN) and Enhanced Deep residual networks for Super-Resolution (EDSR), are proposed for downscaling forecasts of daily precipitation in southeast China (100°E-130°E, 15°N-35°N). The SR frameworks are designed to improve the horizontal resolution of daily precipitation forecasts from the raw 1/2 degree (~50 km) to 1/4 degree (~25 km) and 1/8 degree (~12.5 km), respectively. For comparison, Bias Correction Spatial Disaggregation (BCSD), a traditional statistical downscaling method, is also performed under the same framework. The precipitation forecasts used in our work are obtained from different Ensemble Prediction Systems (EPSs), including those of ECMWF, NCEP and JMA, provided by the TIGGE datasets. A group of metrics is applied to assess the performance of the three SR models, including Root Mean Square Error (RMSE), Anomaly Correlation Coefficient (ACC) and Equitable Threat Score (ETS). Results show that the three SR models can effectively capture the detailed spatial information of local precipitation that is missed by global NWP models. Among the three, EDSR obtains the best results, with lower RMSE and higher ACC, indicating better downscaling skill. Furthermore, the SR downscaling methods can be extended to the statistical downscaling of other variables as well.
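Of the verification metrics listed, the Equitable Threat Score is the least standard outside forecast verification; a minimal reference implementation from the 2x2 contingency table (an illustration, not the authors' code) could read:

```python
def equitable_threat_score(forecast, observed, threshold):
    """ETS for precipitation exceeding a threshold (e.g. mm/day).
    forecast, observed: sequences of precipitation values at grid points.
    ETS = (hits - chance hits) / (hits + misses + false alarms - chance hits),
    i.e. a threat score corrected for hits expected by random chance."""
    hits = misses = false_alarms = correct_neg = 0
    for f, o in zip(forecast, observed):
        fe, oe = f >= threshold, o >= threshold
        if fe and oe:
            hits += 1
        elif oe:
            misses += 1
        elif fe:
            false_alarms += 1
        else:
            correct_neg += 1
    n = hits + misses + false_alarms + correct_neg
    hits_random = (hits + misses) * (hits + false_alarms) / n  # chance hits
    denom = hits + misses + false_alarms - hits_random
    return (hits - hits_random) / denom if denom else 0.0
```

A perfect forecast scores 1, no skill over chance scores 0, and negative values indicate a forecast worse than random at that threshold.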

D3437 |
EGU2020-13434
Antoine Doury, Samuel Somot, Sébastien Gadat, Aurélien Ribes, and Lola Corre

Statistical Emulators for Regional Climate Models: Preliminary results

Predicting robust information on climate at the local geographical scale is of primary importance for assessing the impact of future climate change. Even more important is quantifying the whole range of uncertainties around the evolution of the climate, which reflects (i) the imperfections of the climate models, (ii) natural variability and (iii) the uncertainties about future human emissions of greenhouse gases. One of the tools currently used to produce future simulations at the local scale is the Regional Climate Model (RCM): a high-resolution climate model used to downscale, over a specific region, the information simulated by a Global Climate Model (GCM) scenario simulation.

To cover the full range of uncertainties, one should ideally force each RCM with every GCM under different emission scenarios and run several members, which amounts to filling a huge 4D matrix [scenario, GCM, RCM, member]. However, given the increasing number of climate models (regional and global) and the increasing cost of RCMs due to their growing complexity and resolution, filling such a matrix is unrealistic.

To address this issue, we propose a novel approach that merges statistical and dynamical downscaling techniques. The principle relies on three phases. First, some RCM simulations are performed using the classical dynamical downscaling approach. Then, following the statistical downscaling principle, a statistical model is trained on the runs previously performed to learn the relationship between the large-scale information given by the GCM and the local information produced by the RCM. We call this statistical model an emulator. Finally, the emulator makes it possible to downscale additional GCM simulations at a very reasonable cost, in order to obtain a robust ensemble.

In this preliminary work, we focus on emulating surface temperature at the daily scale, testing different machine learning methods (random forests, boosting, neural networks), sometimes coupled with an a priori signal decomposition. We train and test the emulator with simulations from the ALADIN RCM forced by the CNRM-CM5 GCM over the period 1950-2100. The different methods are compared on held-out simulations using skill scores measuring the match between the emulated series and the pseudo-reality RCM series: day-to-day scores such as correlation or RMSE, as well as statistical scores assessing the distribution of the predicted series.

D3438 |
EGU2020-12543
Margot Flemming and Richard Kelly

The spatial and temporal heterogeneity of seasonal snow and its impact on socio-economic and environmental functionality make accurate, real-time estimates of snow water equivalent (SWE) important for hydrological and climatological predictions. Remote sensing techniques facilitate a cost-effective, temporally and spatially consistent approach to SWE monitoring in areas where in situ measurements are not sufficient. Passive microwave remote sensing has been used to successfully estimate SWE globally by measuring the microwave attenuation from the Earth's surface as a function of SWE. However, passive-microwave-derived SWE estimates at local scales are subject to large errors given the coarse spatial resolution of the observations (~625 km²). Regression downscaling techniques can be implemented to increase the spatial resolution of gridded datasets with the use of related auxiliary datasets at a finer spatial resolution. These techniques have been successfully applied to remote sensing datasets such as soil moisture estimates; however, limited work has applied them to snow-related datasets. This study assesses the feasibility of using regression downscaling to increase the spatial resolution of the European Space Agency's (ESA) Globsnow SWE product in the Red River basin, an agriculturally important region of the northern United States.

Prior to downscaling Globsnow SWE, three regression downscaling techniques (Multiple Linear Regression, Random Forest Regression and Geographically Weighted Regression) were assessed in an internal experiment using 1 km grid-scale Snow Data Assimilation System (SNODAS) SWE estimates, developed by the National Weather Service's National Operational Hydrological Remote Sensing Center (NOHRSC). SNODAS SWE estimates for the 5-year period 2013-2018 were linearly aggregated to a 25 km grid scale to match the Globsnow spatial resolution. The three regression downscaling techniques were then implemented, along with correlative datasets available at the 1 km grid scale, to downscale the aggregated SNODAS data back to the original 1 km spatial resolution. When compared with the original SNODAS SWE estimates, the downscaled SWE estimates from Random Forest Regression performed best. Random Forest Regression downscaling was then applied to the original Globsnow SWE data for the same time period, as well as to a corrected Globsnow SWE dataset. The downscaled SWE results from both the corrected and uncorrected Globsnow data were validated against the original SNODAS SWE estimates as well as in situ SWE measurements from a set of 40-45 (depending on the season) weather stations within the study region. Spatial and temporal error distributions were assessed with both validation datasets. The downscaled results from the corrected Globsnow dataset showed overall statistics similar to the original SNODAS SWE estimates, performing better than the downscaled results from the uncorrected Globsnow SWE dataset. The overall aim of this study is to assess the applicability of regression downscaling as a reliable and reproducible method for local-scale SWE estimation in areas where finer-resolution data such as SNODAS do not exist.
The goal is therefore to reproduce the optimal regression downscaling procedure in other snow-dominated regions across the globe, using in situ snow transect data for validation.
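The coarse-to-fine regression workflow described above can be sketched with synthetic data; the elevation and forest-fraction predictors are hypothetical stand-ins for the 1 km auxiliary datasets, and scikit-learn's random forest is one possible implementation:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)
# hypothetical 1 km predictors on a fine grid; real work would use
# auxiliary datasets known to correlate with SWE
n_fine, block = 2500, 25           # 2500 fine cells, 25 per coarse cell
elev = rng.uniform(200, 1500, n_fine)
forest = rng.uniform(0, 1, n_fine)
swe_fine = 0.1 * elev + 30 * forest + rng.normal(0, 5, n_fine)  # "truth"

def agg(a):
    """Linear aggregation of each group of 25 fine cells to one coarse cell
    (the analogue of averaging SNODAS up to the 25 km Globsnow grid)."""
    return a.reshape(-1, block).mean(axis=1)

X_coarse = np.column_stack([agg(elev), agg(forest)])
y_coarse = agg(swe_fine)

# fit the regression at the coarse scale, then apply it with the
# fine-scale predictors to obtain downscaled SWE
rf = RandomForestRegressor(n_estimators=100, random_state=0)
rf.fit(X_coarse, y_coarse)
swe_downscaled = rf.predict(np.column_stack([elev, forest]))
```

The key assumption, as in the study, is that the coarse-scale predictor-SWE relationship also holds at the fine scale; validation against an independent fine-scale reference is what tests it.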

D3439 |
EGU2020-17561
| Highlight
Giovanni Sgubin, Didier Swingedouw, Juliette Mignot, Leonard Borchert, Thomas Noël, and Harilaos Loukos

Reliable climate predictions over a time horizon of 1-10 years are crucial for stakeholders and policymakers, as this is the time span relevant to public- and private-sector decisions on infrastructure and other business planning. About a decade ago, this motivated the development of a new family of climate predictions: Decadal Climate Predictions (DCPs). Like climate projections, DCPs consist of forced climate simulations, but they are initialised from a specific observed climatic state, which potentially represents an added value. As DCPs are a relatively new branch of climate modelling, their effective application to impact analyses supporting operational adaptation measures is still conditional on their evaluation.

Here we contribute to this evaluation by exploring the performance of the IPSL-CM5A-LR DCP system in predicting air temperature over Europe. Our assessment of the potential of the DCP system follows two main steps: (1) comparison of the simulated large-scale air temperature from hindcasts against observations from the mid-1900s to the present day (the NOAA-20CR dataset), which defines a prediction skill calculated through both the Anomaly Correlation Coefficient (ACC) and the Root Mean Square Error (RMSE); and (2) detection of the "windows of opportunity", i.e. specific conditions under which the DCP performs better. The exploration of the windows of opportunity stems from a systematic detection that evaluates the DCP skill for each combination of period, lead time and season. Our analysis involves both raw simulations and de-biased simulations, i.e. output data that have been adjusted through the quantile-quantile method.
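For reference, the two skill measures used here have standard definitions; a minimal sketch (not the authors' code) applied to toy hindcast anomalies:

```python
import numpy as np

def acc(forecast_anom, obs_anom):
    """Anomaly Correlation Coefficient between forecast and observed
    anomalies (climatological means already removed)."""
    f, o = np.asarray(forecast_anom), np.asarray(obs_anom)
    return float(np.sum(f * o) / np.sqrt(np.sum(f ** 2) * np.sum(o ** 2)))

def rmse(forecast, obs):
    """Root Mean Square Error between forecast and observations."""
    f, o = np.asarray(forecast), np.asarray(obs)
    return float(np.sqrt(np.mean((f - o) ** 2)))

# toy hindcast: anomalies that track the observations with small errors
obs = np.array([0.3, -0.1, 0.5, -0.4, 0.2])
fc = obs + np.array([0.05, -0.02, 0.03, 0.04, -0.01])
```

In a windows-of-opportunity analysis these two scores would be recomputed for every combination of period, lead time and season, and the combinations with the highest ACC (or lowest RMSE) identified.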

Our results show a significant added value over most of Europe with respect to non-initialised historical simulations. Significant skill scores are generally found over the Mediterranean sector of Europe and the UK, while performance over the rest of Europe is rather conditional on the season and period considered. The best-predicted months appear to be those between spring and autumn, while low skill is found for the winter months. The predictions also appear to perform better after the 1980s, when a rapid warming signal characterised temperature over Europe: this shift is well reproduced in the initialised simulations. Finally, skill differences between raw and de-biased outputs are generally minimal. Nevertheless, de-biased data show an overall higher RMSE skill, while ACC skill appears slightly higher in winter and slightly lower in summer. These findings may be useful for the exploitation of the IPSL DCP for near-term impact analyses over Europe, and our systematic approach to exploring the windows of opportunity may serve as a basis for similar investigations of other DCP systems.

D3440 |
EGU2020-11389
| Highlight
Marc Lemus-Canovas and Swen Brands

Mountain regions are among the areas most vulnerable to climate change, given the large amount of natural resources they provide to society. Moreover, the projected increase in temperature over the next few decades may have uncertain consequences for the ecosystems and landscapes of these territories. To face this challenge, it is necessary to test the capacity of downscaling methods to simulate the climate of warm periods using observed data. In the present contribution, different perfect-prog (PP) downscaling methods were evaluated for simulating minimum and maximum daily temperature on a 1x1 km grid in the Pyrenees (Spain, France and Andorra) for the period 1985-2015. Several combinations of predictors, different geographical domains for those predictors, and different reanalysis databases were used to check how much they influence prediction skill. In addition, different metrics were calculated to evaluate the bias, the similarity of the observed and predicted distributions, the temporal correlation, etc.

The results show that regression models better represent warm periods in the observed data and exhibit lower bias. The present study will facilitate decision making on which PP downscaling method is most useful for reproducing future temperature in the Pyrenees.

 

Keywords: Statistical downscaling, perfect prog, Pyrenees, daily temperature.

D3441 |
EGU2020-13752
Bastien François, Mathieu Vrac, Alex Cannon, Yoann Robin, and Denis Allard

Climate models are the major tools for estimating the future evolution of climate variables. However, climate simulations often present statistical biases and have to be corrected against observations before being used in impact assessments. Several bias correction (BC) methods have therefore been developed in the literature over the last two decades, in order to adjust simulations according to historical records and obtain climate projections with appropriate statistical attributes. Most of the existing and popular BC methods are univariate, i.e., they correct one physical variable at one location at a time, and thus can fail to reconstruct the inter-variable, spatial or temporal dependencies of the observations. These remaining biases in the correction can then affect subsequent analyses. This has led to further research on multivariate aspects of statistical post-processing BC methods. Recently, several multivariate bias correction (MBC) methods have been proposed, with different approaches to restoring multidimensional dependencies. However, these methods are not yet well understood by researchers and practitioners, owing to differences in their applicability and assumptions that can lead to different results. This study intercompares four existing MBC methods to help end-users choose among them for their applications. For evaluation and illustration purposes, the methods are applied to correct simulation outputs from one climate model through a cross-validation methodology, which allows for the assessment of inter-variable, spatial and temporal criteria. A second methodology then assesses the ability of the MBC methods to account for the multi-dimensional evolution of the climate model. Additionally, two reference datasets are used to assess the influence of their spatial resolution on (M)BC results. Most of the methods reasonably correct inter-variable and inter-site correlations.
However, none of them correctly adjusts the temporal structure: they generate bias-corrected data with generally weaker temporal dependencies than the observations. Major differences are found concerning the applicability and stability of the methods in high-dimensional contexts, and in their capability to reproduce the multi-dimensional changes of the model. Based on these conclusions, perspectives for MBC development are suggested, such as methods that adjust not only multivariate correlations but also temporal structures, while accounting for the multi-dimensional evolution of the model in the correction.
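The core motivation for MBC methods, namely that univariate corrections leave the model's (possibly wrong) inter-variable dependence untouched, can be demonstrated with synthetic data; the bivariate Gaussian samples below are purely illustrative:

```python
import numpy as np

def quantile_map(x, x_clim, ref_clim, n_q=100):
    """Univariate empirical quantile mapping (one variable at a time)."""
    qs = np.linspace(0.0, 1.0, n_q)
    return np.interp(x, np.quantile(x_clim, qs), np.quantile(ref_clim, qs))

rng = np.random.default_rng(5)
n = 4000
# "observations": temperature and a precipitation proxy, strongly correlated
z = rng.multivariate_normal([0, 0], [[1, 0.8], [0.8, 1]], n)
obs_t, obs_p = 15 + 3 * z[:, 0], 2 + z[:, 1]
# "model": biased marginals AND a much weaker inter-variable correlation
w = rng.multivariate_normal([0, 0], [[1, 0.2], [0.2, 1]], n)
mod_t, mod_p = 12 + 2 * w[:, 0], 3 + 1.5 * w[:, 1]

# univariate correction of each variable separately: marginals become
# right, but the ranks (and hence the dependence) stay the model's
cor_t = quantile_map(mod_t, mod_t, obs_t)
cor_p = quantile_map(mod_p, mod_p, obs_p)
```

Because each quantile map is monotone, the corrected pair keeps the model's weak correlation (~0.2) rather than recovering the observed one (~0.8); restoring that dependence is exactly what MBC methods add.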

D3442 |
EGU2020-13211
Katharina Klehmet, Peter Berg, Pascual Herrera, David Leidinger, Anthony Lemoine, Ernesto Pasten-Zapata, and Rafael Pimentel

It is common practice to apply some form of bias correction to climate model output before use in impact modelling, such as hydrology. The standard approach is to evaluate the correction method with a cross-validation procedure using two or more sub-periods, which allows the method to be assessed on data not previously seen in the calibration step. However, with standard split-sample setups, the evaluation data most likely lie in a climate regime similar to that of the calibration data. In effect, the method is evaluated in the same climate regime in which it was calibrated, which says little about its performance outside the current climate regime.

To address this issue, a discrete split sample test (DSST) was set up to sample climate regimes as diverse as possible. The simplest climate analogue is to calibrate on the coldest years and evaluate on the warmest, to mimic a changing temperature. Here, the tests are extended to more exotic indicators, such as snow pack, the joint probability of wet and cold seasons, the number of hot days in a year, and convective activity during summer, all related to a specific case-study issue. Six bias correction methods are included, covering both standard quantile mapping and other approaches that scale the reference time series. The methods are applied in a pseudo-reality framework to six climate model projections from EURO-CORDEX 12.5 km simulations. The analysis focuses on comparing DSST performance with the impact on the climate change signals, and on the reliability of each method when applied to different climate regimes.
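Why a regime-diverse split matters can be shown with a toy pseudo-reality: a simple mean-bias correction calibrated on the coldest years fails on the warmest ones when the bias is, by construction here, state-dependent. All numbers are illustrative, not from the study:

```python
import numpy as np

rng = np.random.default_rng(3)
# pseudo-reality: 40 "years" of model and reference annual mean temperature,
# where the model bias grows with temperature (state-dependent bias)
ref = rng.normal(10.0, 1.5, 40)
model = ref + 1.0 + 0.3 * (ref - ref.mean()) + rng.normal(0, 0.2, 40)

order = np.argsort(ref)
cold, warm = order[:20], order[20:]          # coldest vs warmest years

# calibrate a simple mean-bias ("delta") correction on the cold years only
delta = np.mean(model[cold] - ref[cold])
corrected_warm = model[warm] - delta

# DSST-style evaluation: residual bias left on the unseen warm years
residual_bias = np.mean(corrected_warm - ref[warm])
```

A random split of the same years would hide this residual bias, because calibration and evaluation would then share the same regime; the cold/warm split exposes it.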

D3443 |
EGU2020-8024
Maeng-Ki Kim, Jeong Sang, Ji-hyun Yun, and Ji-Seon Oh

In this study, we produced gridded climate datasets at 1 km x 1 km and 5 km x 5 km horizontal resolution based on MK (Modified Korean)-PRISM (Parameter-elevation Regressions on Independent Slopes Model), a statistical method that can estimate high-horizontal-resolution gridded data from observational station data in Korea. To compare MK-PRISM performance across resolutions, the RMSEs of the 1 km and 5 km resolution data were calculated and analysed. The RMSEs of the two datasets were similar, but the results classified by elevation differed. The 1 km high-resolution estimates better reflected the impact of the terrain on daily mean and daily maximum temperature, whereas the difference between the two datasets for daily minimum temperature was not statistically significant at any elevation. Furthermore, we divided the temperature data into nine classes based on the observed temperatures and compared the estimation performance of the two datasets by elevation. For the low-temperature group, the 1 km resolution data outperformed the 5 km resolution data at high elevations, regardless of season. In addition, we verified the improved PRIDE (PRISM-based Dynamic downscaling Error correction) model, which can produce future high-resolution scenario data using the results of an RCM and MK-PRISM.

D3444 |
EGU2020-6697
Katarina Kosovelj and Nedjeljka Žagar

The assessment of climate model biases is an important part of their validation, particularly with respect to the application of global model outputs as lateral boundaries for regional climate models. The coupled nature of thermodynamics and circulation calls for their simultaneous treatment in model bias analysis. This can be achieved by applying normal-mode decomposition to model outputs and reanalyses, which provides biases associated with the two dominant atmospheric regimes: the Rossby (or balanced) and inertia-gravity (IG, or unbalanced) regimes. The regime decomposition provides the spectrum of the bias in terms of zonal wavenumbers, meridional modes and vertical modes. This can be especially useful in the tropics, where the Rossby and IG regimes are difficult to separate and where biases in the simulated circulation, just like the circulation itself, have global impacts.

The method is applied to the intermediate-complexity climate model SPEEDY. Fifty-year-long simulations are performed in AMIP mode with prescribed SST. Biases are computed with respect to ERA-20C upscaled to the resolution of SPEEDY (T30L8). We evaluate biases in both modal and physical space and study regional biases associated with the balanced and unbalanced components of the circulation. This work thus expands the results presented by Žagar et al. (2019, Clim. Dyn.) to a two-regime bias analysis.

The results show that the bias is strongly scale-dependent, just like the simulated variability. The largest biases in SPEEDY are at planetary scales (wavenumbers 0-3). Biases associated with the extratropical Rossby modes explain more than half of the bias variance. The Rossby n=1 mode is the single mode with the largest bias variance in the balanced circulation, whereas the Kelvin wave carries the largest bias among the IG modes. These biases are shown to originate mostly in the stratosphere and in the upper-tropospheric westerlies of the Southern Hemisphere.

D3445 |
EGU2020-10513
Narges Khosravi, Nikolay Koldunov, Qiang Wang, Sergey Danilov, Claudia Hinrichs, Tido Semmler, and Thomas Jung

We examined the Arctic Atlantic Water (AW) layer in the CMIP6 models. Climatological means of temperature and salinity at 400 m depth from multi-model averages are compared with observations, showing significant biases in both variables. Based on the currently available data, we show that the CMIP6 models have cold and fresh biases in the Arctic AW layer, and warm and saline biases in the East Greenland Current. The temperature biases are comparable in magnitude to the climate change signal for temperature predicted by the CMIP6 models for the end of the 21st century; for salinity, the biases are even more pronounced than the predicted signals. CMIP6 models also show positive sea-level pressure (SLP) and sea-surface height (SSH) biases in the Nordic Seas. We argue that the identified SLP bias leads to an anomalously weak cyclonic gyre circulation in the Nordic Seas, as indicated by the positive SSH bias. This could cause a weaker AW inflow through the Fram Strait, which would explain the detected hydrography biases in the AW layer. While we do not rule out other factors contributing to the weak AW flow into the Arctic Ocean, we suggest that the identified ocean biases in the CMIP6 models are at least partly of atmospheric origin.

D3446 |
EGU2020-11561
Ana Casanueva, Sixto Herrera, Maialen Iturbide, Stefan Lange, Martin Jury, Alessandro Dosio, Douglas Maraun, and José M. Gutiérrez

Systematic biases in climate models hamper their direct use in impact studies; as a consequence, many bias adjustment methods, which merely correct for deficiencies in the distribution, have been developed. Despite adjusting the desired features in historical simulations, their application in a climate change context is subject to additional uncertainties and to modifications of the change signals, especially for climate indices not directly targeted by the methods. Some commonly used bias adjustment methods allow changes of the signals, which appear by construction in the case of intensity-dependent biases; others preserve the trends of some statistics of the original, raw models. Two relevant and often overlooked sources of further uncertainty are the sensitivity to the observational reference used to calibrate the method and the effect of the resolution mismatch between model and observations (the downscaling effect).

In the present work, we assess the impact of these factors on the climate change signal of a set of temperature and precipitation climate indices covering marginal, temporal and extreme aspects. We use eight standard and state-of-the-art bias adjustment methods (spanning a variety of approaches regarding their nature, empirical or parametric, their fitted parameters and their preservation of the signals) for a case study in the Iberian Peninsula. The quantile trend-preserving methods (namely quantile delta mapping, QDM; scaled distribution mapping, SDM; and the method from the third phase of ISIMIP, ISIMIP3) better preserve the raw signals for the different indices and variables (not all preserved by construction). However, they rely largely on the reference dataset used for calibration and thus show a larger sensitivity to the observations, especially for precipitation intensity, spell and extreme indices. High-quality observational datasets are therefore essential for comprehensive analyses over larger (continental) domains. Similar conclusions hold for experiments carried out at high (approximately 20 km) and low (approximately 120 km) spatial resolutions.
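Of the quantile trend-preserving methods named, quantile delta mapping admits a compact additive sketch (suitable for temperature-like variables; the multiplicative variant commonly used for precipitation differs). This is an illustration under simplified assumptions, not the study's implementation:

```python
import numpy as np

def qdm_additive(x_future, model_hist, model_future, obs_hist, n_q=100):
    """Additive Quantile Delta Mapping: adjust each future value so that
    the model-projected change at its own quantile is preserved."""
    qs = np.linspace(0.01, 0.99, n_q)
    q_mf = np.quantile(model_future, qs)
    q_mh = np.quantile(model_hist, qs)
    q_oh = np.quantile(obs_hist, qs)
    tau = np.interp(x_future, q_mf, qs)          # quantile of each value
    delta = x_future - np.interp(tau, qs, q_mh)  # model change at that quantile
    return np.interp(tau, qs, q_oh) + delta      # observed quantile + change

rng = np.random.default_rng(4)
obs = rng.normal(12.0, 2.0, 3000)    # historical observations
hist = rng.normal(10.0, 2.5, 3000)   # model: cold bias, too much spread
fut = hist + 3.0                     # model projects uniform +3 K warming
corrected = qdm_additive(fut, hist, fut, obs)
```

By construction, the corrected future retains the model's +3 K change on top of the observed distribution, illustrating the trend preservation discussed above; its sensitivity to the choice of `obs` is exactly the reference-dataset dependence the abstract highlights.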