CL5.1.5

Downscaling: methods, applications and added value

Downscaling aims to process and refine global climate model output to provide information at spatial and temporal scales suitable for impact studies. In response to the challenges posed by climate change and variability, downscaling techniques continue to play an important role in the development of user-driven climate information and new climate services and products. In fact, the "user's dilemma" is no longer a lack of downscaled data, but rather how to select amongst the available datasets and assess their credibility. In this context, model evaluation and verification are growing in relevance, and advances in the field will likely require close collaboration between various disciplines.

Furthermore, epistemologists have started to revisit current practices of climate model validation. This new thread of discussion encourages clarification of the added value of downscaling, i.e. the value gained by adding another level of complexity to the uncertainty cascade. For example, the ‘adequacy-for-purpose view’ may offer a more holistic approach to the evaluation of downscaling models (and atmospheric models in general), as it considers, for example, user perspectives alongside a model’s representational accuracy.

In our session, we aim to bring together scientists from the various geoscientific disciplines interrelated through downscaling: atmospheric modeling, climate change impact modeling, machine learning and verification research. We also invite philosophers of climate science to enrich our discussion about novel challenges faced by the evaluation of increasingly complex simulation models.

Contributions to this session may address, but are not limited to:

- newly available downscaling products,
- applications relying on downscaled data,
- downscaling method development, including the potential for machine learning,
- bias correction and statistical postprocessing,
- challenges in the data management of kilometer-scale simulations,
- verification, uncertainty quantification and the added value of downscaling,
- downscaling approaches in light of computational epistemology.

Co-organized by NP3
Convener: Marlis Hofer | Co-conveners: Jonathan Eden, Tanja Zerenner
Presentations: Fri, 27 May, 14:05–16:40 (CEST) | Room 0.49/50


Chairperson: Jonathan Eden
Epistemic perspective
14:05–14:15 | EGU22-8086 | solicited | Highlight | Virtual presentation
Wendy Parker

How should downscaling methods, and the products of downscaling, be evaluated? An adequacy-for-purpose approach attempts to determine whether a method or product can be used successfully for specific purposes of interest. Purposes can take various forms: predicting variable X in region R with a specified level of accuracy, guiding a particular policy decision, etc. Depending on the purpose, different tests or checks will be performed and different levels of performance, on different metrics, will be deemed acceptable. A product that is grossly inaccurate in some respects may nevertheless be entirely adequate for the purpose at hand. Likewise, higher-resolution products are not necessarily more adequate (or fit); it depends on whether they provide the information required for the purpose of interest and in a way that can be interpreted and employed by users.

How to cite: Parker, W.: Evaluating the adequacy-for-purpose of downscaling methods and products, EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-8086, https://doi.org/10.5194/egusphere-egu22-8086, 2022.

RCM-based downscaling and hybrid approaches
14:15–14:21 | EGU22-11296 | ECS | On-site presentation
Zhuyun Ye, Ulas Im, Jesper Christensen, Camilla Geels, Marit Sandstad, Carley Iles, and Clemens Schwingshackl

In the frame of the H2020 EXHAUSTION project, we present estimated heat indicators from WRF (Weather Research and Forecasting model) downscaling simulations for the periods 1980-2014 and 2015-2049 at 20 km horizontal resolution over the European domain. The WRF simulations are forced by CESM2 global model simulations under three Shared Socioeconomic Pathway (SSP) future scenarios from the Coupled Model Intercomparison Project Phase 6 (CMIP6): SSP1-2.6, SSP2-4.5 and SSP3-7.0, addressing different levels of mitigation and adaptation. For the period 1980-2014, an additional WRF simulation forced by ERA5 is used for comparison in model validation. These near-past simulations have been rigorously evaluated against observations and reanalysis data, including the European Climate Assessment & Dataset (ECA&D), E-OBS, and ERA5-Land, for surface air temperatures. The dynamical downscaling showed clear added value in the spatial distribution related to the important coastal and orographic features widely present over Europe. Two heat wave indicators, the Warm Spell Duration Index (WSDI) and the Heat Wave Magnitude Index daily (HWMId), and four extreme heat indicators, annual maximum temperature (TXx), NOAA heat index (HIx), wet-bulb globe temperature (WBGTx), and universal thermal climate index (UTCIx), are used to study trends in European heat extremes. During the past 35 years, TXx is estimated to have increased by 2.5 °C in WRF_CESM2 and 1.4 °C in WRF_ERA5; this increasing trend is estimated to persist or slow down slightly over the next 35 years, with a smaller increase of 1.5-2.5 °C across the three scenarios. The other extreme heat indicators showed trends very similar to TXx. However, future heat wave duration and magnitude present a contrasting picture. Heat waves are estimated to have increased by 11.2 days in duration and 2.1 in magnitude during 1980-2014, very close to the observed increases of 9.1-11 days and 1.8-2.1. In 2015-2049, heat wave duration and magnitude are estimated to increase by 12.3-13 days and 2.5-4.6, respectively. These heat wave changes are also not spatially uniform. Heat wave duration and magnitude in Southern Europe are both estimated to increase significantly faster than in other zones, at rates 1.4-2.9 times those for Europe as a whole. Heat wave indicators in the future scenarios also show much larger interannual variations than in the past, whereas there are no distinct differences among the three mitigation scenarios for any of the heat indicators. In summary, these results suggest that even though the future increase of air temperatures and extreme heat indicators shows signs of slowing down compared with the near past, the severity of heat waves is estimated to increase even faster than in the past under all levels of mitigation considered. Southern Europe is expected to be the region that needs the most attention in terms of severe future heat waves.
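For reference, the spell-counting logic behind indicators such as WSDI can be sketched in a few lines. This is a simplified illustration with a fixed temperature threshold, whereas the operational WSDI uses a calendar-day 90th-percentile threshold from a base period; function names and numbers are invented.

```python
import numpy as np

def txx(tx):
    """Annual maximum of daily maximum temperature (TXx)."""
    return np.max(tx)

def warm_spell_days(tx, thresh, min_len=6):
    """Count days belonging to warm spells: runs of at least `min_len`
    consecutive days with tx above `thresh` (WSDI-like logic)."""
    total, run = 0, 0
    for hot in tx > thresh:
        if hot:
            run += 1
        else:
            if run >= min_len:
                total += run
            run = 0
    if run >= min_len:          # spell running through the end of the series
        total += run
    return total

# toy series: a 7-day warm spell embedded in cooler days
tx = np.array([24, 25, 31, 32, 33, 31, 32, 33, 31, 25, 26])
print(txx(tx))                    # 33
print(warm_spell_days(tx, 30.0))  # 7
```

HWMId and the comfort-based indices (HIx, WBGTx, UTCIx) follow the same pattern of reducing a daily series to an annual scalar, but with more elaborate threshold and magnitude definitions.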

How to cite: Ye, Z., Im, U., Christensen, J., Geels, C., Sandstad, M., Iles, C., and Schwingshackl, C.: Near-past and future trends of European extreme heat and heat waves from WRF downscaling experiments, EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-11296, https://doi.org/10.5194/egusphere-egu22-11296, 2022.

14:21–14:27 | EGU22-1043 | ECS | Highlight | Virtual presentation
María Ofelia Molina, Joao Careto, Claudia Gutiérrez, Enrique Sánchez, and Pedro Soares


In the context of the CORDEX project, an ensemble of high-resolution regional climate simulations on a 0.11º grid has been generated for Europe with the objective of improving the representation of regional- to local-scale atmospheric phenomena. However, such simulations are computationally expensive and do not always reveal added value.

In this study, a recently proposed metric (the distribution added value, DAV) is used to determine the added value of the available EURO-CORDEX high-resolution simulations at 0.11º for daily mean wind speed compared to their coarser-gridded 0.44º counterparts and their driving global simulations, from hindcast and historical experiments. The analysis is performed using observational data as a reference. Furthermore, the use of a normalized metric allows for spatial comparison among different regions and time periods.

In general, results show that RCMs add value to their forcing model or reanalysis, but the nature and magnitude of the improvement in the representation of wind speed vary with model, region and season. We found that most RCMs at 0.11º outperform the 0.44º models in their ability to represent the measured wind speed PDF. However, the benefits of downscaling are less clear in the upper tail of the wind speed distribution.
At the regional scale, higher DAVs are obtained at 0.11º than at 0.44º resolution for all subdomains studied, particularly over the Mediterranean, the Iberian Peninsula and the Alps. With respect to altitude, the 0.11º models better represent locations below 50 m and above 350 m, while the 0.44º models under-perform increasingly with altitude. Overall, DAVs are higher at 0.11º than at 0.44º resolution, probably due to a better representation of local-scale feedbacks at high resolution.
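A distribution-added-value metric of this kind can be illustrated with a minimal sketch based on a PDF-overlap (Perkins-type) skill score; the exact formulation used by the authors may differ, and all names and data below are illustrative.

```python
import numpy as np

def pdf_skill(model, obs, bins):
    """Perkins-type skill score: overlap between two empirical PDFs
    (1 = identical distributions, 0 = no overlap)."""
    fm, _ = np.histogram(model, bins=bins)
    fo, _ = np.histogram(obs, bins=bins)
    return np.minimum(fm / fm.sum(), fo / fo.sum()).sum()

def dav(hr, lr, obs, bins):
    """Distribution added value of the high-resolution run over the
    low-resolution run, in percent of the low-resolution skill."""
    s_hr = pdf_skill(hr, obs, bins)
    s_lr = pdf_skill(lr, obs, bins)
    return 100.0 * (s_hr - s_lr) / s_lr

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 2.5, 5000)   # "observed" wind speeds (toy data)
hr = rng.gamma(2.0, 2.5, 5000)    # high-res run: right distribution
lr = rng.gamma(2.0, 3.5, 5000)    # low-res run: biased scale
bins = np.linspace(0, 40, 41)
print(round(dav(hr, lr, obs, bins), 1))   # positive: high-res adds value
```

Because the score is normalized, DAV values of this form can be compared across regions, seasons and variables, which is the property the abstract exploits.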

How to cite: Molina, M. O., Careto, J., Gutiérrez, C., Sánchez, E., and Soares, P.: The added value of high-resolution EURO-CORDEX simulations to describe daily wind speed over Europe, EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-1043, https://doi.org/10.5194/egusphere-egu22-1043, 2022.

14:27–14:33 | EGU22-7049 | ECS | Highlight | Presentation form not yet defined
Lucas Pfister, Peter Stucki, Andrey Martynov, and Stefan Brönnimann

One year after the eruption of the Tambora volcano, the “Year Without a Summer” of 1816 was characterised by extraordinarily cold periods in (Central) Europe, and it was associated with severe crop failures, food shortages, famine and socio-economic disruptions.

The summer of 1816 has been analysed based on a number of early meteorological measurements, as well as on ample documentary information. A statistical reconstruction of spatial fields at daily resolution has been conducted for Switzerland. However, this dataset encompasses only a limited set of variables. Dynamical downscaling methods, in turn, allow past weather to be reconstructed at higher temporal and spatial resolution. In our work, we simulate a particularly cold episode in June 1816 by downscaling data from the Twentieth Century Reanalysis version 3 (20CRv3). The simulation uses the Weather Research and Forecasting (WRF) model with three nested domains over the greater Alpine region and provides hourly output at 3 km resolution. In addition, we include recently digitised station series of temperature and pressure for Three-Dimensional Variational (3DVAR) data assimilation in the innermost domain. Results are then validated against independent station observations.

First results suggest that dynamical downscaling and data assimilation may become a promising approach to obtain physically consistent information on past weather on a local and subdaily scale. This may hold even for extreme events in an era with a scarce network of instrumental weather observations compared to today, although erroneous results may occur. A successful application of dynamical downscaling and data assimilation for the early 19th century might open the door for a regional atmospheric reanalysis product that covers the last two centuries.
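The 3DVAR analysis step mentioned above can be illustrated, under standard linear-Gaussian assumptions, by the textbook analysis equation x_a = x_b + BH^T (HBH^T + R)^-1 (y - Hx_b), which blends a background field with station observations. The toy numbers below are invented and unrelated to the actual WRF configuration.

```python
import numpy as np

def var3d_analysis(xb, y, H, B, R):
    """Minimiser of the 3DVAR cost function
    J(x) = (x-xb)^T B^-1 (x-xb) + (y-Hx)^T R^-1 (y-Hx):
    the background xb corrected by the gain-weighted innovation."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # Kalman-type gain
    return xb + K @ (y - H @ xb)

# two grid points, one thermometer observing the first point
xb = np.array([15.0, 14.0])              # background temperatures (°C)
H = np.array([[1.0, 0.0]])               # observation operator
y = np.array([13.0])                     # digitised station observation
B = np.array([[1.0, 0.5], [0.5, 1.0]])   # background error covariance
R = np.array([[1.0]])                    # observation error covariance
xa = var3d_analysis(xb, y, H, B, R)
print(xa)   # analysis pulled toward the observation, spread by B
```

Note how the off-diagonal element of B spreads the correction from the observed grid point to the unobserved one, which is exactly why assimilating a sparse 1816 station network can still constrain a whole domain.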

How to cite: Pfister, L., Stucki, P., Martynov, A., and Brönnimann, S.: Dynamical Downscaling and Data Assimilation: Insights from the Case Study of the "Year Without a Summer" 1816, EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-7049, https://doi.org/10.5194/egusphere-egu22-7049, 2022.

14:33–14:39 | EGU22-6745 | ECS | Virtual presentation
Zeyu Xue and Paul Ullrich

As the most severe drought over the Northeastern United States (NEUS) in the past century, the 1960s drought had pronounced socioeconomic impacts. Although a persistent wet period followed, the conditions driving the 1960s extreme drought could return in the future, along with its challenges to water management. To project the potential consequences of such a future drought, pseudo-global warming (PGW) simulations using the Weather Research and Forecasting Model are performed to simulate the dynamical conditions of the historical 1960s drought, but with modified thermodynamic conditions under the shared socioeconomic pathway SSP585 scenario in the early (2021-2027), middle (2041-2047) and late (2091-2097) 21st century. Our analysis focuses on essential hydroclimatic variables including temperature, precipitation, evapotranspiration, soil moisture, snowpack and surface runoff. In contrast to the historical 1960s drought, similar dynamical conditions will generally produce more precipitation, increased soil moisture and evapotranspiration, and reduced snowpack. However, we also find that although wet months do become much wetter, dry months may become drier, meaning that wetting trends that are significant in wet months can be essentially negligible for extremely dry months. For these months, the trend towards wetter conditions provides little relief from drying. These conditions may even aggravate water shortages due to an increasingly rapid transition from wet to dry conditions. Other challenges emerge for residents and stakeholders in this region, including more extreme hot days, record-low snowpack, frozen ground degradation and subsequent decreases in surface runoff.

Although the PGW approach pursued in this study is analogous to other recent studies, there is also a pressing need to ascertain confidence in projections made with the PGW method. Most PGW studies modify only the temperature forcing, since it is the most significant driver of impacts on climate, but other meteorological forcings may also affect regional climate trends. For example, large geopotential height increments at higher atmospheric levels tend to increase stability and weaken the precipitation events associated with typhoons. PGW studies also usually consider only changes at the regional-mean scale and ignore spatially dependent contributions from climate change. Therefore, in order to investigate the sensitivity of PGW-based projections, additional simulations were conducted under the RCP8.5 emission scenario but with different forcing modification methods. We thus answer three questions: Are PGW simulations sensitive to the spatial scale of climate perturbations? Besides temperature, which climatological variables are crucial to PGW simulations? And finally, how should researchers design and conduct their PGW simulations?
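The core PGW construction — adding a GCM-derived climate-change delta to historical driving fields — can be sketched as follows. The "domain-mean" option mimics setups that apply only a spatially uniform perturbation, which is the sensitivity the abstract raises; all names and numbers are illustrative.

```python
import numpy as np

def pgw_forcing(reanalysis, gcm_future, gcm_hist, smooth=None):
    """Pseudo-global-warming forcing: historical driving fields plus a
    climate-change delta (future minus historical GCM climatology).
    With smooth='domain-mean', the delta is replaced by its domain
    average, i.e. a spatially uniform perturbation."""
    delta = gcm_future - gcm_hist
    if smooth == "domain-mean":
        delta = np.full_like(delta, delta.mean())
    return reanalysis + delta

# toy 2D temperature fields (K)
era = np.array([[280.0, 282.0], [284.0, 286.0]])
gcm_h = np.array([[281.0, 282.0], [283.0, 284.0]])
gcm_f = gcm_h + np.array([[2.0, 3.0], [4.0, 5.0]])   # spatially varying warming

print(pgw_forcing(era, gcm_f, gcm_h))                        # keeps the pattern
print(pgw_forcing(era, gcm_f, gcm_h, smooth="domain-mean"))  # uniform +3.5 K
```

Comparing simulations driven by the two variants is one way to test whether PGW results are sensitive to the spatial scale of the climate perturbation.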

How to cite: Xue, Z. and Ullrich, P.: PGW projections of the returned 1960s U.S. Northeast drought and sensitivity examinations of PGW methods, EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-6745, https://doi.org/10.5194/egusphere-egu22-6745, 2022.

14:39–14:45 | EGU22-7306 | Highlight | Virtual presentation
Julien Boé and Alexandre Mass

To characterize the impacts of climate change, robust high-resolution climate change information is generally needed. The resolution of global climate models is currently too coarse to provide such information directly. A specific spatial downscaling step is therefore generally needed: either (1) dynamical downscaling with regional climate models or (2) statistical downscaling.

In this study, we present a new hybrid statistical-dynamical downscaling approach, intended to combine the respective strengths of statistical and dynamical downscaling, while overcoming their respective limitations. This hybrid method aims to emulate regional climate models and is based on a constructed analogues method.

Contrary to dynamical downscaling, the computational cost of the method is low, allowing a large number of global climate projections to be downscaled and therefore the climate uncertainties in impact studies to be correctly assessed. Contrary to statistical downscaling, the method does not rely on the assumption that the downscaling relationship established in the present climate with observations remains valid in a future climate perturbed by anthropogenic forcings. The hybrid approach should therefore be as robust as regional climate models in projecting future climate change.

In this presentation, the hybrid statistical-dynamical downscaling method is first presented. Elements of evaluation, in a perfect-model framework based on an ensemble of regional climate models over Europe, are then shown and discussed to demonstrate the value of the method and its applicability to the study of future climate change over western Europe. Finally, results of applying the method to downscale global climate projections over western Europe are shown, and important implications of the results are discussed.
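The constructed-analogues idea underlying such a method can be illustrated in a minimal form: fit the coarse target field as a linear combination of coarse library states, then apply the same weights to the matching fine-scale states. This is a toy sketch, not the authors' implementation; all names and data are invented.

```python
import numpy as np

def constructed_analogue(target_coarse, lib_coarse, lib_fine):
    """Fit the coarse target as a least-squares combination of library
    coarse states, then apply the same weights to the corresponding
    fine-resolution states."""
    # lib_coarse: (n_analogues, n_coarse); lib_fine: (n_analogues, n_fine)
    w, *_ = np.linalg.lstsq(lib_coarse.T, target_coarse, rcond=None)
    return w @ lib_fine

rng = np.random.default_rng(1)
lib_fine = rng.normal(size=(5, 100))    # library of fine-scale (RCM-like) states
upscale = rng.normal(size=(100, 8))     # toy fine -> coarse operator
lib_coarse = lib_fine @ upscale         # matching coarse (GCM-like) states
w_true = rng.normal(size=5)
target_coarse = w_true @ lib_coarse     # a coarse state in the library span
recon = constructed_analogue(target_coarse, lib_coarse, lib_fine)
print(np.allclose(recon, w_true @ lib_fine))   # exact recovery in this toy case
```

Because the weights are fitted on coarse fields only, the fine-scale reconstruction inherits the library's physics, which is why the approach can emulate an RCM at a fraction of its cost.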

How to cite: Boé, J. and Mass, A.: A hybrid statistical-dynamical method to downscale global climate models over Europe, EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-7306, https://doi.org/10.5194/egusphere-egu22-7306, 2022.

Coffee break
Chairperson: Jonathan Eden
Machine learning approaches
15:10–15:16 | EGU22-6150 | ECS | Highlight | Virtual presentation
Henry Addison, Peter Watson, Laurence Aitchison, and Elizabeth Kendon

Climate change is causing the intensification of rainfall extremes in the UK [1]. Physics-based numerical simulations for creating precipitation projections are computationally expensive and must be run many times to quantify the natural variability of precipitation. Local-scale projections such as those from the Met Office's 2.2km convection-permitting model are possible [2] but the computational expense of these simulations requires trade-offs in the duration, domain size, ensemble size and emission scenarios for which to produce projections [1]. 

Here, we apply state-of-the-art machine learning methods to predict precipitation from the 2.2km model given large-scale predictors that are represented in GCMs. By conditioning on outputs from a physical model, rainfall can be downscaled in both past and future climates. We test the extent to which these methods can reproduce the complex spatial and temporal structure of rainfall, with which past statistical approaches have struggled. We are interested in the methods’ ability to capture the distribution of extreme rainfall and to reproduce extreme events. Our methods are neural-network-based and explore generative approaches for representing the stochastic component of high-resolution precipitation. Compared to physical models, these approaches are computationally much cheaper and have a simple interface allowing them to be used to downscale other large GCM datasets.

References 

[1] Kendon, E. J. et al. (2021). Update to the UKCP Local (2.2km) projections. Science report, Met Office Hadley Centre, Exeter, UK. [Online]. Available: https://www.metoffice.gov.uk/pub/data/weather/uk/ukcp18/science-reports/ukcp18_local_update_report_2021.pdf. 

[2] Met Office Hadley Centre. (2019). UKCP18 Local Projections at 2.2km Resolution for 1980-2080, Centre for Environmental Data Analysis. [Online]. Available: https://catalogue.ceda.ac.uk/uuid/d5822183143c4011a2bb304ee7c0baf7.

How to cite: Addison, H., Watson, P., Aitchison, L., and Kendon, E.: Downscaling UK rainfall using machine-learning emulation of a convection-permitting model, EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-6150, https://doi.org/10.5194/egusphere-egu22-6150, 2022.

15:16–15:22 | EGU22-11855 | ECS | Virtual presentation
Jorge Baño-Medina, Rodrigo Manzanas, Ezequiel Cimadevilla, Jesús Fernández, Antonio S. Cofiño, and Jose Manuel Gutiérrez

Deep Learning (DL) has recently emerged as a powerful approach to downscale climate variables from low-resolution GCM fields, showing promising capabilities to reproduce the local scale in present conditions [1]. There have also been some prospects assessing the potential of DL techniques to downscale climate change projections, in particular convolutional neural networks (CNNs) [2]. However, it is still an open question whether they are able to properly generalize to climate change conditions which have been never seen before and produce plausible results. 

Following the “perfect-prognosis” approach, we use in this study the CNNs assessed in [2] to downscale precipitation and temperature for the historical (1975-2005) and RCP8.5 (2006-2100) scenarios of an ensemble of eight Global Climate Models (GCMs) over Europe. The resulting future projections, gathered in a new dataset called DeepESD, are compared with 1) those derived from benchmark statistical models (linear and generalized linear models), and 2) a set of state-of-the-art regional climate models (RCMs) which are considered the “ground truth”. Overall, CNNs lead to climate change signals that are in good agreement with those obtained from RCMs (especially for precipitation), which indicates their potential ability to generalize to future climates. Nevertheless, for some GCMs we find considerable regional differences between the “raw” and the downscaled climate change signals, an important aspect which went unnoticed in a previous work that focused exclusively on a single GCM [2]. This highlights the importance of considering multi-model ensembles of downscaled projections (such as the one presented here) to conduct a comprehensive analysis of the suitability of DL techniques for climate change applications. Indeed, understanding the nature of these differences is necessary, and future work towards this aim would involve carefully analyzing some of the assumptions made in “perfect-prognosis” downscaling (e.g., stationarity of the predictor-predictand link, adaptation of the statistical function to the climate model space). Therefore, following the FAIR (Findability, Accessibility, Interoperability and Reuse) principles, we have made DeepESD publicly available through the Earth System Grid Federation (ESGF), which will allow the scientific community to continue exploring the benefits and shortcomings of DL techniques for statistical downscaling of climate change projections.
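The building blocks of such CNN-based perfect-prognosis downscaling — convolutional feature extraction from coarse predictor fields followed by a readout to local predictands — can be sketched with an untrained forward pass. This is purely illustrative; the actual DeepESD architecture, shapes and training differ.

```python
import numpy as np

def conv2d(x, k):
    """Valid 2D convolution (single channel), the building block of the
    CNN downscaling architectures discussed above."""
    H, W = x.shape
    kh, kw = k.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def downscale(coarse_field, kernel, readout):
    """Coarse predictor field -> ReLU conv features -> linear readout
    mapping the flattened feature map to local (station) predictands."""
    feat = np.maximum(conv2d(coarse_field, kernel), 0.0)
    return readout @ feat.ravel()

rng = np.random.default_rng(2)
coarse = rng.normal(size=(8, 10))       # e.g. a geopotential field on a GCM grid
kernel = rng.normal(size=(3, 3))        # one untrained convolutional filter
readout = rng.normal(size=(4, 6 * 8))   # maps 6x8 feature map to 4 "stations"
local = downscale(coarse, kernel, readout)
print(local.shape)   # (4,)
```

In training, the kernel and readout weights are fitted on reanalysis ("perfect") predictors and observed predictands; at projection time the same network is fed GCM predictors, which is the generalization question the abstract examines.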

References:

[1] Baño-Medina, J., Manzanas, R., and Gutiérrez, J. M.: Configuration and intercomparison of deep learning neural models for statistical downscaling, Geoscientific Model Development, 13, 2109–2124, 2020.

[2] Baño-Medina, J., Manzanas, R., and Gutiérrez, J. M.: On the suitability of deep convolutional neural networks for continental-wide downscaling of climate change projections, Climate Dynamics, pp. 1–11, 2021

 

Acknowledgements

The authors would like to acknowledge projects ATLAS (PID2019-111481RB-I00) and CORDyS (PID2020-116595RB-I00), funded by MCIN/AEI (doi:10.13039/501100011033). We also acknowledge support from Universidad de Cantabria and Consejería de Universidades, Igualdad, Cultura y Deporte del Gobierno de Cantabria via the “instrumentación y ciencia de datos para sondear la naturaleza del universo” project for funding this work. A.S.C and E.C. acknowledge project IS-ENES3 funded by the EU H2020 (#824084).



How to cite: Baño-Medina, J., Manzanas, R., Cimadevilla, E., Fernández, J., Cofiño, A. S., and Gutiérrez, J. M.: Testing deep learning methods for downscaling climate change projections: The DeepESD multi-model dataset, EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-11855, https://doi.org/10.5194/egusphere-egu22-11855, 2022.

15:22–15:28 | EGU22-614 | ECS | Virtual presentation
Rocio Balmaceda-Huarte and Maria Laura Bettolli

Empirical statistical downscaling (ESD) under the perfect-prognosis approach was carried out to simulate daily maximum (Tx) and minimum (Tn) temperatures in the different climatic regions of Argentina. Three ESD techniques (analogs, AN; generalized linear models, GLM; and neural networks, NN) were evaluated considering multiple predictor sets with a variety of configurations, driven by three different reanalyses. The ESD models were cross-validated with folds of non-consecutive years (1979-2014) and then evaluated on a warmer set of years (2015-2018). The assessment of the ESD models focused on marginal and temporal aspects of Tx and Tn. Depending on the aspect analyzed, the AN, GLM or NN models were more or less skillful, but no method captured all the features of both predictand variables. In this sense, the predictor set and model configuration were key factors. The different predictor structures (point-wise, spatial-wise and combinations of them) introduced the main differences for each ESD method, regardless of the predictand variable, region and reanalysis choice. In addition, the differences among ESD models due to the reanalysis choice were notably smaller than those due to changes in the statistical family and model structure. In general, no improvements were observed in the Tx and Tn simulations when more complex predictor sets were considered; in the case of Tn, however, models' skill increased considerably when humidity information was included in the predictor set. Our results showed that the downscaling models were able to capture the general characteristics of Tx and Tn in all regions, with better performance for the latter variable. Overall, promising results were obtained in the evaluation of the ESD models in Argentina, which encourage us to continue exploring their potential in different applications.

How to cite: Balmaceda-Huarte, R. and Bettolli, M. L.: Evaluating statistical downscaling for daily maximum and minimum temperatures in Argentina, EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-614, https://doi.org/10.5194/egusphere-egu22-614, 2022.

15:28–15:34 | EGU22-4717 | ECS | On-site presentation
Mikel N. Legasa and Rodrigo Manzanas

Statistical downscaling (SD) methods are extensively used to provide high-resolution climate information based on the coarse outputs from Global Climate Models (GCM). In the context of climate change and under the perfect prognosis approach, these methods learn the relationships that link several large-scale predictor variables coming from a reanalysis (e.g. humidity) with the local variables of interest (e.g. precipitation) over a reference historical period. Subsequently, the so-learnt relationships are applied to GCM predictors to obtain downscaled projections for a future period.

In a recent paper, Legasa et al. (2021) introduced a posteriori random forests (AP-RFs), a modification of classical random forests that makes use of all the data in the leaves to estimate any probability distribution. Following the experimental framework proposed in Experiment 1 of VALUE (http://www.value-cost.eu, Gutiérrez et al. 2018), that study showed that AP-RFs produce reliable stochastic time series over several locations in Europe using reanalysis predictors. Compared to more classical techniques such as generalized linear models (GLMs), AP-RFs were found to be a competitive SD method across different forecast aspects, with one of their key advantages being the ability to automatically perform predictor/feature selection. This avoids the task of manually selecting the most adequate large-scale variables and geographical domain of interest, something which, at present, relies on human expertise and constitutes a substantial source of uncertainty for downscaling climate change projections.

Nevertheless, an assessment of the suitability of AP-RFs for producing local climate change projections from GCM predictors is still lacking. This work aims to fill this gap by providing a fair comparison of AP-RFs with GLMs and state-of-the-art convolutional neural networks (CNNs), which were recently shown to provide satisfactory results for this task (Baño-Medina et al. 2021). We build on VALUE’s Experiment 2a and train the different methods considered using ERA-Interim “perfect” predictors. Afterwards, the EC-Earth model is used to generate downscaled projections for 86 locations distributed across Europe under a strong emission scenario, the RCP8.5. 

Our preliminary results suggest that AP-RFs generate plausible downscaled future projections of precipitation. In particular, unlike traditional GLMs, which are very sensitive to the predictor set considered and may produce implausible climate change projections (Manzanas et al. 2020), this technique yields delta changes consistent with those obtained from both the raw EC-Earth outputs and the CNNs.
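The "a posteriori" idea — keeping all training targets in each leaf so that a full predictive distribution, not just a mean, can be estimated — can be illustrated with a one-split toy tree. This is a drastic simplification of AP-RFs; all names and data are invented.

```python
import numpy as np

def fit_stump(x, y, split):
    """A one-split 'tree' that stores the training targets falling into
    each leaf, rather than only their mean (the a posteriori idea)."""
    return {"split": split, "left": y[x < split], "right": y[x >= split]}

def predict_quantiles(stump, x_new, qs=(0.1, 0.5, 0.9)):
    """Predict a distribution (empirical quantiles) from the full set of
    leaf samples, instead of a single point value."""
    leaf = stump["left"] if x_new < stump["split"] else stump["right"]
    return np.quantile(leaf, qs)

rng = np.random.default_rng(3)
x = rng.uniform(0, 2, 1000)              # large-scale predictor (toy)
y = np.where(x < 1.0,
             rng.gamma(1.0, 1.0, 1000),  # 'dry' regime
             rng.gamma(4.0, 2.0, 1000))  # wetter, heavier-tailed regime
stump = fit_stump(x, y, split=1.0)
print(predict_quantiles(stump, 0.3))     # distribution for the 'dry' regime
print(predict_quantiles(stump, 1.7))     # distribution for the wet regime
```

A full AP-RF aggregates such leaf samples over many randomized trees, and the fitted distribution can then be sampled to generate the stochastic downscaled series mentioned above.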

References
Baño-Medina, J., Manzanas, R. & Gutiérrez, J.M. On the suitability of deep convolutional neural networks for continental-wide downscaling of climate change projections. Clim Dyn 57, 2941–2951 (2021). doi: https://doi.org/10.1007/s00382-021-05847-0

Gutiérrez, J.M., Maraun, D., Widmann, M. et al. An intercomparison of a large ensemble of statistical downscaling methods over Europe: Results from the VALUE perfect predictor cross-validation experiment. Int. J. Climatol. 2019; 39: 3750– 3785. doi:  https://doi.org/10.1002/joc.5462

Legasa M.N., Manzanas R., Calviño, A. et al. A Posteriori Random Forests for Stochastic Downscaling of Precipitation by Reliably Predicting Probability Distributions. Submitted to Water Resources Research.

Manzanas, R., Fiwa, L., Vanya, C. et al. Statistical downscaling or bias adjustment? A case study involving implausible climate change projections of precipitation in Malawi. Climatic Change 162, 1437-1453 (2020). doi:  https://doi.org/10.1007/s10584-020-02867-3

How to cite: Legasa, M. N. and Manzanas, R.: Assessing the Suitability of A Posteriori Random Forests for Downscaling Climate Change Projections, EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-4717, https://doi.org/10.5194/egusphere-egu22-4717, 2022.

15:34–15:40 | EGU22-9658 | ECS | Virtual presentation
Tuuli Miinalainen, Harri Kokkola, Antti Lipponen, Kari E. J. Lehtinen, and Thomas Kühn

In this study, we present our findings on correcting global-model-derived surface fine particulate matter (PM2.5) concentrations with a machine learning approach. We simulated PM2.5 concentrations with the aerosol-climate model ECHAM-HAMMOZ and trained a machine learning model to downscale the concentrations modelled for an Indian megacity, New Delhi. This way, we are able to use a global atmospheric model to analyze the effects of aerosol emission mitigation on both Earth's energy budget and local air quality.

As with many other global-scale models, ECHAM-HAMMOZ underestimates surface PM2.5 at several urban locations. One apparent explanation is the coarse grid resolution of global climate models, which results in aerosol concentrations averaged over a much larger area than a typical city covers. Due to this averaging over a large grid box, the very high peak concentrations in urban areas can be evened out. Furthermore, the input fields describing aerosol emissions may lack information on some local emission sources, which can also affect the simulated surface air pollution levels.

We used the random forest (RF) regression algorithm to downscale the ECHAM-HAMMOZ-derived surface PM2.5 concentrations towards measured PM2.5 values from the New Delhi capital region in India. In addition, we applied the trained RF model to further simulations with future anthropogenic aerosol emissions following a business-as-usual scenario and a mitigation scenario. This allowed us to evaluate the effects of aerosol emission reductions on both the global radiative balance and local air quality in New Delhi.

The results indicate that the RF-predicted surface PM2.5 concentrations correlate with the measured concentrations much better than the original ECHAM-HAMMOZ output for the New Delhi region. However, with the current setup and input variables, the PM2.5 concentrations produced by the RF model seem to lack some of the short-term variations and the very low and high values.

All in all, the downscaling method used in this project shows promising potential, but requires further tuning of the input variable selection and the RF hyperparameters.

How to cite: Miinalainen, T., Kokkola, H., Lipponen, A., Lehtinen, K. E. J., and Kühn, T.: A machine learning approach for refining ECHAM-HAMMOZ-derived PM2.5, EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-9658, https://doi.org/10.5194/egusphere-egu22-9658, 2022.

15:40–15:46 | EGU22-11128 | ECS | Virtual presentation
Tianyu Yue, Shuiqing Yin, and Hao Wang

High-quality snow is critical for the Winter Olympic Games. Snow quality is very sensitive to changes in meteorological elements, especially temperature and humidity. Forecasts of snow quality can provide information for snow maintenance and require high-resolution meteorological forecasts. However, the complex terrain of the mountainous areas where winter sports are often held produces complex local meteorological fields, which makes forecasting difficult. The Zhangjiakou Competition Zone is one of the three competition zones of the Beijing 2022 Winter Olympics and includes two districts: Genting Snow Park and the Guyangshu Ski Resort. Taking Genting Snow Park as an example, it covers an area of about 2×2 km² with an altitude range of about 350 m, across which hourly temperature at noon differs by about 3°C on average and hourly relative humidity by about 10%.

Short-term forecasts in past Winter Olympic Games were usually based on mesoscale NWP models with a horizontal resolution of up to 1×1 km. Due to the limitations of boundary layer parameterization schemes, some small-scale air processes affected by local topography cannot be captured by mesoscale models. MOS methods can correct the systematic bias of the models but are unable to deal with the non-systematic errors caused by these small-scale processes.

The purpose of this study was to develop statistical downscaling methods for the temperature and humidity forecasts required in the snow-quality risk classification for the Zhangjiakou Competition Zone of the Beijing 2022 Winter Olympics.

Hourly data from 2018–2021 from 20 meteorological stations and the ERA5-Land reanalysis in the study area were used for calibration and validation of the models. A decaying average method, similar to the Kalman filter, was applied to develop the downscaling models. To evaluate the efficiency of the models for snow-quality risk forecasts, a classification model of snow-quality risk developed by the Climate Centre of Hebei Province was applied. The snow-quality risk classification model was developed from four years of meteorological and snow-quality observations in the study area; based on the input temperature and humidity, it classifies the risk to snow quality into four levels: zero-risk, low-risk, medium-risk and high-risk. Each downscaled prediction fell into one of three cases: (1) the predicted risk level equals the observed risk level (Accuracy); (2) the predicted risk level is lower than the observed risk level (Miss); (3) the predicted risk level is higher than the observed risk level (False-Alarm).
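A decaying average correction of the kind mentioned above maintains a running bias estimate that is updated with each new forecast-observation pair. Below is a minimal sketch; the weight and toy data are assumptions, since the study's actual settings are not given here.

```python
def decaying_average_correction(forecasts, observations, weight=0.1):
    """Remove a running bias estimate from a forecast series.

    bias_t = (1 - w) * bias_{t-1} + w * (forecast_t - obs_t)
    Each forecast is corrected with the bias estimated from
    *previous* forecast-observation pairs only.
    """
    bias = 0.0
    corrected = []
    for f, o in zip(forecasts, observations):
        corrected.append(f - bias)                     # correct with past bias
        bias = (1 - weight) * bias + weight * (f - o)  # then update the bias
    return corrected

# Toy example: forecasts with a constant +2 degree warm bias.
obs = [0.0] * 50
fcst = [2.0] * 50
out = decaying_average_correction(fcst, obs, weight=0.2)
print(out[0], out[-1])  # the bias is removed progressively
```

Larger weights adapt faster to regime shifts but are noisier; small weights give smoother, slower-adapting corrections.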

The results showed that: (1) the downscaling models decrease the RMSE of ERA5 by ~13% for temperature and by ~14% for dewpoint temperature; (2) the accuracy of the snow-quality risk classification increased from 72% to 76% on average when ERA5 inputs were replaced with the downscaled temperature and humidity. For stations at high elevation, the False-Alarm ratio decreased by ~13%. Further research will focus on improving the statistical model by calibrating it for different locations and different circulation patterns.

How to cite: Yue, T., Yin, S., and Wang, H.: Statistical downscaling of temperature and humidity for snow-quality risk forecasts for Beijing 2022 Winter Olympics, EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-11128, https://doi.org/10.5194/egusphere-egu22-11128, 2022.

Weather generators
15:46–15:52 | EGU22-7485 | Highlight | On-site presentation
Martin Dubrovsky, Ondrej Lhotka, Jiri Miksovsky, Petr Stepanek, and Jan Meitner

Stochastic weather generators (WGs) are tools for producing weather series that mimic the statistical properties of their real-world counterparts. They are often used in climate change impact experiments as a source of data representing present and/or future climates (an alternative to RCMs and GCMs). Development of our SPAGETTA generator started in 2016 (Dubrovsky et al., 2020; https://doi.org/10.1007/s00704-019-03027-z). The presentation will focus on (A) basic details; (B) functionalities of the generator; (C) results obtained with the generator so far; (D) the most critical problems encountered (and not yet satisfactorily solved) while making the generator fully operational.

A. SPAGETTA is a multivariate, multisite parametric generator based on autoregressive modeling (following the papers of D. Wilks). It is designed mainly (but not solely) for use in agricultural and hydrological modeling. It may produce time series of up to 8 variables for up to approximately 200 stations or grid points. Typically, it produces time series of temperature, precipitation, solar radiation, humidity and wind speed, usually with a daily time step.
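The Wilks-style multisite core can be illustrated with a first-order vector autoregression, z_t = A z_{t-1} + B eps_t, where A and B are derived from the lag-0 and lag-1 cross-correlation matrices. The sketch below uses invented correlation matrices for three sites; it is a generic illustration, not SPAGETTA's parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed lag-0 (M0) and lag-1 (M1) correlation matrices for 3 sites.
M0 = np.array([[1.0, 0.6, 0.4],
               [0.6, 1.0, 0.5],
               [0.4, 0.5, 1.0]])
M1 = 0.7 * M0   # simple assumption: uniform lag-1 autocorrelation of 0.7

# Wilks-style parameter matrices for z_t = A z_{t-1} + B eps_t
A = M1 @ np.linalg.inv(M0)
BBt = M0 - A @ M1.T
B = np.linalg.cholesky(BBt)

# Generate a synthetic series of standardized anomalies.
n_days = 10000
z = np.zeros(3)
series = np.empty((n_days, 3))
for t in range(n_days):
    z = A @ z + B @ rng.standard_normal(3)
    series[t] = z

print(np.corrcoef(series.T).round(2))  # should approximate M0
```

The divergence problem mentioned in part D corresponds to the spectral radius of A reaching or exceeding one (here it is 0.7, so the process is stable); checking it before generation is a cheap safeguard.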

B. The main functionalities include: (1) It may produce arbitrarily long time series representing the climate defined by the data used to calibrate the generator (observational data or, for example, RCM outputs). (2) Having modified the WG parameters by a climate change scenario (typically derived from GCM or RCM simulations), SPAGETTA may produce weather series representing the future climate. In this case, one may study the sensitivity of selected climatic indices to changes in various statistics (e.g. means and standard deviations of weather variables, and characteristics of the temporal and spatial structure of the time series). (3) SPAGETTA may be interpolated so that it can produce weather series for sites with no observational data. (4) It can be linked with a circulation generator so that the WG better represents larger-scale (in both space and time) weather variability.

C. The results obtained with the generator so far include: (a) validation of the generator in terms of WG parameters, various climatic indices, and outputs of a hydrological model fed by the synthetic series produced by SPAGETTA; (b) impacts of forthcoming climate change on various climatic characteristics (RCM-based climate change scenarios were used here), with a focus on spatial temperature-precipitation compound characteristics; (c) validation of the interpolated generator; (d) validation of the generator driven by the larger-scale circulation generator.

D. Problems to be solved: (i) Under some circumstances (especially when a large number of stations is used, or when interpolating the generator), the matrices of the AR model imply an unstable AR process that diverges to unrealistic values of the weather variables. (ii) The generator underestimates low-frequency variability; the larger-scale circulation generator, which would eliminate this drawback, is still under development.

Only examples of the previous points will be shown in the presentation.

How to cite: Dubrovsky, M., Lhotka, O., Miksovsky, J., Stepanek, P., and Meitner, J.: The Spatial Weather Generator SPAGETTA: Hard Times of its Adolescence, EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-7485, https://doi.org/10.5194/egusphere-egu22-7485, 2022.

Bias correction methods
15:52–15:58 | EGU22-10974 | ECS | Virtual presentation
Saurabh Kelkar and Koji Dairaku

Regional climate models (RCMs) are widely used to dynamically downscale general circulation models (GCMs). Downscaled products can provide a clearer understanding of atmospheric processes than the parent models. However, several uncertainties are associated with downscaling, such as structural differences between climate models and biases in GCMs and RCMs. Post-processing methods such as univariate bias correction have been widely used to reduce the bias in individual variables. However, these methods are applied to variables independently, without considering inter-variable dependence. In compound events such as heat stress, multiple drivers, namely surface air temperature (SAT) and relative humidity (RH), play crucial roles. Therefore, a multi-variable bias adjustment is necessary to retain the interdependence between the drivers and provide reliable information on heat stress. The present study focuses on a Multi-variable Bias Adjustment (MBA) method adapted from a topographical adjustment of SAT and RH, and on its impact on added value in a multi-model ensemble. We investigated added values and biases before and after adjusting the variables. There are gains and losses throughout the bias adjustment process: some apparent added value proves spurious over certain regions after the adjustment. Overall, the bias adjustment reduces bias over low-altitude urban areas, encouraging its application to heat stress assessment.

How to cite: Kelkar, S. and Dairaku, K.: Investigation of added values in multi-model and multi-variable bias adjustments for heat stress assessment, EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-10974, https://doi.org/10.5194/egusphere-egu22-10974, 2022.

15:58–16:04 | EGU22-12499 | ECS | Virtual presentation
Ezgi Akyuz, Burcak Kaynak, and Alper Unal

Climate change and air pollution are two phenomena that can no longer be considered separately. Changes in climate also alter the effects of air pollution. For example, emissions of ammonia (NH3) and non-methane volatile organic compounds (NMVOC), which are precursors of ozone (O3) and secondary particles (PM), are drastically sensitive to temperature and humidity changes. Moreover, the impacts of O3 and secondary PM on the climate have been investigated previously. The first step in air quality modeling studies is the modeling of meteorological fields. In this study, meteorological parameters important for air pollution were obtained from global climate models for historical and future periods (SSP5-8.5). The selected parameters will be bias-corrected and downscaled to a high resolution for the Eastern Mediterranean. The bias-corrected and downscaled meteorology outputs will then be used in further studies related to air quality.

Countries in the Mediterranean region are significantly affected by the changing climate due to their location. Previous studies evaluated the meteorological parameters of global climate models at low resolutions. Within the scope of this study, future estimates will be downscaled to a selected domain in the Eastern Mediterranean with a spatial resolution of 4×4 km². However, recent studies have argued that a bias-correction method should be applied to the selected meteorological parameters prior to downscaling. In previous studies, CMIP simulation outputs were evaluated for Turkey with or without downscaling, and there are also studies in which biases between observation/reanalysis data and GCM data are calculated. However, to our knowledge, an evaluation of downscaled climate change scenarios in the Mediterranean region using a bias-correction method has not yet been conducted. Here, the bias correction methodology of Xu et al. (2021) was used, and an ensemble was generated by choosing appropriate global climate models that are compatible with reanalysis data for the selected region.

Native global climate model simulation results and non-linear long-term global climate model simulation trends were evaluated in a preliminary investigation. The temperature means of the global climate models (GCMs) and ERA5 reanalysis data were compared globally and for the EMEP domain. Initial findings showed underestimation or overestimation by the same GCM depending on the selected study domain. This result highlights the importance of selecting models appropriate for the study domain, both for weather generation and for the ensemble. After calculating the long-term non-linear trend, standard deviations were calculated for the interannual variability of the GCMs and ERA5. For the historical period (1979-2014), the annual temperature means of BCC-CSM2-MR, CMCC-CM2-SR5, EC-EARTH3, EARTH3-Veg, FIO-ESM-2-0, and KIOST-ESM showed similarity with ERA5 (r² > 0.70). Summer and fall months mostly show higher correlations than the other seasons. For the 22-model ensemble, the mean, minimum, and maximum temperatures over the global domain (EMEP domain values in parentheses) were 7.62 (6.09), 7.18 (5.28), and 8.12°C (6.89°C), respectively. The corresponding reanalysis values are 7.95 (6.92), 7.57 (5.94), and 8.27°C (7.87°C).

How to cite: Akyuz, E., Kaynak, B., and Unal, A.: Downscaling GCM data using a bias-correction method for the Eastern Mediterranean., EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-12499, https://doi.org/10.5194/egusphere-egu22-12499, 2022.

16:04–16:10 | EGU22-737 | Highlight | Virtual presentation
Peter Berg, Thomas Bosshard, Wei Yang, and Klaus Zimmermann

Bias adjustment is the practice of statistically transforming climate model data in order to reduce systematic deviations from a reference data set, typically some sort of observations. There are numerous proposed methodologies for performing the adjustments, ranging from simple scaling approaches to advanced multivariate distribution-based mapping. In practice, the actual bias adjustment method is a small step in the application, and most of the processing handles reading, writing and linking different data sets. These practical processing steps become especially heavy with increasing model domain size and resolution in both time and space. Here, we present a new implementation platform for bias adjustment, which we call MIdAS (MultI-scale bias AdjuStment). MIdAS is a modern code implementation that supports modern Python libraries suitable for large computing clusters, state-of-the-art bias adjustment methods based on quantile mapping, and "day-of-year" based adjustments to avoid artificial discontinuities; it also introduces cascade adjustment in time and space. The MIdAS platform has been set up to support the development of methods aimed at higher-resolution climate model data, explicitly targeting cases where there is a scale mismatch between data sets. In this presentation, we describe the MIdAS assumptions and features, and present results from the main evaluation of the method for different regions around the world. We also present the most recent development of MIdAS towards different parameters.
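The quantile mapping at the core of such platforms maps each model value through the model's empirical CDF and then through the inverse observed CDF. The following is a minimal, generic illustration with synthetic gamma-distributed data; it is not MIdAS code, and a day-of-year implementation would additionally pool data within a calendar-day window.

```python
import numpy as np

def quantile_map(model_hist, obs, model_target):
    """Empirical quantile mapping: look up each target value's
    non-exceedance probability in the model's historical climate,
    then return the corresponding quantile of the observations."""
    probs = np.searchsorted(np.sort(model_hist), model_target) / len(model_hist)
    probs = np.clip(probs, 0.0, 1.0)
    return np.quantile(obs, probs)

rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 2.0, 5000)     # "observed" climate
model = rng.gamma(2.0, 3.0, 5000)   # biased model climate (too wet)
corrected = quantile_map(model, obs, model)
print(round(model.mean(), 2), round(corrected.mean(), 2), round(obs.mean(), 2))
```

After mapping, the corrected series reproduces the observed distribution, here visible in the corrected mean moving from the model's value towards the observed one.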

How to cite: Berg, P., Bosshard, T., Yang, W., and Zimmermann, K.: MIdAS---MultI-scale bias AdjuStment, EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-737, https://doi.org/10.5194/egusphere-egu22-737, 2022.

16:10–16:16 | EGU22-4305 | Presentation form not yet defined
Muralidhar Adakudlu, Elena Xoplaki, Heiko Paeth, Chibuike Ibebuchi, and Daniel Schoenbein

Daily precipitation and temperature simulated by regional climate models carry large systematic biases owing to multiple factors, including inadequate model resolution and limitations in the parameterization of important processes. Reducing these biases is a crucial step in rendering the model information more reliable for climate change and hydrological assessments. We present an evaluative study of bias correction of daily precipitation and temperature from an ensemble of regional climate models from the EUR-11 CORDEX domain (CLMCOM-CCLM4, GERICS-REMO15, SMHI-RCA4, DMI-HIRHAM5, and CanRCM4, driven by MPI-ESM). This is an important milestone within the larger framework of the RegiKlim consortium towards generating high-resolution, bias-corrected and statistically downscaled fields that provide useful climate information for specific areas in Germany. A quantile delta mapping (QDM) approach is applied to adjust the biases in the distribution characteristics of precipitation and temperature. The delta factor, derived from the ratio of the projected value of a given quantile to the present value, is applied to the standard transfer function so that the modelled climate change signal is preserved. The high-resolution (0.1°) gridded dataset from the German Weather Service, DWD-HYRAS, is used as the reference for bias-correcting the variables. The impact of the bias adjustment on important parameters such as the number and frequency of wet/dry and cold/hot spells is quantified. The response of the quantile mapping method to seasonal variations in the dominant driving processes is further investigated.
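For a ratio variable such as precipitation, QDM multiplies the quantile-mapped observed value by the model's relative change at that quantile, so the change signal survives the correction. A minimal sketch with synthetic data follows; the distributions and tail-clipping threshold are illustrative assumptions, not those of the study.

```python
import numpy as np

def qdm_multiplicative(obs, mod_hist, mod_proj):
    """Quantile delta mapping for a ratio variable (e.g. precipitation).
    Preserves the model's relative change signal at each quantile."""
    mod_proj = np.asarray(mod_proj, dtype=float)
    # Non-exceedance probability of each projected value in its own climate
    tau = np.searchsorted(np.sort(mod_proj), mod_proj) / len(mod_proj)
    tau = np.clip(tau, 0.01, 0.99)  # avoid degenerate tail quantiles
    delta = mod_proj / np.quantile(mod_hist, tau)  # relative change signal
    return np.quantile(obs, tau) * delta

rng = np.random.default_rng(2)
obs = rng.gamma(2.0, 2.0, 5000)
mod_hist = rng.gamma(2.0, 3.0, 5000)        # biased model, historical period
mod_proj = 1.5 * rng.gamma(2.0, 3.0, 5000)  # future: +50% at every quantile
corrected = qdm_multiplicative(obs, mod_hist, mod_proj)
# The corrected future should show roughly +50% relative to observations.
print(round(corrected.mean() / obs.mean(), 2))
```

For temperature, the delta is usually taken additively (projected minus historical quantile) rather than as a ratio.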

How to cite: Adakudlu, M., Xoplaki, E., Paeth, H., Ibebuchi, C., and Schoenbein, D.: Providing useful local climate information through statistical bias correction, EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-4305, https://doi.org/10.5194/egusphere-egu22-4305, 2022.

New data sets
16:16–16:22 | EGU22-2720 | On-site presentation
Jean-Philippe Vidal, Pere Quintana Seguí, and Santiago Beguería

Climate projections downscaled with statistical methods often focus on precipitation (P) and temperature (T) variables, which is not sufficient to run offline land surface models (LSMs) and derive hydrological projections based on both water and energy budgets.

This work extends a proposal by Clemins et al. (2019) to build on a multivariate high-resolution reanalysis dataset to infer projected ancillary variables from P & T projections based on analogue resampling. The refined method is a multisite, multivariate resampling method based on the analogy of spatially distributed daily P & T anomalies. Anomalies are derived with respect to a baseline monthly climatology. The method thus makes use of the spatial and multivariate consistency available in the archive reanalysis to supplement the projections with additional variables, and for a possibly extended spatial domain. The baseline climatology is treated as linearly transient for temperature variables to deal with anomalies not experienced during the archive period, and large-scale additional transient changes in T are passed on to the ancillary variables based on present-day anomaly relationships.
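The core analog step can be illustrated as follows: for each projected day, find the archive day with the most similar spatially distributed P & T anomaly pattern and copy its ancillary variables. The grid size, Euclidean distance metric and random data below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(3)

# Archive reanalysis: daily P & T anomaly fields (flattened grids) plus
# an ancillary variable (e.g. wind speed) for the same days.
n_archive, n_grid = 2000, 50
archive_pt = rng.standard_normal((n_archive, 2 * n_grid))  # [P; T] anomalies
archive_wind = rng.gamma(2.0, 1.5, n_archive)              # ancillary field

def analog_day(target_pt_anom, archive):
    """Return the index of the archive day whose spatially distributed
    P & T anomaly pattern is closest (Euclidean) to the target day."""
    dist = np.linalg.norm(archive - target_pt_anom, axis=1)
    return int(np.argmin(dist))

# A projected day: look up its best analog, then resample ancillaries
# from that day, keeping the archive's multivariate consistency.
target = archive_pt[123] + 0.05 * rng.standard_normal(2 * n_grid)
idx = analog_day(target, archive_pt)
wind_for_projection = archive_wind[idx]
print(idx, wind_for_projection)
```

Because the whole archive day is reused, the spatial and inter-variable structure of the ancillary fields is inherited from the reanalysis rather than modelled.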

The new proposed method is exemplified for the Greater Pyrenean Region (GPR) defined as all basins draining the Pyrenees mountain range and extending over France, Spain, and Andorra. The archive reanalysis used here is the 2.5 km gridded SAFRAN-PIRAGUA surface reanalysis for the Pyrenees over 1965-2005 derived from existing SAFRAN reanalyses over France (Vidal et al., 2010) and Spain (Quintana-Seguí et al., 2017).

The projections considered here come from six CMIP5 GCMs run under both RCP4.5 and RCP8.5, previously downscaled, including only the P, TN, and TX variables and not available north of the Pyrenees (Amblar-Francés et al., 2020). Applying the resampling method over the GPR led to 2.5 km gridded projections of daily time series of all variables required by LSMs. The results show, on top of increasing temperature and decreasing precipitation over the 21st century, a decrease in wind speed, relative humidity, and infrared radiation, and an increase in visible radiation and evapotranspiration. These projections come with a large inter-GCM dispersion and more pronounced changes under RCP8.5.

This work was funded by the Interreg V-A POCTEFA 2014-2020 through the EFA210/16 PIRAGUA project.


Amblar-Francés, M. P., Ramos-Calzado, P., Sanchis-Lladó, J., Hernanz-Lázaro, A., Peral-García, M. C., Navascués, B., Dominguez-Alonso, M., Pastor-Saavedra, M. A. & Rodríguez-Camino, E. (2020) High resolution climate change projections for the Pyrenees region. Advances in Science and Research, 17, 191-208

Clemins, P. J., Bucini, G., Winter, J. M., Beckage, B., Towler, E., Betts, A., Cummings, R. & Chang Queiroz, H. (2019) An analog approach for weather estimation using climate projections and reanalysis data. Journal of Applied Meteorology and Climatology, 58, 1763-1777

Quintana-Seguí, P., Turco, M., Herrera, S. & Miguez-Macho, G. (2017) Validation of a new SAFRAN-based gridded precipitation product for Spain and comparisons to Spain02 and ERA-Interim. Hydrology and Earth System Sciences, 21, 2187-2201

Vidal, J.-P., Martin, E., Franchistéguy, L., Baillon, M. & Soubeyroux, J.-M. (2010) A 50-year high-resolution atmospheric reanalysis over France with the Safran system. International Journal of Climatology, 30, 1627-1644

How to cite: Vidal, J.-P., Quintana Seguí, P., and Beguería, S.: Beyond the usual suspects P&T: deriving multivariate high-resolution transient forcings for land surface models, EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-2720, https://doi.org/10.5194/egusphere-egu22-2720, 2022.

16:22–16:28 | EGU22-3769 | ECS | Virtual presentation
Fabian Lehner and Herbert Formayer

Gridded climatological data for Bhutan derived from internationally available data sets either come at very low spatial resolution, do not provide daily data, or contain few in-situ measurements. We present a newly developed daily high-resolution (1 km) gridded data set for Bhutan covering the years 1996 to 2019 for precipitation and for maximum and minimum temperature (BhutanClim), and compare it with state-of-the-art global observation data sets. As input, we used quality-controlled and homogenized data from up to 67 weather stations of the National Center for Hydrology and Meteorology of Bhutan (NCHM).

The spatial interpolation method for temperature is based on methods that have already been successfully implemented in Austria, Switzerland and Germany. It allows non-linear lapse rates and considers geographical obstacles in the interpolation. The new climatology benefits from the use of local measurements and shows plausible small-scale spatial patterns. Compared to other available state-of-the-art data sets, BhutanClim reveals new features, especially in the precipitation field: some valleys in central Bhutan border on a dry climate classification according to Köppen-Geiger.

How to cite: Lehner, F. and Formayer, H.: Daily 1 km gridded temperature and precipitation in Bhutan , EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-3769, https://doi.org/10.5194/egusphere-egu22-3769, 2022.

16:28–16:34 | EGU22-882 | ECS | On-site presentation
Han Su, Barbara Willaarts, Diana Luna Gonzalez, Maarten S. Krol, and Rick J. Hogeboom

Over 608 million farms exist around the world, occupying 36.9% of global land and accounting for 72% of annual freshwater withdrawals. These farms are highly diverse and heterogeneous. More than 80% of them are smaller than 2 hectares; they utilize only around 20% of farmland but support millions of livelihoods in rural areas. Many datasets are available describing global crops, soil conditions, and water availability. A few of them are farm-size specific, but there is no global overview of how the soil-water-agriculture system differs across farm sizes.
This study aims to explore how soil, water, and crops vary with farm size. Specifically, we used the current best available databases on cropland extent, farm size structure, crop-specific harvested area, and field size distribution to develop a gridded, farm-size-specific and crop-specific harvested area map for 56 countries, representing half of global cropland, using a downscaling algorithm. The resulting maps were validated against empirical data and compared to previous similar studies. We then overlaid the farm-size-specific and crop-specific maps with global soil and water scarcity maps to explore differences between small and large farms in planted crops, soil nutrient availability, and level of water scarcity.
Our results show that, in comparison to larger farms, smaller farms plant more pulses, roots and tubers, and vegetables, and fewer oil crops, sugar crops, and fodder crops. The majority of small farms do not face severe limitations on soil nutrient availability, but they do face severe water scarcity. Large farms, on the other hand, do not confront severe water scarcity but do face severe limitations on soil nutrient availability. Small farms may also be less capable of adapting to water scarcity through irrigation. However, there is considerable spatial variation in our results. Our results can help to further differentiate sustainable soil, water, and agricultural management for small and large farms.

How to cite: Su, H., Willaarts, B., Luna Gonzalez, D., S. Krol, M., and J. Hogeboom, R.: How soil, water, and crop change along with farm sizes: insights from a new high-resolution farm-size specific and crop-specific map covering 56 countries, EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-882, https://doi.org/10.5194/egusphere-egu22-882, 2022.

16:34–16:40 | EGU22-3709 | On-site presentation
Tibor Racz

Hellmann rainfall recorders were among the primary instruments for rainfall intensity measurement during the 20th century, mainly in the countries of central Europe. These water-level-based recorders ensure continuity of measurement by periodically emptying the measurement cylinder through a siphon. During the emptying, the measurement is paused, resulting in under-measurement. The duration and number of emptyings can be known, and the under-measurement can be corrected, if the registration ribbon is available. In the case of historical data, however, the registration ribbon is often unavailable, and only extracted values for the most intense rainfall periods can be obtained from databases. A procedure for correcting such data is presented, along with the influence of the correction of Hellmann-Fuess recorder data on IDF curves.

The procedure is based on the method published in 2002 by Luyckx and Berlamont, which was derived from the physical process of siphoning and assumed the existence of the original records (ribbons) with the exact times of siphoning. In the extracted data tables, neither the time nor the number of siphonings was registered, so the method of Luyckx and Berlamont cannot be applied directly. The principle of the proposed procedure is to estimate the number and length of pauses within the extracted measurement interval. These estimates make it possible to correct the rainfall amount and intensity for the given interval. The measurement cylinder is generally not empty at the beginning of the most intense period of rainfall; the initial water level is treated as a uniformly distributed random variable and estimated by its expected value. The raw data comprise an underestimation in all cases, so they represent only the possible minima of the plausible intensity values. The corrected data give the expected value of the plausible intensities, which is sometimes higher and sometimes lower than the actual intensity, with the same probability; thus, with a large number of corrected data, the positive and negative deviations statistically cancel out. The use of the correction formula is demonstrated with the parameters of a Hellmann-Fuess rainfall recorder. The correction was largest for average recurrence intervals above ten years, where it reached 10%; for sampling intervals of 30 minutes and longer, the magnitude of the correction is not relevant. Since the correction is based on statistical estimation, the exact value of the rainfall intensity cannot be retrieved, but the underestimation can be decreased significantly. The corrected data also modify the IDF curves, so that the effect of climate change can be investigated more appropriately.
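A correction of this kind estimates the number of siphonings from the registered depth (taking the initial cylinder level at its expected value of half the capacity) and inflates the measured intensity to account for the unrecorded pauses. The sketch below follows that logic only; the function, its name, and all instrument constants are invented placeholders, not the study's formula or Hellmann-Fuess parameters.

```python
def corrected_intensity(raw_depth_mm, duration_min,
                        cylinder_mm=10.0, siphon_time_min=0.25):
    """Estimate a siphoning-corrected rainfall intensity (mm/min)
    for an extracted interval.

    Assumes the initial cylinder level is uniformly distributed
    (expected value: half the capacity) and a roughly constant
    intensity within the interval.
    """
    # Expected number of siphonings triggered during the interval
    n_siphons = (raw_depth_mm + cylinder_mm / 2.0) // cylinder_mm
    # Total time during which rain fell but was not recorded
    paused_min = n_siphons * siphon_time_min
    # Scale the registered depth up for the unrecorded pauses
    corrected_depth = raw_depth_mm * duration_min / (duration_min - paused_min)
    return corrected_depth / duration_min

# 20 mm registered in 10 minutes: two expected siphonings, so the raw
# 2.0 mm/min intensity is revised slightly upwards.
print(round(corrected_intensity(20.0, 10.0), 3))  # prints 2.105
```

For long sampling intervals the pause time becomes negligible relative to the interval, consistent with the finding that the correction is irrelevant at 30 minutes and beyond.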

How to cite: Racz, T.: Correction of siphoning error in processed historical rainfall intensity data, a case study of data measured by Hellmann-Fuess type rainfall recorder , EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-3709, https://doi.org/10.5194/egusphere-egu22-3709, 2022.