EGU21-8986, updated on 10 Jan 2024
EGU General Assembly 2021
© Author(s) 2024. This work is distributed under
the Creative Commons Attribution 4.0 License.

Influence of calibration period length on predictions of evaporation

Chengcheng Gong1,2,4, Wenke Wang1,2, Zaiyong Zhang1,2, Harrie-Jan Hendricks Franssen3, Fabien Cochand4, and Philip Brunner1,4
  • 1Key Laboratory of Subsurface Hydrology and Ecological Effects in Arid Region, Chang’an University, Ministry of Education, P. R. China
  • 2School of Water and Environment, Chang’an University, Ministry of Education, P. R. China
  • 3Agrosphere (IBG-3), Forschungszentrum Jülich GmbH, Jülich, Germany
  • 4Center for Hydrogeology and Geothermics (CHYN), University of Neuchâtel, Switzerland

Bare soil evaporation is a key component of the soil water balance. Accurate estimation of evaporation is thus critical for sustainable water resources management, especially in arid and semi-arid regions. Numerical models are widely used for estimating bare soil evaporation. Although models allow exploring evaporation dynamics under different hydrological and climatic conditions, their robustness is linked to the reliability of the imposed parameters. These parameters are typically obtained through model calibration. However, even if a perfect match between observed and simulated variables is obtained, the predictions are not necessarily reliable. This can be due to model structural errors or to the ill-posedness of the inverse problem. While this is conceptually well known, it remains unclear how the temporal resolution and length of the observations employed for calibration influence the reliability of the parameters and the predictions.

We used data from a lysimeter experiment in the Guanzhong Basin, China, to systematically explore the influence of the calibration period length on the calibrated parameters and the uncertainty of evaporation predictions. Soil water content dynamics and the water level were monitored every 5 minutes. We set up twelve models using the fully coupled, physically based HydroGeoSphere model with different calibration period lengths (one month, three months, six months, and fourteen months). The evaporation rates estimated by the models for the calibration and validation periods were compared with the measured evaporation rates. We also predicted cumulative one-year evaporation. The predictive uncertainty of evaporation from the models calibrated over different period lengths was quantified. Several key conclusions can be drawn: (1) The extinction depth is a very important parameter for the soil water content dynamics in the vadose zone, but it is poorly informed unless the calibration includes significantly different depths to groundwater. (2) Using a longer calibration period (six or fourteen months) did not necessarily result in more reliable predictions of evaporation rates. (3) Preliminary results indicate that the uncertainty can be reduced if the calibration period features both climatic forcing and water table conditions similar to those of the prediction period. Our results have implications for reducing uncertainty when using unsaturated-saturated models to predict evaporation.
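The split-sample logic described above (calibrate on an initial window, validate on the remainder, compare error across window lengths) can be sketched with a deliberately simple toy model. This is not the authors' HydroGeoSphere workflow; the linear model `e = k * forcing`, the synthetic data, and all names here are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic daily forcing (think: potential evaporation) and "observed"
# actual evaporation generated from a known true parameter plus noise.
n_days = 420  # roughly fourteen months of daily data (toy assumption)
forcing = 2.0 + np.sin(2 * np.pi * np.arange(n_days) / 365.0)
true_k = 0.6
obs = true_k * forcing + rng.normal(0.0, 0.05, n_days)

def calibrate_k(f, e):
    """Least-squares estimate of k in the toy model e = k * f."""
    return float(np.dot(f, e) / np.dot(f, f))

def validation_rmse(window):
    """Calibrate on the first `window` days, validate on the rest."""
    k = calibrate_k(forcing[:window], obs[:window])
    resid = obs[window:] - k * forcing[window:]
    return float(np.sqrt(np.mean(resid ** 2)))

# Roughly one, three, and six months of daily calibration data.
for window in (30, 90, 180):
    print(f"{window:3d}-day calibration: validation RMSE = "
          f"{validation_rmse(window):.4f}")
```

Because the toy data are noise-limited and the forcing is seasonal, a longer window does not automatically reduce the validation error, which loosely mirrors conclusion (2): the information content of the calibration window matters more than its length alone.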

How to cite: Gong, C., Wang, W., Zhang, Z., Hendricks Franssen, H.-J., Cochand, F., and Brunner, P.: Influence of calibration period length on predictions of evaporation, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-8986, 2021.