Continuous and simultaneous measurement of triple oxygen and hydrogen isotopes of liquid and vapour during evaporation
M. Brady and D. Hodell
University of Cambridge, Godwin Lab, Department of Earth Sciences, United Kingdom (mpb68@cam.ac.uk)
Here, we describe a system for measuring triple oxygen and hydrogen isotope ratios of both the liquid and the vapour during evaporation of water in a dry gas stream (N2 or dry air) at constant temperature and relative humidity (RH). The hardware consists of a polymer glove box (COY), a peristaltic pump (Ismatec), and a Picarro L2140-i cavity ring-down laser spectrometer (CRDS) with a Standard Delivery Module (SDM). Liquid water from the evaporation pan is sampled via a closed recirculating loop and a syringe pump that delivers water to the vaporizer at a constant rate, maintaining a constant water vapour concentration in the cavity (20,000 ± 103 ppm, 1 s.d.) over an injection cycle. Liquid measurements alternate with vapour from the glove box, which is introduced to the CRDS using a diaphragm gas pump. Critically for high-precision measurements, cavity pressure and outlet valve stability are maintained throughout the liquid injection and the subsequent vapour phase. Experiments are bookended by two in-house standards calibrated to the SMOW-SLAP scales, and an additional standard is measured periodically to correct for instrumental drift.
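As an illustration of the calibration step, the Python sketch below shows a generic two-point normalisation to the SMOW-SLAP scale; the standard values used in the example are hypothetical placeholders, not the in-house standards described here.

```python
# Minimal sketch of a two-point SMOW-SLAP normalisation.
# The "true" delta values of the two bracketing standards are assumed known;
# the numbers below are illustrative only.

def two_point_calibration(measured_std1, measured_std2, true_std1, true_std2):
    """Return slope and intercept mapping measured deltas onto the SMOW-SLAP scale."""
    slope = (true_std2 - true_std1) / (measured_std2 - measured_std1)
    intercept = true_std1 - slope * measured_std1
    return slope, intercept

# Example with hypothetical d18O values (per mil):
slope, intercept = two_point_calibration(-7.95, -29.60, -8.10, -29.85)
sample_calibrated = slope * (-12.34) + intercept
```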
To test the precision and stability of the liquid injections, we sampled from an isotopically homogeneous volume of water and introduced it to the cavity over a period of ~48 h. To minimise the standard deviation arising from instrument noise, we chose an optimum integration time of ~2000 s (~33 minutes) based on minimisation of the Allan deviation (σ_Allan). For combined liquid–vapour experiments we therefore use a 40-minute injection/vapour sampling window (140 µg of water is consumed per injection), which provides a data-collection period of 33 minutes after a 7-minute equilibration period.
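The choice of integration time can be illustrated with a minimal Allan-deviation calculation. The sketch below uses synthetic 1 Hz noise in place of real CRDS data, and the candidate averaging times are illustrative; the optimum is simply the averaging time that minimises σ_Allan.

```python
import numpy as np

def allan_deviation(y, dt, taus):
    """Non-overlapping Allan deviation of series y (sampled every dt seconds)
    for each averaging time in taus (seconds)."""
    sigmas = []
    for tau in taus:
        m = int(tau // dt)           # samples per averaging window
        n = len(y) // m              # number of complete windows
        if n < 2:
            sigmas.append(np.nan)
            continue
        means = y[:n * m].reshape(n, m).mean(axis=1)
        sigmas.append(np.sqrt(0.5 * np.mean(np.diff(means) ** 2)))
    return np.array(sigmas)

# Example with synthetic white noise at ~1 Hz (e.g. d18O residuals, per mil):
rng = np.random.default_rng(0)
y = rng.normal(0.0, 0.05, size=200_000)
taus = np.array([10, 30, 100, 300, 1000, 2000, 3000])
sigma = allan_deviation(y, dt=1.0, taus=taus)
tau_opt = taus[np.nanargmin(sigma)]   # optimum integration time
```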
Across a single liquid injection, the mean standard error for δ17O, δ18O, and δD is 0.008‰, 0.007‰, and 0.02‰, respectively. For the equivalent vapour measurement, the mean standard error for δ17O, δ18O, and δD is 0.01‰, 0.009‰, and 0.03‰, respectively. For the d-excess of the liquid and the vapour across one 33-minute cycle, the standard error is 0.07‰ and 0.08‰, respectively. For the 17O-excess of the liquid and the vapour across one 33-minute cycle, the standard error is 6 per meg and 8 per meg, respectively.
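For reference, the derived quantities and a first-order propagation of the per-cycle standard errors can be sketched as follows. The error propagation assumes uncorrelated errors between the individual delta values, which is a simplification, and the reference slope of 0.528 in the 17O-excess definition is the commonly adopted value, assumed here.

```python
import numpy as np

def d_excess(dD, d18O):
    """Deuterium excess (per mil), standard definition d = dD - 8*d18O."""
    return dD - 8.0 * d18O

def O17_excess(d17O, d18O, lam=0.528):
    """17O-excess (per meg): [ln(d17O/1000 + 1) - lam*ln(d18O/1000 + 1)] * 1e6."""
    return (np.log(d17O / 1000.0 + 1.0) - lam * np.log(d18O / 1000.0 + 1.0)) * 1e6

def se_d_excess(se_dD, se_d18O):
    """Standard error of d-excess, assuming uncorrelated errors (per mil)."""
    return np.sqrt(se_dD**2 + (8.0 * se_d18O)**2)

def se_O17_excess(se_d17O, se_d18O, lam=0.528):
    """Standard error of 17O-excess, assuming uncorrelated errors (per meg).
    For small deltas, ln(d/1000 + 1) ~ d/1000, so per-mil errors scale by ~1000."""
    return np.sqrt(se_d17O**2 + (lam * se_d18O)**2) * 1000.0
```

In practice the errors on δ17O and δ18O are partially correlated, which reduces the propagated 17O-excess uncertainty below this uncorrelated estimate.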
A single evaporation experiment produces in excess of 100,000 measurements each of δ17O, δ18O, and δD for both the evaporating liquid and the resulting vapour. These measurements yield 95% confidence limits on the slope of ln(δ17O+1) vs ln(δ18O+1) of ±0.0002 and ±0.0003 for the liquid and vapour, respectively. For the slope of ln(δD+1) vs ln(δ18O+1), we obtain 95% confidence limits of ±0.001 and ±0.002 for the liquid and vapour, respectively. The experimental method permits measurement of the fractionation of triple oxygen and hydrogen isotopes of water at very high precision under varying experimental conditions (e.g., RH, temperature, turbulence). It will be useful for testing numerical models of evaporation and for conducting experiments that simulate evaporation and isotopic equilibration in natural systems. An application to closed-basin lakes will be presented.
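A slope and its 95% confidence interval can be obtained by ordinary least squares, as in the sketch below. The synthetic data with an illustrative slope of 0.518 are placeholders, and the neglect of errors in the x variable is an assumption of this example rather than a statement of the regression procedure used in the study.

```python
import numpy as np
from scipy import stats

def slope_with_ci(d18O, d17O, alpha=0.05):
    """OLS slope of ln(d17O+1) vs ln(d18O+1) with a (1 - alpha) confidence half-width.
    Delta values in per mil; errors in the x variable are neglected."""
    x = np.log(d18O / 1000.0 + 1.0)
    y = np.log(d17O / 1000.0 + 1.0)
    res = stats.linregress(x, y)
    dof = len(x) - 2
    t_crit = stats.t.ppf(1.0 - alpha / 2.0, dof)
    return res.slope, t_crit * res.stderr

# Example with synthetic data generated along an illustrative slope of 0.518:
rng = np.random.default_rng(1)
d18O = rng.uniform(-10.0, 10.0, size=100_000)
d17O = (np.exp(0.518 * np.log(d18O / 1000.0 + 1.0)) - 1.0) * 1000.0
d17O += rng.normal(0.0, 0.01, size=d18O.size)   # per-point noise (per mil)
lam, ci95 = slope_with_ci(d18O, d17O)
```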
How to cite: Brady, M. and Hodell, D.: Continuous and simultaneous measurement of triple oxygen and hydrogen isotopes of liquid and vapour during evaporation, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-16872, https://doi.org/10.5194/egusphere-egu2020-16872, 2020.