- 1Luxembourg Institute of Science and Technology
- 2Université de Mons
Electrical Resistivity Tomography (ERT) is increasingly applied to support hydrogeological studies by providing high-resolution images that delineate geological structures, groundwater resources, or soil moisture variability. More recently, ERT has been developed as a monitoring tool, providing time-series of groundwater content over surface areas of hundreds of square meters and complementing point-based monitoring approaches. ERT systems can be deployed permanently, collecting data at high spatial resolution and at time frequencies ranging from daily to sub-hourly, and delivering near-real-time information. Advances in petrophysical relationships allow changes in electrical resistivity to be converted into calibrated soil moisture models, which in turn feed hydrological models.
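The abstract does not specify which petrophysical relationship is used; a common choice for such conversions is Archie's law. The sketch below illustrates the principle only, and all parameter values (pore-water resistivity, porosity, and the empirical exponents a, m, n) are illustrative assumptions that would need site-specific calibration:

```python
# Sketch: converting bulk electrical resistivity (ohm.m) to volumetric
# water content via Archie's law, one common petrophysical relationship.
# Parameter values are illustrative assumptions, not from the abstract.

def archie_saturation(rho_bulk, rho_water=20.0, porosity=0.35,
                      a=1.0, m=2.0, n=2.0):
    """Invert Archie's law rho = a * rho_w * phi**-m * S**-n for S."""
    return (a * rho_water / (porosity**m * rho_bulk)) ** (1.0 / n)

def water_content(rho_bulk, porosity=0.35, **kwargs):
    """Volumetric water content theta = porosity * saturation,
    with saturation clipped to its physical maximum of 1."""
    s = min(archie_saturation(rho_bulk, porosity=porosity, **kwargs), 1.0)
    return porosity * s
```

Applied cell by cell to an inverted resistivity model, this yields the calibrated soil moisture fields that can feed hydrological models.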
In addition to water content, temperature is another critical factor affecting electrical resistivity measurements: in rock and soil materials, resistivity typically changes by up to 2% per °C. Isolating signals caused by hydrological processes in ERT time-series therefore requires assessing subsurface temperature changes and correcting for their effect on resistivity. There is, however, no strong consensus on how to handle this in the processing of ERT monitoring experiments. Sinusoidal 1D models representing seasonal temperature variations are commonly applied to estimate subsurface temperatures. These models describe the phase lag and attenuation of the air temperature signal with depth through a damping factor, calibrated with in-situ data from vertical temperature profiles. While simple and independent of measured temperature time-series, these models may introduce biases when air temperature does not follow a sinusoidal seasonal pattern. Alternative approaches include interpolating data from temperature sensors installed at different depths, or applying heat transfer modelling in 1D, 2D or 3D.
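The two ingredients described above, a damped, phase-lagged sinusoidal temperature model and a linear (roughly 2% per °C) resistivity correction, can be sketched as follows. The mean temperature, amplitude, damping depth, reference temperature and correction coefficient are illustrative assumptions that would in practice be calibrated against in-situ temperature profiles:

```python
import math

# Sketch of a 1D sinusoidal subsurface temperature model and a linear
# temperature correction of resistivity. All parameter values are
# illustrative assumptions to be calibrated with field data.

def soil_temperature(z, t_days, t_mean=10.0, amp=8.0,
                     damping_depth=2.0, period=365.25, phase_days=0.0):
    """Temperature (degC) at depth z (m) and time t (days): the seasonal
    air-temperature wave, exponentially damped and phase-lagged with depth."""
    omega = 2.0 * math.pi / period
    return (t_mean
            + amp * math.exp(-z / damping_depth)
            * math.sin(omega * (t_days - phase_days) - z / damping_depth))

def correct_resistivity(rho, temp, t_ref=25.0, c=0.02):
    """Rescale a resistivity measured at `temp` to the reference
    temperature, assuming ~2% change in resistivity per degC."""
    return rho * (1.0 + c * (temp - t_ref))
```

Because resistivity decreases as temperature rises, a measurement made warmer than the reference temperature is scaled up by the correction, and vice versa.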
A remaining challenge, however, lies in how temperature models are used to correct resistivity models. Several approaches have been proposed; in general, the resistivity value of each cell of the resistivity model is simply corrected using the corresponding value of the temperature model. Such approaches do not account for the resolution of the ERT data, which is linked to the measurement sequence, nor for the effect this resolution has on the sensitivity of each measurement to temperature changes. In particular, for applications with large or small electrode spacings, or with arrays combining varying spacings, this approach may introduce relatively large artefacts in the corrected resistivity models. Here, we propose a new correction strategy that estimates the sensitivity of the ERT dataset to temperature changes by running forward modelling on a temperature-corrected homogeneous 1 Ω·m resistivity model. We compare existing and novel approaches for subsurface temperature modelling and test our approach on a real ERT dataset acquired in field conditions.
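The proposed strategy can be caricatured as follows: apply the temperature correction to a homogeneous 1 Ω·m model, then forward-model it, so that each simulated apparent resistivity's deviation from 1 Ω·m reflects that measurement's sensitivity to the temperature field. The forward operator below is a crude linear stand-in (normalized weights per measurement), not a real ERT solver such as those provided by pyGIMLi or ResIPy, and the weights and temperatures are invented for illustration:

```python
# Toy sketch of a resolution-aware temperature sensitivity estimate.
# A real implementation would replace `toy_forward` with a proper ERT
# forward solver run on the survey geometry and measurement sequence.

def temperature_corrected_model(cell_temps, t_ref=25.0, c=0.02):
    """Homogeneous 1 ohm.m model, rescaled cell by cell for temperature
    (~2% change per degC; both values are illustrative assumptions)."""
    return [1.0 * (1.0 + c * (t - t_ref)) for t in cell_temps]

def toy_forward(model, weights_per_measurement):
    """Apparent resistivity per measurement as a normalized weighted
    average of cell resistivities (stand-in for a full forward solver)."""
    responses = []
    for weights in weights_per_measurement:
        total = sum(weights)
        responses.append(
            sum(w * rho for w, rho in zip(weights, model)) / total)
    return responses
```

A measurement whose sensitivity is concentrated where the temperature change occurs then deviates more strongly from 1 Ω·m than one sensing mostly unperturbed cells, which is the information the proposed correction exploits.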
How to cite: Watlet, A., Kaufmann, O., and Gourdol, L.: Temperature correction of electrical resistivity models in time-lapse ERT experiments: towards the integration of model resolution, EGU General Assembly 2025, Vienna, Austria, 27 Apr–2 May 2025, EGU25-21714, https://doi.org/10.5194/egusphere-egu25-21714, 2025.