- 1Helmholtz Centre Potsdam - GFZ, Seismic Hazard and Risk Dynamics, Berlin, Germany (laruelle@gfz.de)
- 2Institute of Geosciences, University of Potsdam, 14476 Potsdam, Germany
- 3Technical University Munich, 80333 Munich, Germany
- 4Institute of Applied Geosciences, TU Darmstadt, 64287 Darmstadt, Germany
- 5Institute for Applied Geosciences, TU Berlin, 10587 Berlin, Germany
- 6Rocks Expert SARL, 244 chemin de Bertine, 04300 St. Maime, France
- 7NAGRA, National Cooperative for the Disposal of Radioactive Waste, 5430 Wettingen, Switzerland
Geomechanical-numerical modelling aims to provide a comprehensive characterization of the stress tensor within rock volumes by leveraging localized stress magnitude data for model calibration. The calibration optimizes boundary conditions to achieve the closest fit to in-situ borehole measurements of the minimum and maximum horizontal stress magnitudes. However, the high cost of acquiring stress magnitude data frequently results in sparse and incomplete datasets, which can prevent a meaningful calibration.
In this study, we use a comprehensive dataset of 50 stress magnitude data records acquired for the geomechanical characterization of the candidate siting region Zürich Nordost for a deep geological repository in northern Switzerland. We demonstrate how the size of the calibration dataset influences the accuracy and uncertainty of stress magnitude predictions in geomechanical modelling of sedimentary formations. We introduce a novel statistical approach that incrementally increases the size of calibration data subsets. This approach evaluates how the amount of available data influences stress predictions across formations with varying rock stiffness by rapidly assessing the stress states associated with a large number of different combinations of stress magnitude data. Comparing the resulting stress fields for an increasing number of calibration data points allows us to estimate the minimum number of calibration points required to achieve a stress prediction range that is as small as the range expected from inherent uncertainties in the data. The results show that fewer than 20 data points are sufficient to achieve this level of model precision and accuracy.
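The subset-sampling idea described above can be illustrated with a minimal sketch. This is not the authors' actual workflow: the data, the single-parameter calibration (a boundary-condition scaling factor fit to synthetic stress-vs-depth records), and the function names (`calibrate`, `prediction_spread`) are all hypothetical, chosen only to show how the spread of predictions shrinks as the calibration subset grows.

```python
import random
import statistics

# Hypothetical calibration records: (depth in m, Shmin in MPa).
# A linear stress gradient of ~0.017 MPa/m plus measurement noise
# stands in for real borehole stress magnitude data.
random.seed(42)
data = [(d, 0.017 * d + random.gauss(0.0, 0.5)) for d in range(300, 800, 10)]

def calibrate(subset):
    """Fit a single boundary-condition scaling factor (slope through
    the origin) to a subset of stress magnitude records by least squares."""
    num = sum(d * s for d, s in subset)
    den = sum(d * d for d, _ in subset)
    return num / den

def prediction_spread(n, trials=200, depth=600.0):
    """Standard deviation of the predicted stress at a target depth
    over many randomly drawn calibration subsets of size n."""
    preds = [calibrate(random.sample(data, n)) * depth for _ in range(trials)]
    return statistics.stdev(preds)

# The prediction range narrows as the calibration subset grows.
for n in (3, 5, 10, 20):
    print(f"n = {n:2d}  spread = {prediction_spread(n):.3f} MPa")
```

In this toy setting the spread at a given depth decreases roughly with the square root of the subset size, which mirrors the abstract's observation that beyond a certain number of calibration points the prediction range is limited by the data uncertainty itself rather than by the dataset size.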
Furthermore, a detailed analysis of the dataset reveals a data outlier linked to a local stiffness anomaly. This outlier significantly impacts the stress predictions when calibration data are limited; as the calibration dataset grows, however, its influence diminishes. We also show that our statistical approach allows for the objective identification of clear outliers with respect to the model in the calibration dataset, which affects the minimum number of data points needed for model calibration.
These findings underscore the significance of dataset size and composition in reducing uncertainties, thereby providing a framework for optimizing calibration strategies. This study offers valuable insights for subsurface projects, such as energy storage, CO2 sequestration, deep geological repositories, or geothermal energy, where precise stress predictions are critical.
How to cite: Laruelle, L., Ziegler, M., Reiter, K., Heidbach, O., Desroches, J., Giger, S., and Cotton, F.: Minimum amount of stress magnitude data for reliable geomechanical modelling, EGU General Assembly 2025, Vienna, Austria, 27 Apr–2 May 2025, EGU25-6868, https://doi.org/10.5194/egusphere-egu25-6868, 2025.