EGU23-5252
https://doi.org/10.5194/egusphere-egu23-5252
EGU General Assembly 2023
© Author(s) 2023. This work is distributed under
the Creative Commons Attribution 4.0 License.

Hybrid generation based on machine learning to enhance numerical simulation for earthquake

Gottfried Jacquet, Didier Clouteau, and Filippo Gatti

In recent decades, geophysicists have developed numerical simulators to predict earthquakes and other natural catastrophes. However, the more precise the model, the higher the computational burden and the longer the time to results. Moreover, even if the phenomenon could be reproduced with more complex and more representative models, the underlying uncertainty would remain significant, affecting the reliability of the final prediction. In response to this challenge, we adopt a hybrid strategy that combines physics-based numerical simulation with machine learning. The goal is to transform synthetic earthquake ground motion, obtained via physics-based simulation and accurate up to a frequency of 5 Hz, into a broader-band prediction that mimics recorded seismograms. To do so, we factorize the latent representation of the seismic signal by enforcing an encoding that splits the features into two parts: a low-frequency one (0-1 Hz) and a high-frequency one (1-20 Hz). We then train a convolutional U-Net neural network and apply two signal-to-signal translation techniques: pix2pix and BiCycleGAN. Both strategies are compared with the prior work of Gatti et al. (2020) on the Stanford Earthquake Dataset (STEAD), demonstrating their capability to mimic recorded seismograms. We finally test the two strategies on the synthetic time histories obtained for the 2019 Le Teil earthquake (France).
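The band split underlying the factorized encoding can be illustrated with a minimal sketch. This is not the authors' implementation: the function name `band_split` and the FFT-masking approach are assumptions for illustration only; the band edges (low: 0-1 Hz, high: 1-20 Hz) follow the abstract.

```python
import numpy as np

def band_split(signal, dt, f_split=1.0, f_max=20.0):
    """Split a ground-motion time history into a low-frequency part
    (0 to f_split Hz) and a high-frequency part (f_split to f_max Hz)
    by masking the real FFT spectrum. Hypothetical helper, not the
    encoding used in the paper."""
    n = signal.size
    freqs = np.fft.rfftfreq(n, d=dt)          # frequency axis in Hz
    spec = np.fft.rfft(signal)                # one-sided spectrum
    low = np.fft.irfft(np.where(freqs <= f_split, spec, 0.0), n=n)
    high = np.fft.irfft(np.where((freqs > f_split) & (freqs <= f_max),
                                 spec, 0.0), n=n)
    return low, high

# Usage: a 0.5 Hz component plus a 5 Hz component, sampled at 100 Hz.
dt = 0.01
t = np.arange(0, 10, dt)
x = np.sin(2 * np.pi * 0.5 * t) + np.sin(2 * np.pi * 5.0 * t)
lo, hi = band_split(x, dt)
# lo recovers the 0.5 Hz component, hi the 5 Hz component.
```

In the hybrid setting sketched above, the low band would come from the physics-based simulation while the learned model supplies the missing high band.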


How to cite: Jacquet, G., Clouteau, D., and Gatti, F.: Hybrid generation based on machine learning to enhance numerical simulation for earthquake, EGU General Assembly 2023, Vienna, Austria, 24–28 Apr 2023, EGU23-5252, https://doi.org/10.5194/egusphere-egu23-5252, 2023.