Fast Earthquake Magnitude Estimation using HR-GNSS time series: a Deep Learning approach
- 1 Frankfurt Institute for Advanced Studies, Frankfurt am Main, Germany
- 2 Institute of Geosciences, Goethe University Frankfurt, Frankfurt am Main, Germany
- 3 Institute of Theoretical Physics, Goethe University Frankfurt, Frankfurt am Main, Germany
Fast magnitude estimation of large earthquakes is a key task for early warning systems. In recent decades, Global Navigation Satellite System data with high-rate sampling (≥1 Hz; HR-GNSS) have provided displacement time series that are useful for analyzing large earthquakes, especially when the signals recorded by inertial sensors are saturated. Improving algorithms that contribute to fast analysis of HR-GNSS data has therefore become a recent challenge.
In this work, we propose a deep-learning-based algorithm for earthquake magnitude estimation, trained on thousands of synthetic displacement time series corresponding to earthquakes of Mw > 6.5. We adapted the model to accept a variable number of stations and variable time-series lengths as input. The algorithm can therefore be applied without any restriction on the number of stations, and the flexible input length facilitates the inclusion of data not only from local stations but also from regional stations when required. We evaluated the influence of attributes such as noise, magnitude, number of stations, epicentral distance, and input time-series length on model performance. We aim to generalize this approach to magnitude estimation of earthquakes from different tectonic regions. The robustness of the model was tested with both synthetic and real earthquake signals.
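One common way to feed a network a variable number of stations and variable trace lengths is to pad the traces into a fixed tensor and pass an accompanying mask. The sketch below illustrates this idea only; the function name, shapes, and random data are assumptions for illustration, not the authors' actual preprocessing.

```python
import numpy as np

def pad_gnss_batch(traces, max_stations, max_len, n_comp=3):
    """Pad a variable-size set of 3-component HR-GNSS displacement
    traces into a fixed tensor plus a validity mask.

    traces: list of per-station arrays, each shaped (length, n_comp).
    Returns (batch, mask): batch has shape (max_stations, max_len, n_comp);
    mask has shape (max_stations, max_len) with 1 where real samples exist,
    so downstream layers can ignore the zero padding.
    """
    batch = np.zeros((max_stations, max_len, n_comp), dtype=np.float32)
    mask = np.zeros((max_stations, max_len), dtype=np.float32)
    for i, tr in enumerate(traces[:max_stations]):
        n = min(len(tr), max_len)        # truncate overly long traces
        batch[i, :n] = tr[:n]
        mask[i, :n] = 1.0
    return batch, mask

# Hypothetical example: two stations with different record lengths
rng = np.random.default_rng(0)
traces = [rng.normal(size=(120, 3)), rng.normal(size=(300, 3))]
batch, mask = pad_gnss_batch(traces, max_stations=5, max_len=256)
print(batch.shape)                 # (5, 256, 3)
print(mask.sum(axis=1)[:2])        # [120. 256.]
```

A masked pooling step over the station axis (e.g. a mask-weighted average of per-station features) then keeps the network's output independent of how many stations happen to record a given event.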
How to cite: Quinteros-Cartaya, C., Köhler, J., Faber, J., Li, W., and Srivastava, N.: Fast Earthquake Magnitude Estimation using HR-GNSS time series: a Deep Learning approach, EGU General Assembly 2023, Vienna, Austria, 24–28 Apr 2023, EGU23-9248, https://doi.org/10.5194/egusphere-egu23-9248, 2023.