Data quality of large dense seismic networks – lessons learnt from AlpArray and application to AdriaArray
- 1 Institute of Geophysics of the Czech Academy of Sciences, Prague, Czech Republic
- 2 Christian-Albrechts-Universität zu Kiel, Institut für Geowissenschaften, Kiel, Germany
- 3 Department of Geophysics, Faculty of Science, University of Zagreb, Zagreb, Croatia
- 4 www.alparray.ethz.ch
- 5 https://orfeus.readthedocs.io/en/latest/adria_array_main.html
Large dense seismic networks deployed around the world over the last two decades enable studying wave propagation and the structure of the Earth in unprecedented detail. Hundreds of broadband seismic stations spaced by tens of kilometers produce large amounts of data, which are usually processed by automatic routines. The data are no longer supervised by seismologists at the level of every individual record, as thousands of hours of data are handled at once. Ensuring the quality of the data and the accompanying metadata is nowadays a discipline in and of itself. Besides the classical techniques, which investigate the properties of data at a single station, large dense seismic networks allow for a multi-station approach to reviewing data quality. The diagnostic tools of the multi-station methods are based on detecting stations or records that stand out as outliers among many others. Properties of the wavefield at wavelengths longer than the station spacing vary smoothly, and hence comparing measurements at neighboring stations allows anomalous behavior to be identified. These methods work under the assumption that most of the (meta)data are correct, so that a small number of outliers can be detected. Thus, large dense seismic networks not only contribute to research, which is their primary goal, but, thanks to their design, also allow data quality to be tested more precisely than before.

We review both types of techniques, showing examples from the AlpArray experiment (2015–2022), and discuss how the approaches have evolved over the years into what is nowadays applied to the AdriaArray Seismic Network. The AdriaArray experiment started in 2022 and now encompasses around 1000 permanent and over 430 temporary broadband stations. We show how the availability and retrievability of the data are checked, how amplitude and phase information from the ambient and deterministic wavefields is used to assess the correctness of metadata, and how we represent the results of these tests in maps and tables. The tests are aimed in both directions: towards the users, so that they are aware of potential issues, and towards the station operators, so that they can be notified and asked to fix the detected problems.
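The multi-station idea described in the abstract can be illustrated with a short sketch. The following Python snippet is not the AlpArray/AdriaArray code; it only demonstrates the principle under simplifying assumptions: each station is reduced to a single band-passed RMS amplitude already corrected with its nominal instrument response, distances are computed with a flat-earth approximation, and the 100 km neighborhood radius, the factor-of-3 threshold and the station names are illustrative values, not those used by the actual quality-control pipeline.

```python
# Minimal sketch of a multi-station amplitude consistency check.
# A station whose amplitude deviates strongly from the median of its
# neighbours is flagged, e.g. as a hint of a wrong gain in the metadata.
import numpy as np

def neighbour_amplitude_check(amplitudes, coords, radius_km=100.0, factor=3.0):
    """Return {station: ratio} for stations whose RMS amplitude differs
    from the median of their neighbours by more than `factor`.

    amplitudes : dict  station -> RMS amplitude of a band-passed record
    coords     : dict  station -> (latitude, longitude) in degrees
    """
    flagged = {}
    for sta, amp in amplitudes.items():
        lat0, lon0 = coords[sta]
        neigh = []
        for other, (lat, lon) in coords.items():
            if other == sta:
                continue
            # crude flat-earth distance in km (111.2 km per degree)
            d = 111.2 * np.hypot(lat - lat0, (lon - lon0) * np.cos(np.radians(lat0)))
            if d <= radius_km:
                neigh.append(amplitudes[other])
        if len(neigh) < 3:
            continue  # not enough neighbours for a robust comparison
        ratio = amp / np.median(neigh)
        if ratio > factor or ratio < 1.0 / factor:
            flagged[sta] = ratio
    return flagged

# Illustrative, synthetic values: hypothetical station "ST05" records
# amplitudes ~10x too high, as a wrong stage gain in its metadata would cause.
amps = {f"ST{i:02d}": 1.0 for i in range(1, 10)}
amps["ST05"] = 10.0
coords = {f"ST{i:02d}": (45.0 + 0.3 * i, 15.0) for i in range(1, 10)}
print(neighbour_amplitude_check(amps, coords))  # -> {'ST05': 10.0}
```

The sketch relies on the same assumption stated in the abstract: most stations are correct, so the median over a neighborhood is a trustworthy reference and only the few outliers are reported.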
How to cite: Kolínský, P., Vecsey, L., Stampa, J., Eckel, F., Belinić Topić, T., the AlpArray Working Group, and the AdriaArray Seismology Group: Data quality of large dense seismic networks – lessons learnt from AlpArray and application to AdriaArray, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-12520, https://doi.org/10.5194/egusphere-egu24-12520, 2024.