Seismic Data Compression and Telemetry Bandwidth Considerations for Earthquake Early Warning
Nanometrics, Ottawa, Canada (michaellaporte@nanometrics.ca)
Earthquake early warning (EEW) systems depend on the prompt, reliable arrival of seismic data at network data centers. Network operators invest significant resources in the design, installation and operation of real-time acquisition systems to ensure maximum data completeness and minimum data latency, allowing EEW processing modules to detect events and issue warnings as quickly as possible.
These mission-critical acquisition systems must perform before, during and after earthquakes, as main shocks are frequently preceded by foreshocks and followed by aftershocks, which can be just as dangerous. A key consideration in the design of these systems is therefore the impact that large earthquakes themselves have on the acquisition system. Seismic data is generally encoded using Steim compression, a first-difference algorithm. During large events the differences between successive samples grow, requiring more bits to encode and thus increasing the data volume. The result is a surge in the telemetry throughput required during large events. System designers and network operators must be fully aware of this effect and plan for it accordingly.
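The following minimal Python sketch is not the actual Steim1/Steim2 frame format used in miniSEED, and the amplitudes and bit-count rule are hypothetical simplifications; it only illustrates why a first-difference encoding needs more bits per sample, and hence more bandwidth, when ground motion is strong.

```python
import numpy as np

def bits_per_difference(samples):
    """Approximate bits needed to store each first difference as a signed integer.

    Simplified illustration of first-difference encodings such as Steim;
    this is not the real Steim1/Steim2 packing scheme.
    """
    diffs = np.diff(np.asarray(samples, dtype=np.int64))
    # A signed value of magnitude m needs roughly ceil(log2(m + 1)) bits plus a sign bit.
    return np.where(diffs == 0, 1, np.ceil(np.log2(np.abs(diffs) + 1)) + 1)

# Hypothetical 100 sps waveforms in digitizer counts: quiet background noise
# versus strong shaking from a large event.
rng = np.random.default_rng(0)
t = np.arange(0, 60, 0.01)                                             # 60 s at 100 sps
quiet = rng.normal(0, 50, t.size).astype(np.int64)                     # noise-level counts
strong = (2_000_000 * np.sin(2 * np.pi * 1.0 * t)).astype(np.int64)    # large-amplitude signal
strong += rng.normal(0, 50, t.size).astype(np.int64)

for name, trace in [("quiet", quiet), ("strong shaking", strong)]:
    bits = bits_per_difference(trace)
    print(f"{name:15s} mean bits/sample = {bits.mean():5.1f}  "
          f"(about {bits.mean() * 100 / 8 / 1024:.1f} kB/s at 100 sps)")
```

Under these assumptions the quiet trace compresses to a few bits per sample, while the strong-motion trace needs on the order of 18 bits per first difference, a several-fold increase in required throughput at the moment the data is most time-critical.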
This study expands on existing work to further characterize the impact of large events on seismic data compression and the corresponding spikes in throughput that real-time acquisition systems must support. The study will examine the relationship between compression and various factors, including station magnitude, hypocentral distance, sample encoding technique, packet size, sample rate and system sensitivity.
How to cite: Laporte, M., Perlin, M., Jusko, M., and Easton, D.: Seismic Data Compression and Telemetry Bandwidth Considerations for Earthquake Early Warning, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-12248, https://doi.org/10.5194/egusphere-egu24-12248, 2024.