Over recent years, significant progress has been made in understanding spatio-temporal correlations of earthquake occurrence, scaling laws, earthquake clustering, and the emergence of seismicity patterns. Background and clustered seismicity occur with great spatio-temporal variability. New models being developed in statistical seismology and pattern recognition have direct implications for time-dependent seismic hazard assessment, probabilistic earthquake forecasting, and the analysis of the evolution of seismicity clusters. In many regions with complex fault systems, clusters are characterized by multiple-mainshock sequences with large aftershocks, which increase the overall hazard.
In this session, we invite researchers to present their latest results and insights on the physical and statistical models (either theoretical or based on laboratory and numerical experiments on rock fracture and friction) for the occurrence of earthquakes, foreshocks and aftershocks. Particular emphasis will be placed on:

- physical and statistical models of earthquake occurrence;
- analysis of earthquake clustering;
- spatio-temporal properties of earthquake statistics;
- quantitative testing of earthquake occurrence models;
- implications for time-dependent hazard assessment;
- methods for earthquake forecasting;
- data analyses and requirements for model testing;
- pattern recognition in seismology;
- machine learning applied to seismic data.

Confirmed solicited speaker: Ilya Zaliapin (University of Nevada, Reno, USA)

Co-organized by SM1
Convener: Stefania Gentili | Co-conveners: Rita Di Giovambattista, Álvaro González, Filippos Vallianatos
Attendance: Mon, 04 May, 14:00–15:45 (CEST)


Chat time: Monday, 4 May 2020, 14:00–15:45

Chairpersons: Stefania Gentili, Rita Di Giovambattista, Álvaro González
D2012 |
Peidong Shi, Léonard Seydoux, and Piero Poli

Monitoring and investigating the physical state of active faults is essential to understand how earthquakes begin and the physical processes involved. Traditionally, fault-state investigation strategies use seismic catalogs, whose completeness and accuracy may be limited. We propose to take advantage of the information encoded in continuous seismograms in order to extract information about fault physics more fully. We calculate the covariance matrix spectrum of continuous seismograms at an array of stations and extract features (e.g. entropy, spectral width, variance and coherency) from the covariance matrix eigenvalue spectrum. These features are related to seismic source characteristics (e.g. source location, spectral content, duration) at the time scale of analysis, and can be used to reveal different physical states of faults. The dominant frequency band of the seismic wavefield changes at different stages of fault activity. Therefore, we perform clustering to characterize the physical states of the fault based on the extracted frequency-dependent features. We apply this approach to investigate the 2009 L’Aquila earthquake. During the preparation phase of the L’Aquila earthquake, foreshocks are localized around the main active fault. In contrast, the aftershocks are dispersed over a broader area where faults have been activated by the mainshock. The extracted features and corresponding clustering results are able to capture and distinguish these patterns of earthquake distribution. In addition, the locations of the seismic sources are encoded in the covariance matrix eigenvectors. Through clustering and migration of eigenvectors, we are able to reveal the spatial and temporal variation of the different seismic sources. The method is applied here to study recent earthquakes in Italy, such as L’Aquila 2009, Emilia 2012 and Norcia 2016.
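The eigenvalue-spectrum features described above can be illustrated with a minimal sketch. The entropy and spectral-width definitions below are plausible assumptions, not the authors' code: a coherent wavefield (one dominant source) concentrates energy in the leading eigenvalue, giving low entropy and a small spectral width.

```python
import numpy as np

def eigen_features(traces):
    """Entropy and spectral width of the covariance-matrix eigenvalue
    spectrum of array seismograms, shape (n_stations, n_samples).
    Feature definitions are illustrative assumptions."""
    lam = np.linalg.eigvalsh(np.cov(traces))[::-1]   # eigenvalues, descending
    p = lam / lam.sum()                              # normalised spectrum
    entropy = -np.sum(p * np.log(p + 1e-12))         # low if one source dominates
    width = np.sum(np.arange(len(p)) * p)            # centroid of the spectrum
    return entropy, width

rng = np.random.default_rng(0)
signal = rng.standard_normal(2000)
coherent = np.tile(signal, (8, 1)) + 0.01 * rng.standard_normal((8, 2000))
noise = rng.standard_normal((8, 2000))
e_c, w_c = eigen_features(coherent)   # one dominant coherent source
e_n, w_n = eigen_features(noise)      # incoherent wavefield
```

Tracking these features in sliding windows is what allows the different physical states of the fault to be clustered.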

How to cite: Shi, P., Seydoux, L., and Poli, P.: Direct fault states assessment from wavefield properties: application to the 2009 L’Aquila earthquake, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-11655, https://doi.org/10.5194/egusphere-egu2020-11655, 2020.

D2013 |
Cataldo Godano

We have investigated the dependence of the Gutenberg-Richter b parameter on the Moho depth h in many areas of the world. We have found that b increases with h. This observation has been interpreted in terms of aftershock occurrence. Indeed, aftershocks are generated by the stress released by afterslip occurring in the ductile zone beneath the brittle one. The depth of the Moho is used here as an indicator of the coupling between the brittle and the ductile zones. As h increases, the coupling increases, generating more aftershocks. These are characterized by a higher b value, leading to our observation.
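The b values discussed above are conventionally estimated with the Aki (1965) maximum-likelihood formula. A minimal sketch on a synthetic catalogue (not the author's code):

```python
import numpy as np

def b_value_aki(mags, m_c, dm=0.0):
    """Maximum-likelihood Gutenberg-Richter b-value (Aki, 1965);
    dm is the magnitude bin width for the Utsu correction."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))

rng = np.random.default_rng(1)
# Synthetic catalogue with b = 1: magnitudes above M2.0 are exponential
# with mean excess log10(e)/b above the completeness magnitude.
mags = 2.0 + rng.exponential(scale=np.log10(np.e), size=50_000)
b_hat = b_value_aki(mags, m_c=2.0)
```

With 50,000 events, the estimate recovers the true b = 1 to within a few thousandths.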

How to cite: Godano, C.: The b variability with the Moho depth and the link between aftershocks and afterslip, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-9205, https://doi.org/10.5194/egusphere-egu2020-9205, 2020.

D2014 |
Giuseppe Petrillo, François Landes, Eugenio Lippiello, and Alberto Rosso

The organization in time, space and energy of aftershocks is characterized by scaling behaviors with exponents which are quite universal. At the same time, deviations from the universal behavior are sometimes observed and they have been proposed as a tool to discriminate aftershock from foreshock occurrence. Here we show that the change in rheological behavior of the crust with increasing depth, from velocity weakening to velocity strengthening, represents a viable mechanism to explain statistical features of both aftershocks and foreshocks. More precisely, we present a model of the seismic fault described as a velocity weakening elastic layer elastically coupled to a velocity strengthening visco-elastic layer. The model has only two parameters: one controls the degree of heterogeneities of the static friction force and the other quantifies the stress transferred between the two layers. We show that the statistical properties of aftershocks in instrumental catalogs are recovered at a quantitative level without any fine-tuning. This robustness provides a justification for the universality of aftershock phenomenological laws and supports our modelling assumptions. We also find that synthetic foreshocks mimic those observed in instrumental catalogs, opening the way for subtle forecasting tools.

How to cite: Petrillo, G., Landes, F., Lippiello, E., and Rosso, A.: The influence of the brittle-ductile transition zone on aftershock and foreshock occurrence, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-4676, https://doi.org/10.5194/egusphere-egu2020-4676, 2020.

D2015 |
Matteo Taroni and Aybige Akinci

In this study we present five- and ten-year time-independent forecasts of M≥5.0 earthquakes in Italy using only seismicity data, without any tectonic, geologic, or geodetic information. Spatially varying earthquake occurrence rates are calculated using an adaptive smoothing kernel (Helmstetter et al., 2007) that defines a unique smoothing distance for each earthquake epicenter from the distance to its n-th nearest neighbor, optimized through the Collaboratory for the Study of Earthquake Predictability (CSEP) likelihood testing methodology (Werner et al., 2007). We modify that adaptive smoothing method to include all earthquakes in the catalog (foreshocks, aftershocks and events below the completeness magnitude), multiplying each smoothing kernel by a scaling factor that varies as a function of the completeness magnitude and the number of events in each seismic cluster. Our smoothing philosophy relies on the usefulness of all earthquakes, including those with smaller magnitudes, in forecasting future seismicity.
The smoothed-seismicity Italian model, which provides the forecast seismicity rates as an expected number of M≥5.0 events per year in each 0.1°×0.1° grid cell, is constructed using the complete instrumental catalog spanning 1960 to 2019, with a completeness magnitude that decreases with time (from M4.0 to M1.8). Finally, we compare our model with the real observations and with the Italian CSEP experiment models, to check their relative performance, using the official CSEP tests (Taroni et al., 2018). In the present study, the probabilities of occurrence of future large earthquakes in the next 5 and 10 years are calculated on the assumption that earthquake processes have no memory, i.e., the occurrence of a future earthquake is independent of the occurrence of previous earthquakes from the same source (a time-independent model).
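The adaptive-kernel idea can be illustrated with a toy sketch, assuming an isotropic 2-D Gaussian kernel whose bandwidth is each event's distance to its n-th nearest neighbour; the scaling factors and CSEP likelihood optimization of the actual model are omitted:

```python
import numpy as np

def adaptive_smoothed_rate(epicentres, grid, n_neighbor=5):
    """Adaptive-kernel smoothing in the spirit of Helmstetter et al.
    (2007): each event contributes a 2-D Gaussian whose bandwidth is
    its distance to the n-th nearest other event. Returns an
    unnormalised expected-rate field on the grid points."""
    rates = np.zeros(len(grid))
    for e in epicentres:
        d = np.sort(np.linalg.norm(epicentres - e, axis=1))
        h = d[n_neighbor]                    # d[0] = 0 is the event itself
        r2 = np.sum((grid - e) ** 2, axis=1)
        rates += np.exp(-r2 / (2.0 * h * h)) / (2.0 * np.pi * h * h)
    return rates

rng = np.random.default_rng(7)
events = rng.normal(0.0, 1.0, size=(200, 2))    # cluster near the origin (km)
cells = np.array([[0.0, 0.0], [50.0, 50.0]])    # two sample grid cells
rate = adaptive_smoothed_rate(events, cells)
# The cell inside the cluster receives a far higher forecast rate.
```

Because the bandwidth shrinks where seismicity is dense, clusters stay sharp while sparse background seismicity is smoothed broadly.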

How to cite: Taroni, M. and Akinci, A.: Forecasting M≥5.0 earthquakes in Italy using a new adaptive smoothing seismicity approach, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-22578, https://doi.org/10.5194/egusphere-egu2020-22578, 2020.

D2016 |
Zakaria Ghazoui, Jean-Robert Grasso, Arnaud Watlet, Corentin Caudron, Abror Karimov, Sebastien Bertrand, Yusuke Yokoyama, and Peter van der Beek

Seismology and paleoseismology seem to be two distant sisters when we address earthquake time-interval distributions. One observation stands out: an apparent discrepancy in time-interval models, from periodic to clustered, within similar tectonic contexts. As a departure point, we use the Himalayan context, where, according to instrumental or paleoseismic catalogues, time-interval distributions have been presented as anything from Poissonian to periodic. We report on a new 6000-year lake-sediment seismic record and perform statistical analyses to show that time intervals between large (M≥6.5) earthquakes are robustly described by a Poisson distribution, while second-order fluctuations imply event clustering. These patterns are calibrated against an instrumental catalogue for the entire Himalaya; we show that both catalogues are inconsistent with periodic models. Throughout this presentation, we compare the Himalayan results with paleoseismic catalogues from three distinct tectonic settings (Indonesia, New Zealand and Jordan). Each of them displays a close-to-Poisson distribution, consistent with instrumental catalogue results. Our results imply that the occurrence of major seismic events is as uncertain as that of smaller events on any time scale, drastically increasing previous estimates of the seismic hazard.
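The Poisson-versus-periodic question reduces to testing whether inter-event times are exponential. A minimal sketch on a synthetic record (not the authors' analysis; note that estimating the rate from the same data makes the plain Kolmogorov-Smirnov test conservative, and a Lilliefors-type correction would be more rigorous):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Synthetic Poissonian record: e.g. 60 event times over 6000 years
times = np.sort(rng.uniform(0.0, 6000.0, size=60))
dt = np.diff(times)
# Under a Poisson process inter-event times are exponential with
# mean dt.mean(); compare the empirical intervals with that fit.
ks_stat, p_value = stats.kstest(dt, "expon", args=(0.0, dt.mean()))
# A large p_value means exponentiality (Poisson occurrence)
# cannot be rejected; a periodic record would fail this test badly.
```

For a strictly periodic record (all intervals equal), the same test rejects the exponential model decisively.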

How to cite: Ghazoui, Z., Grasso, J.-R., Watlet, A., Caudron, C., Karimov, A., Bertrand, S., Yokoyama, Y., and van der Beek, P.: Does the seismic cycle slip toward randomness?, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-14510, https://doi.org/10.5194/egusphere-egu2020-14510, 2020.

D2017 |
Renata Rotondi and Elisa Varini

Physics-based models focus on the generation process of individual earthquakes, but the strong space-time interaction among the events of seismic sequences requires that seismic activity be studied as a whole, through statistics-based models, in order to forecast its future trend. Properties such as long-range interactions, power-law distributions and fractal geometries are common to all complex systems and are also shared by earthquakes and fault systems. Over recent years, on the one hand, many studies have shown the inability of classical statistical mechanics to treat complex systems exhaustively; on the other hand, the application of the Tsallis entropy Sq - a generalization of the classical Boltzmann-Gibbs entropy in a non-extensive sense - has led to the long-tailed power-law distributions typical of complex systems (Vallianatos et al. 2018). Hence, non-extensive statistical physics seems to offer an appropriate framework for investigating complex phenomena. In this work we follow this approach, giving a detailed statistical treatment of its application to Italian earthquake sequences covering a period of some years; each data set was partitioned into moving time windows.

Given a continuous variable X with probability distribution f(X), maximizing the Tsallis entropy under appropriate constraints, such as the generalized expectation value and the normalization constant, yields a q-exponential distribution for f(X). The entropic index q can assume positive values smaller or larger than 1: in the former case the system is in a super-additive state and f(X) is defined on a finite domain depending on the model parameters; in the latter case the system is in a sub-additive state and f(X) is defined on R+. Through a variable transformation required by the fragment-asperity model of earthquake generation, one derives the probability distribution of the magnitude from the two versions of f(X). Following the Bayesian approach, we have estimated the parameters by generating random samples from the posterior distributions through the Metropolis-Hastings algorithm; moreover, in each time window, we have evaluated the Tsallis entropy and compared the performance of the two versions of the magnitude distribution in terms of marginal posterior likelihood. The temporal variations of the q-index and of the entropy Sq can be helpful in identifying which dynamical regime the system is in, and therefore in improving our ability to forecast the evolution of seismicity. Some of the results achieved partially disagree with those in the literature (Vallianatos et al. 2018); what seems reasonable is to consider a change in one of these variables, rather than a specific trend, as an index of a phase change of the physical system.
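As a concrete reference for the q-exponential form underlying the two regimes above (the standard Tsallis definition, not the authors' implementation):

```python
import numpy as np

def q_exp(x, q):
    """Tsallis q-exponential: exp_q(x) = [1 + (1 - q) x]_+ ** (1/(1-q)),
    which recovers exp(x) in the limit q -> 1."""
    x = np.asarray(x, dtype=float)
    if np.isclose(q, 1.0):
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    # Outside the support (base <= 0) the q-exponential is zero.
    return np.where(base > 0.0, np.abs(base) ** (1.0 / (1.0 - q)), 0.0)

# For a density f(x) proportional to exp_q(-x/x0):
#   q > 1: support on all of R+, with a power-law tail;
#   q < 1: support cut off at a finite x, as in the super-additive state.
val_q1 = q_exp(1.0, 1.0)       # = e, the classical limit
val_tail = q_exp(-0.5, 1.5)    # = 1.25**(-2) = 0.64
val_cut = q_exp(-3.0, 0.5)     # beyond the q < 1 cutoff, so 0
```

The finite-domain versus R+ distinction in the abstract is exactly this change of support with q.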


Vallianatos F., Michas G. and G. Papadakis (2018) Nonextensive statistical seismology: An overview, from: Complexity and Seismic Time Series. Measurement and Application, eds. Chelidze T., Vallianatos F., Telesca L., Elsevier, 25-59

How to cite: Rotondi, R. and Varini, E.: Variations in the temporal evolution of seismicity pointed out by non-extensive statistical physics approach, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-10098, https://doi.org/10.5194/egusphere-egu2020-10098, 2020.

D2018 |
Ilya Zaliapin and Yehuda Ben-Zion

We attempt to track and quantify preparation processes leading to large earthquakes using two complementary approaches: (a) localization of brittle deformation, manifested by the evolving fractional volume with seismic activity, and (b) coalescence of earthquakes into clusters. We analyze seismicity catalogs from Southern California (SoCal), the Parkfield section of the San Andreas Fault (SAF), and the region around the 1999 Izmit and Duzce earthquakes in Turkey.

Localization of deformation is estimated using the Receiver Operating Characteristic (ROC) approach. Specifically, we consider the temporal evolution of the fractional volume 0 ≤ V(q) ≤ 1 occupied by the fraction 0 ≤ q ≤ 1 of active voxels with mainshocks. We also consider the localization of the spatial intensity of mainshocks within a sliding time window with respect to the time-averaged distribution, quantified by the Gini coefficient G. The significance of the results is assessed using reshuffled catalogs. Analysis within the rupture zones of large earthquakes indicates a decrease of V(q) and an increase of G (increased localization) prior to the Landers (1992, M7.3), El Mayor-Cucapah (2010, M7.2), Ridgecrest (2019, M7.1), and Duzce (1999, M7.2) mainshocks. We also observe ongoing damage production by the background seismicity around these rupture zones several years before their occurrence. In contrast, we observe an increase of V(q) and a decrease of G prior to the Parkfield (2004, M6.0) mainshock in the creeping section of the SAF. Next, we examine the quasi-linear region in the eastern part of Southern California around the Imperial fault, Brawley seismic zone, southern SAF and Eastern California Shear Zone. We document four cycles of background localization, measured by V(q) and G, well aligned in time with the largest events in the region: Landers, Hector Mine, El Mayor-Cucapah, and Ridgecrest. The coalescence process is represented by a time-oriented graph that connects each earthquake in the examined catalog to all earlier earthquakes whose nearest-neighbor proximity is below a specified threshold. We examine the size of the clusters that correspond to low thresholds, and hence represent active clustering episodes. We document an increase of the average cluster size prior to the Landers, El Mayor-Cucapah, Ridgecrest and Duzce mainshocks, and a decrease of the average cluster size prior to the Parkfield mainshock.
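The Gini localization measure G used above can be sketched with the standard Gini formula applied to event counts over spatial cells; the authors' voxelization, sliding windows and reshuffling tests are omitted:

```python
import numpy as np

def gini(counts):
    """Gini coefficient of event counts over spatial cells:
    0 for perfectly uniform seismicity, approaching 1 when all
    events concentrate in a few cells."""
    x = np.sort(np.asarray(counts, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    return 2.0 * np.sum(i * x) / (n * x.sum()) - (n + 1.0) / n

uniform = np.ones(100)          # 100 events spread evenly over 100 cells
localised = np.zeros(100)
localised[:5] = 20.0            # same total, concentrated in 5 cells
g_uniform = gini(uniform)       # ~0: fully delocalised
g_localised = gini(localised)   # close to 1: strongly localised
```

An increase of G in a sliding window thus corresponds to the progressive localization of damage reported before the large mainshocks.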

The results of our complementary localization and coalescent analyses consistently indicate progressive localization of damage prior to the largest earthquakes on non-creeping faults and de-localization on the creeping Parkfield section of SAF. These findings are consistent with analysis of acoustic emission data. The study is a step towards developing methodology for analyzing the dynamics of seismicity in relation to preparation processes of large earthquakes, which is robust to spatio-temporal fluctuations associated with aftershock sequences, data incompleteness and common catalog errors.

How to cite: Zaliapin, I. and Ben-Zion, Y.: Quantifying preparation process of large earthquakes: Damage localization and coalescent dynamics, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-12056, https://doi.org/10.5194/egusphere-egu2020-12056, 2020.

D2019 |
Hugo Sanchez-Reyes, David Essing, Eric Beauce, and Piero Poli

Our knowledge of the physics behind the initiation process of large or small earthquakes remains limited. The current understanding of this process suggests that an earthquake occurs when increasing stress causes a pre-existing fault to fail suddenly (e.g. Dieterich, 1992). Models such as pre-slip instability growth or a triggered cascade of events have been proposed to explain this preparation stage theoretically (Dodge and Beroza, 1996; Ellsworth and Bulut, 2018; Bouchon et al., 2011). However, the mechanisms behind this process are still unknown. This debate is mainly due to the lack of direct observations of the subsurface shear-stress evolution in the area of interest before and after an earthquake.

Considering that the shear stress evolves through time until the moment of failure, indirect observations of this change might be available but hidden inside the continuous seismic data. In this work, we analyze in detail the evolution of the seismic activity of a small (Mw 4.4) normal-fault earthquake which occurred in Central Italy on 7 November 2019 in the middle-lower crust (16 km depth). We first analyze the available continuous data using the Fast Matched Filter (Beaucé et al., 2017). Then, every newly detected event is located relative to the other events with the double-difference (DD) algorithm. As a result, we obtain the spatio-temporal evolution of the foreshock and aftershock sequences of that event.
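The matched-filter detection step can be illustrated in pure NumPy; the authors use the Fast Matched Filter package, so this is only a sketch of the underlying normalised cross-correlation it computes efficiently:

```python
import numpy as np

def matched_filter(trace, template):
    """Slide a template along a continuous trace and return the
    normalised cross-correlation at every lag (values in [-1, 1])."""
    n = len(template)
    t = (template - template.mean()) / template.std()
    cc = np.empty(len(trace) - n + 1)
    for i in range(len(cc)):
        w = trace[i:i + n]
        s = w.std()
        cc[i] = float(np.dot(t, w - w.mean()) / (n * s)) if s > 0 else 0.0
    return cc

rng = np.random.default_rng(4)
template = np.sin(np.linspace(0, 8 * np.pi, 200))   # synthetic wavelet
trace = 0.05 * rng.standard_normal(2000)            # background noise
trace[700:900] += template                          # buried repeat event
cc = matched_filter(trace, template)
# Detections are declared where cc exceeds a threshold, commonly a
# multiple of the median absolute deviation of the cc time series.
```

The correlation peaks at the lag where the buried event starts, which is how events below the catalogue's detection threshold are recovered.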

The results of this analysis shed light on the patterns that the spatio-temporal evolution of the shear stress follows before and after a given event. We therefore expect that this study will contribute to improving our understanding of the physics behind the earthquake initiation process.

How to cite: Sanchez-Reyes, H., Essing, D., Beauce, E., and Poli, P.: A detailed study of the initiation process of a small (Mw4.4) normal fault earthquake in the middle lower crust, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-11661, https://doi.org/10.5194/egusphere-egu2020-11661, 2020.

D2020 |
Yijian Zhou, Shiyong Zhou, Hao Zhang, Yu Hou, Weilai Pei, Longtan Wang, and Naidan Yun

The Xiaojiang Fault (XJF) lies at the southeastern edge of the rhombic Sichuan-Yunnan block and extends for over 400 km from Qiaojia to the Shanhua district. The Sichuan-Yunnan block experiences clockwise rotation and southwestward escape from the Tibetan Plateau, producing complex fault geometry and seismicity patterns. The strong variation along fault segments provides a special opportunity to study the relationship between fault-zone properties and seismicity patterns. However, the fine structure of the XJF remains unknown due to sparse station coverage.

Seismic data have a unique advantage in resolving fault-zone properties at depth. We deployed 48 broadband seismometers along the XJF in order to capture detailed seismicity patterns. Our seismic network covers the northern and middle parts of the XJF, with an average aperture of 20 km; continuous observation from 2015 to 2019 guarantees a sufficient data volume. We detected about 12,000 earthquakes by STA/LTA phase picking and association, and augmented the detections to over 50,000 events with template matching. The relocated catalog has lateral and vertical resolutions of 500 m and 1 km, respectively; the magnitude of completeness (Mc) reaches ML0.3.
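The STA/LTA detection step mentioned above can be sketched as a textbook ratio detector (not the network's production picker):

```python
import numpy as np

def sta_lta(trace, nsta, nlta):
    """Short-term over long-term average of the absolute amplitude;
    a trigger is declared where the ratio exceeds a threshold."""
    env = np.abs(trace)
    sta = np.convolve(env, np.ones(nsta) / nsta, mode="same")
    lta = np.convolve(env, np.ones(nlta) / nlta, mode="same")
    return sta / np.maximum(lta, 1e-12)    # guard against division by zero

rng = np.random.default_rng(5)
trace = 0.01 * rng.standard_normal(3000)                    # quiet background
trace[1500:1550] += np.sin(np.linspace(0, 10 * np.pi, 50))  # impulsive arrival
ratio = sta_lta(trace, nsta=20, nlta=400)
# The ratio stays near 1 on noise and jumps at the arrival, where a
# threshold (e.g. ratio > 3) would declare a pick.
```

STA/LTA picks seed the initial catalogue; template matching then multiplies the detections, as in the abstract's 12,000-to-50,000 augmentation.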

This high-resolution catalog depicts detailed 3D fault geometry. The seismicity shows a clustered lateral distribution, with the clusters’ depth extent ranging from 20 km on the northern to 35 km on the southern segments. Unmapped orthogonal faults on the northern XJF are illuminated by seismicity, matching the orthogonal topographic characteristics. Repeating events are detected from 8 seismicity clusters, under a threshold of 5 repeating families, indicating a creeping slip mode, while the separated low-seismicity segments exhibit a high locking rate. Taking advantage of the high detectability, we obtained reliable b-value estimates for the different segments of the XJF. The low-b regions correlate well with the margins of locked patches, pointing to high stress concentration. Velocity structure extracted from ambient noise and fault-zone head waves presents similar spatial variation, further supporting the seismicity pattern. The highly heterogeneous character of the XJF may produce stress barriers, preventing future earthquake ruptures from propagating to a large scale.

How to cite: Zhou, Y., Zhou, S., Zhang, H., Hou, Y., Pei, W., Wang, L., and Yun, N.: Stress State and Fault Strength Variation along Xiaojiang Fault Zone Revealed by Seismicity, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-3535, https://doi.org/10.5194/egusphere-egu2020-3535, 2020.

D2021 |
Rubén García-Hernández, Luca D'Auria, José Barrancos, and Germán D. Padilla

The estimation of the spatial and temporal variations of the b-value of the Gutenberg-Richter law is of great importance in different seismological applications. However, its estimate depends strongly on the selected spatial and/or temporal scale, due to the heterogeneous distribution of seismicity. This is especially relevant in volcanic and geothermal areas, where dense clusters of earthquakes often overlap with the background seismicity.

For this reason, we propose a novel multiscale approach that allows a consistent estimation of the b-value regardless of the considered spatial and/or temporal scales. Our method, named MUST-B (MUltiscale Spatial and Temporal estimation of the B-value), consists of computing estimates of the b-value at multiple temporal and spatial scales and extracting, for a given spatio-temporal point, a statistical estimator of its value as well as an indication of the characteristic spatio-temporal scale. The approach also includes a consistent estimation of the completeness magnitude (Mc), of the uncertainties on both b and Mc, and of the seismic energy release rates.

We applied this method to the seismic dataset of the El Hierro submarine eruption, which started in October 2011 and was preceded by a seismic unrest episode that began in July 2011. The seismicity showed a very complex spatial distribution, which also changed over time, migrating from the north of the island to the south. The results show that high-resolution 4D mapping is of great importance for understanding the distribution of the seismic energy release in volcanic islands, which is possibly correlated with variable geothermal fluid-flow paths and/or magmatic sources. Remarkably, even for highly heterogeneous catalogues such as the 2011 El Hierro dataset, the MUST-B method provides reliable estimates.

How to cite: García-Hernández, R., D'Auria, L., Barrancos, J., and Padilla, G. D.: 4D imaging of the seismic energy release before the 2011 El Hierro eruption (Canary Islands), EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-5009, https://doi.org/10.5194/egusphere-egu2020-5009, 2020.

D2022 |
Yifan Yin, Stefan Wiemer, Edi Kissling, Federica Lanza, and Bill Fry

Crustal earthquakes in regions with low deformation rates are rare on the scale of a human lifespan but can cause heavy losses when they occur. Limited observations also hinder robust earthquake forecasts. In this study, we use a high-resolution catalog to investigate the triggering of the 2010-2011 Canterbury earthquake sequence, New Zealand. The sequence occurred in the North Canterbury Plains, a low-stress, low-seismicity region relatively close to active plate boundaries where large earthquakes are frequent, such as the 2009 MW 7.8 Dusky Sound earthquake. To map the post-seismic stress transfer of remote large events acting on the region, we calculate the temporal and spatial seismic rate changes in the crust from 2005 to the 2010 Mw 7.1 Darfield earthquake, the first mainshock of the Canterbury sequence. We use template-matching analysis to obtain a new high-resolution seismic catalog that includes events previously undetected by routine network monitoring. Detection quality is further established using a Support Vector Machine classifier. Using the new catalog, we observe a seismic quiescence on the North Canterbury Plain between the Dusky Sound and Darfield earthquakes. The quiescence is accompanied by a reduced rate of micro-seismicity, suggesting a lowered b-value in the region primed for the Canterbury sequence. The lack of evidence for dynamic or static triggering suggests that complex fault interactions led to the onset of the Darfield earthquake.

How to cite: Yin, Y., Wiemer, S., Kissling, E., Lanza, F., and Fry, B.: Seismic rate change as a tool to investigate remote triggering of the 2010-2011 Canterbury earthquake sequence, New Zealand, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-4051, https://doi.org/10.5194/egusphere-egu2020-4051, 2020.

D2023 |
Rosastella Daminelli and Alberto Marcellini

The negative exponential distribution of magnitude (the well-known Gutenberg-Richter relation) and the negative exponential distribution of interarrival times constitute the backbone of seismic hazard analysis.

Our goal is to check if these two distributions could be considered an acceptable model also for aftershock sequences.

We analysed several aftershock sequences, with mainshocks ranging from M=5.45 to M=7.3: six sequences of Californian earthquakes selected from the SCEC database and an Italian sequence selected from the INGV-CNT catalog.

The results show that the G-R relation fits the data remarkably well, with a β value ranging from -1.8 to -2.4. The temporal behaviour shows an acceptable fit to the negative exponential distribution: all the sequences exhibit a good fit for Δt>2.5 hours, whereas for Δt<2.5 hours a Weibull distribution is more suitable.

How to cite: Daminelli, R. and Marcellini, A.: Statistical Behaviour of Time Occurrences and Magnitude of Aftershock Sequences, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-10773, https://doi.org/10.5194/egusphere-egu2020-10773, 2020.

D2024 |
Álvaro González, Isabel Serra, and Álvaro Corral

Earthquakes cluster in time more tightly than in a Poisson process, in which events would be independent of each other and of when each one occurred. This tight clustering should be considered when forecasting the probability of occurrence of earthquakes in a given time period.

Nevertheless, the standard analysis of temporal earthquake occurrence usually proceeds by “declustering” the earthquake time series, trying to identify aftershocks or other triggered events and then pruning them from the sample, leaving only the supposedly independent events. This procedure attempts to artificially make the process Poissonian-like (so that the probability of the next earthquake is forced to be constant in time).

Since there is not a unique way of identifying triggered earthquakes, this removal is subjective to some degree (it involves lack of knowledge about the process, that is, epistemic uncertainty). Such a method also reduces the sample itself, reducing the power of any statistical inference made with it (in other words, with fewer events it is more difficult to distinguish which model best fits the data).

An example of this issue is the debate on whether the recent surge of great earthquakes (magnitude 8 or larger) since 2004 is random or not. If they were Poissonian, the distribution of time intervals between them should be exponential. The answer may depend on whether triggered events are artificially removed from the sample or not.

In this research, we explore in a comprehensive way the statistical distribution of time intervals between consecutive earthquakes worldwide. We use a complete earthquake catalogue, and do not attempt to separate triggered from independent events.

We consider different magnitude thresholds, and for each of them test which statistical distribution (such as Weibull, gamma or exponential) best fits the data. This enables us to quantitatively assess whether there is a universal distribution or, on the contrary, if it depends on the magnitude range considered. Also, we test whether Poissonian occurrence can be rejected for the whole series of the largest earthquakes.
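The model-selection step described above can be sketched with maximum-likelihood fits compared via AIC on synthetic intervals; this is an illustrative approach, and the authors' actual fitting and testing procedure may differ:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Synthetic inter-event times from a gamma distribution with shape < 1,
# i.e. more clustered at short intervals than a Poisson (exponential) process
dt = rng.gamma(shape=0.7, scale=100.0, size=2000)

def aic(log_likelihood, n_params):
    """Akaike information criterion: lower is better."""
    return 2 * n_params - 2 * log_likelihood

candidates = {"exponential": (stats.expon, 1),
              "gamma": (stats.gamma, 2),
              "weibull": (stats.weibull_min, 2)}
scores = {}
for name, (dist, k) in candidates.items():
    params = dist.fit(dt, floc=0)                    # location fixed at zero
    scores[name] = aic(np.sum(dist.logpdf(dt, *params)), k)
best = min(scores, key=scores.get)
# For clustered synthetic data, the exponential (Poisson) model loses
# to the heavier-tailed gamma/Weibull alternatives.
```

Repeating such a comparison for each magnitude threshold is one way to test whether a single distribution family holds across magnitude ranges.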

Finally, we show how this distribution, calibrated for each magnitude range, can be used for calculating probabilities of earthquake occurrence in a time period of interest, in a more realistic way than typically achieved.

How to cite: González, Á., Serra, I., and Corral, Á.: The global statistical distribution of time intervals between consecutive earthquakes, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-13309, https://doi.org/10.5194/egusphere-egu2020-13309, 2020.

D2025 |
Filippos Vallianatos

Boltzmann-Gibbs (BG) statistical physics is one of the cornerstones of contemporary physics. It establishes a remarkably useful bridge between the microscopic mechanical laws and the macroscopic description of classical thermodynamics. If long-range interactions, non-Markovian microscopic memory, multifractal boundary conditions or multifractal structures are present, then a type of statistical mechanics other than BG seems appropriate to describe nature (Tsallis, 2001).

To overcome at least some of these anomalies that seem to violate BG statistical mechanics, non-extensive statistical physics (NESP) was proposed by Tsallis (1988); it recovers extensive BG statistics as a particular case. The associated generalized entropic form Sq is controlled by the entropic index q, which represents a measure of the non-additivity of a system; Sq recovers SBG in the limit q→1. For a variable X with a probability distribution p(X) - such as the seismic moment, the inter-event times or distances between successive earthquakes, or the lengths of faults in a given region - NESP yields a physical probability expressed by a q-exponential function, as defined in Tsallis (2009). Another type of distribution deeply connected to statistical physics is that of the squared variable X2. In BG statistical physics, the distribution of X2 corresponds to the well-known Gaussian distribution. If we optimize Sq for X2, we obtain a generalization of the normal Gaussian known as the q-Gaussian distribution (Tsallis, 2009). In the limit q→1, the normal Gaussian distribution is recovered. For q>1, the q-Gaussian distribution has power-law tails with slope -2/(q-1), thus enhancing the probability of extreme values.

In the present work we review a collection of Earth-physics problems, such as a) NESP pathways in the earthquake size distribution, b) the effect of mega-earthquakes, c) the spatio-temporal description of seismicity, d) plate tectonics as a case of non-extensive thermodynamics, e) laboratory seismology and fracture, f) the non-extensive nature of the Earth's ambient noise, and g) evidence of non-extensivity in earthquakes' coda waves. These cases cover most of the problems in Earth physics, indicating that non-extensive statistical physics could be the underlying interpretative tool for understanding the Earth's evolution and dynamics.

We can state that the study of the non-extensive statistical physics of Earth dynamics remains wide open, with many significant discoveries still to be made. The results of the analyses in the cases described above indicate that the ideas of NESP can express the non-linear dynamics that control the evolution of Earth dynamics at different scales. The key scientific challenge is to understand, in a unified way based on NESP principles, the physical mechanisms that drive the evolution of fracture ensembles from the laboratory to the global scale, and how we can use measures of this evolution to forecast extreme fracture events rigorously and consistently.

Acknowledgments. We acknowledge support by the project “HELPOS – Hellenic System for Lithosphere Monitoring” (MIS 5002697), implemented under the Action “Reinforcement of the Research and Innovation Infrastructure”, funded by the Operational Programme "Competitiveness, Entrepreneurship and Innovation" (NSRF 2014-2020) and co-financed by Greece and the European Union (ERDF).


How to cite: Vallianatos, F.: A non-extensive statistical physics view in Earth Physics: Geodynamic properties in terms of Complexity theory, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-3749, https://doi.org/10.5194/egusphere-egu2020-3749, 2020.

D2026 |
Gianfranco Cianchini and the SAFE Team

Earthquakes, the most energetic phenomena in the lithosphere, often cause damage and casualties; their study and comprehension are therefore of obvious importance for society. Geosystemics offers a way to study the Earth system by viewing it as a whole, looking at the possible couplings among the different geo-layers, i.e., from the Earth's interior up to the ionosphere through the atmosphere. It uses specific universal tools to integrate different methods that can be applied to multi-parameter data, often acquired on different platforms (e.g., ground, marine or satellite observations). Its main aim is to understand the phenomenon of interest from a holistic point of view. Central is the use of entropy, together with other physical quantities introduced case by case. In this paper, we deal with earthquakes as the final part of a long-term chain of processes involving not only the interaction between different components of the Earth's interior but also the coupling of the solid Earth with the overlying neutral or ionized atmosphere, culminating with the main rupture along the fault of concern. Particular emphasis will be given to some Italian seismic sequences.

How to cite: Cianchini, G. and the SAFE Team: Geosystemics View of Earthquakes, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-22588, https://doi.org/10.5194/egusphere-egu2020-22588, 2020.

D2027 |
Vladimir Kossobokov, Polina Schepalina, and Anastasia Nekrasova

Understanding the cyclic and other forces governing geodynamics may provide fundamental clues for unraveling the characteristics of earthquake occurrence, which remains spectacular evidence of plate tectonics, fueled by tidal drag and the associated global cooling of the Earth (Riguzzi et al., 2010; Doglioni and Panza, 2015).

To check the hypothesis of earthquake-preferred days, the nonparametric Kuiper test for cyclic variations is applied to the empirical distributions of earthquake origin times versus the solar (Julian Day, JD) or lunar (Moon Phase, MP) cycles. We present the results of the Kuiper test applied to the seismicity of the Lake Baikal and Yúnnán-Sichuan regions, aimed at verifying, on a solid statistical basis, the hypothesis that earthquake origin-time JDs and MPs are uniformly distributed with respect to the earthquake magnitude cut-off.
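As a sketch (assuming origin times have already been folded to a phase in [0, 1), e.g. the fraction of a lunar cycle), the Kuiper statistic underlying such uniformity tests can be computed as follows:

```python
def kuiper_statistic(samples):
    """Kuiper statistic V = D+ + D- of the sample against the uniform
    distribution on [0, 1).  Unlike the Kolmogorov-Smirnov statistic, V is
    invariant under cyclic shifts of the data, which makes it suited to
    circular variables such as day-of-year or moon phase."""
    x = sorted(samples)
    n = len(x)
    d_plus = max((i + 1) / n - xi for i, xi in enumerate(x))   # ECDF above CDF
    d_minus = max(xi - i / n for i, xi in enumerate(x))        # ECDF below CDF
    return d_plus + d_minus
```

Perfectly uniform phases give a small V, while phases concentrated on a preferred day give V close to its maximum of 1; significance levels then follow from the Kuiper distribution.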

How to cite: Kossobokov, V., Schepalina, P., and Nekrasova, A.: Earthquake preferred days in the Lake Baikal and Yunnan-Sichuan Regions, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-720, https://doi.org/10.5194/egusphere-egu2020-720, 2020.

D2028 |
Polyzois Bountzis, Tasos Kostoglou, Vasilios Karakostas, and Eleftheria Papadimitriou

Earthquake clustering investigation reveals characteristics of earthquake dynamics, such as the evolution of mainshock-aftershock sequences and swarms. For this investigation we applied a method based on a bivariate stochastic point process, the Markovian arrival process (MAP) (Neuts, 1979), (Nt,Jt)t∈ℜ+, whose intensity function λt is modulated by a latent Markov process Jt. Each hidden state corresponds to a distinct seismicity rate of the counting process Nt, enabling the modeling of the fluctuations between triggered and background seismicity, as well as of clustering evolution. We assume that the physical mechanisms governing earthquake clustering are unknown and that the prevailing parameter for separating the background seismicity from seismic excitations is the seismicity rate. The consistency of the identified clusters is evaluated on the seismicity of the central Ionian area, comprising the Lefkada and Kefalonia Islands. This is an active boundary characterized by remarkably high seismic activity, with frequent strong mainshocks (M≥6.0), and covered by a dense monitoring network. The method is applied both to a recent, highly accurate relocated earthquake catalog with a low completeness magnitude and to an earthquake catalog of longer duration, from 2008 to 2017, including well-studied and very productive aftershock sequences.
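The abstract does not give implementation details; as an illustrative sketch, the simplest MAP of this kind, a two-state Markov-modulated Poisson process with hypothetical rates standing in for "background" and "triggered" seismicity, can be simulated with competing exponential clocks:

```python
import random

def simulate_mmpp(rates, switch, t_max, seed=0):
    """Simulate a two-state Markov-modulated Poisson process, a simple
    special case of a Markovian arrival process (MAP).

    rates  -- event rate in each hidden state (e.g. background vs. cluster)
    switch -- rate of leaving each hidden state
    Returns (event_times, state_at_each_event)."""
    rng = random.Random(seed)
    t, state = 0.0, 0
    events, states = [], []
    while t < t_max:
        # Exponential waiting times to the next event and to the next
        # state switch; memorylessness lets us restart both clocks.
        dt_event = rng.expovariate(rates[state])
        dt_switch = rng.expovariate(switch[state])
        if dt_event < dt_switch:
            t += dt_event
            if t < t_max:
                events.append(t)
                states.append(state)
        else:
            t += dt_switch
            state = 1 - state          # jump of the latent Markov chain
    return events, states

times, states = simulate_mmpp(rates=(0.1, 5.0), switch=(0.02, 0.5), t_max=1000.0)
```

The resulting catalog alternates between quiet stretches and bursts, mimicking the rate fluctuations that the hidden states of the MAP are meant to capture; inference on real catalogs then recovers the state sequence from the event times.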


This research is co-financed by Greece and the European Union (European Social Fund- ESF) through the Operational Programme «Human Resources Development, Education and Lifelong Learning 2014-2020» in the context of the project “Kinematic properties, active deformation and stochastic modelling of seismogenesis at the Kefalonia - Lefkada transform zone” (MIS-5047845).

How to cite: Bountzis, P., Kostoglou, T., Karakostas, V., and Papadimitriou, E.: A study of earthquake clustering in central Ionian Islands through a Markovian Arrival Process, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-4788, https://doi.org/10.5194/egusphere-egu2020-4788, 2020.

D2029 |
Stefania Gentili and Rita Di Giovambattista

In this study, we have applied to the medium-low seismicity of northeastern Italy and western Slovenia an algorithm for strong-aftershock forecasting that we originally developed for the medium-high seismicity of Italy (Gentili and Di Giovambattista, 2017). The method, called NESTORE (Next STrOng Related Earthquake), analyzes the seismicity after medium and strong earthquakes in order to forecast whether a subsequent large earthquake (SLE) will follow. An SLE following a mainshock can cause significant damage to already weakened buildings and infrastructure, so timely advisory information to civil protection is of great interest for effective decision-making. We performed the analysis on different time spans after the mainshock, in order to simulate the increase of available information as time passes during a seismic cluster. NESTORE subdivides the clusters of seismicity into two classes: "A" if the difference in magnitude between the mainshock and the strongest aftershock is lower than 1, and "B" otherwise. Several statistical features based on the time-space and energy evolution of seismicity are analyzed separately and, if they are sufficiently informative for SLE forecasting, they are used to train independent decision trees. The results are merged by a Bayesian approach, yielding a time-dependent probability Prob(A) of having an A cluster, i.e. of having at least one SLE. This study is possible thanks to the OGS bulletins, an accurate local catalogue characterized by low completeness magnitude, compiled by the National Institute of Oceanography and Experimental Geophysics. We tested the method on the highly destructive 1976 Friuli cluster (mainshock magnitude 6.5, the strongest in the region in the last 80 years) and on two small clusters of seismicity in NE Italy in 2019, obtaining encouraging results: 6 hours after the mainshock, NESTORE supplies Prob(A)=98% for the two A clusters and Prob(A)=2% for the B one.
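The abstract does not spell out the Bayesian merging rule; one common choice, sketched here under a conditional-independence (naive Bayes) assumption with an illustrative function name and prior that are not from NESTORE, combines the per-feature posteriors in log-odds space:

```python
import math

def merge_posteriors(posteriors, prior=0.5):
    """Combine per-classifier posteriors P(A | feature_i) into a single
    P(A | all features), assuming the features are conditionally
    independent given the class.  Working in log-odds keeps the product
    of likelihood ratios numerically stable."""
    prior_logit = math.log(prior / (1.0 - prior))
    # Each classifier contributes its log-odds relative to the prior.
    logit = prior_logit + sum(
        math.log(p / (1.0 - p)) - prior_logit for p in posteriors)
    return 1.0 / (1.0 + math.exp(-logit))
```

With a flat prior, two classifiers each reporting 0.9 merge to about 0.99, illustrating how several moderately informative features can yield the sharp Prob(A) values quoted above.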

How to cite: Gentili, S. and Di Giovambattista, R.: How strong will be the following earthquake?, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-8184, https://doi.org/10.5194/egusphere-egu2020-8184, 2020.

D2030 |
Rita Di Giovambattista, Giovanna Calderoni, and Antonio Rovelli

We present results on the variability of the Brune stress drop (∆σ) and apparent stress (τa) of earthquakes located in a small zone adjacent to the hypocenter of the damaging Mw 6.1 L'Aquila earthquake. Their magnitudes range between 2.7 and 4.1. The interevent variability of stress drop and apparent stress spans a factor of 10, well beyond the individual-event uncertainty. The radiation efficiency ηsw = τa/∆σ varies mostly between 0.1 and 0.2, but decreases in the days immediately before and after the mainshock to values as low as 0.06. This may be related to the migration of the events occurring in those days into a focal volume with higher dynamic strength. The temporal change of ηsw might thus be interpreted as a spatial variation due to earthquake migration into the locked portion of the fault that originated the mainshock. Furthermore, no variation in stress drop or apparent stress is observed between foreshocks and aftershocks, but the smallest and largest ∆σ correlate well with the largest and smallest b-values, respectively, as already documented in the literature for the rupture nucleation volumes of large earthquakes.
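For reference, the standard relations behind these source parameters can be sketched as follows (the shear-wave speed, rigidity and event values below are generic assumptions for illustration, not the authors' actual processing):

```python
import math

def brune_stress_drop(m0, fc, beta=3500.0):
    """Brune-model stress drop (Pa) from seismic moment m0 (N*m) and
    corner frequency fc (Hz); beta is the shear-wave speed (m/s)."""
    r = 2.34 * beta / (2.0 * math.pi * fc)   # Brune source radius (m)
    return 7.0 * m0 / (16.0 * r ** 3)

def apparent_stress(m0, es, mu=3.0e10):
    """Apparent stress tau_a = mu * Es / M0 (Pa), with rigidity mu (Pa)
    and radiated seismic energy Es (J)."""
    return mu * es / m0

# Illustrative numbers for an event of roughly M 3.5 (not from the study)
m0 = 2.24e14                             # seismic moment, N*m
dsigma = brune_stress_drop(m0, fc=3.0)   # ~1 MPa stress drop
tau_a = apparent_stress(m0, es=7.5e8)    # ~0.1 MPa apparent stress
eta_sw = tau_a / dsigma                  # radiation efficiency as in the abstract
```

With these assumed values ηsw falls below 0.1, in the low range the abstract associates with the days around the mainshock.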

How to cite: Di Giovambattista, R., Calderoni, G., and Rovelli, A.: Spatio-temporal variations of source parameters in the nucleation zone of the 6 April 2009, Mw 6.1 L'Aquila Earthquake, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-22596, https://doi.org/10.5194/egusphere-egu2020-22596, 2020.

D2031 |
Jordi Baro, Joern Davidsen, and Álvaro Corral

Material failure at different scales and in different processes can be modeled as an emergent feature of avalanche dynamics in micromechanical systems.
Event-event triggering (aftershocks) is common in seismological catalogs and acoustic emission experiments [1], among other phenomena.
Stochastic branching and linear Hawkes processes are used to model the statistical properties of such catalogs. In the micromechanical approach, viscoelastic stress transfer and after-slip are among the proposed mechanisms of aftershocks. Here we ask a simple question: do aftershock sequences in micromechanical models agree with this epidemic branching paradigm?
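To illustrate the branching/Hawkes paradigm referred to above (a generic sketch with hypothetical parameters, not the models of this work), a linear Hawkes process with an exponential kernel can be simulated by Ogata's thinning method:

```python
import math
import random

def simulate_hawkes(mu, alpha, beta, t_max, seed=1):
    """Simulate a linear Hawkes process by Ogata's thinning method.

    Conditional intensity:
        lambda(t) = mu + sum_i alpha * beta * exp(-beta * (t - t_i)),
    so each event triggers on average alpha direct offspring
    (subcritical branching for alpha < 1)."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while t < t_max:
        # Between events the intensity only decays, so its current value
        # is a valid upper bound for the thinning step.
        lam_bar = mu + sum(alpha * beta * math.exp(-beta * (t - ti))
                           for ti in events)
        t += rng.expovariate(lam_bar)
        if t >= t_max:
            break
        lam_t = mu + sum(alpha * beta * math.exp(-beta * (t - ti))
                         for ti in events)
        if rng.random() <= lam_t / lam_bar:   # accept with prob lam/lam_bar
            events.append(t)
    return events

cat = simulate_hawkes(mu=0.5, alpha=0.8, beta=2.0, t_max=200.0)
```

The resulting synthetic catalog shows the clustered "mainshock plus aftershocks" structure whose topological properties the abstract compares against micromechanical models.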

We introduce two fiber-bundle models as prototypes of viscoelastic fracture [2] which (i) provide an analytical explanation for the acceleration of activity in the absence of critical failure observed in acoustic emission experiments [3]; (ii) reproduce the typical spatio-temporal properties of triggering found in field catalogs and acoustic emission experiments; but (iii) display discrepancies with the topological branching properties predicted by stochastic models [4], probably due to physical constraints.

[1] J. Baró et al., Phys. Rev. Lett. 110 (8), 088702 (2013).
[2] J. Baró, J. Davidsen, Phys. Rev. E  97 (3), 033002 (2018).
[3] J. Baró, et al., Phys. Rev. Lett. 120 (24), 245501 (2018).
[4] A. Saichev et al., Pure Appl. Geophys. 162 (6), 1113-1134 (2005).

How to cite: Baro, J., Davidsen, J., and Corral, Á.: Topological properties of aftershock clusters in a viscoelastic model of quasi-brittle failure, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-4948, https://doi.org/10.5194/egusphere-egu2020-4948, 2020.

D2032 |
Markos Avlonitis, Dimitrios Kotinas, and Eleftheria Papadimitriou

The role of asperities in fault evolution has received continuously increasing attention, as they are the critical areas where nucleation and cascade-like failure may take place. They consist of patches where contact occurs across the rough fault surfaces, accumulating elastic strain during the interseismic period. The rupture of more than one asperity results in strong and large earthquakes, a phenomenon mostly characterizing large subduction earthquakes. Identifying the factors controlling single or multiple asperity failure and their spatiotemporal behaviour is a key issue in seismic hazard assessment. The aim of the present work is to explore the role of different spatial patterns of asperities, as well as of their different strength characteristics, by means of simulation experiments with cellular automaton models. Initial results show that the earthquake distribution clearly depends on a) the total real contact area of the asperities, b) the relative distance between asperity patches, and c) the fraction of strain that asperities can sustain compared to the corresponding value of the non-asperity sites. There is a definite range of these controlling parameters which results in a non-typical earthquake magnitude distribution with a clear departure from the classical power-law-like Gutenberg-Richter relation. More specifically, for one (or more than one well-separated) asperity with significantly higher strain-unlocking thresholds, a non-typical earthquake size distribution emerges: for low-magnitude earthquakes a power law still holds, but for larger earthquake sizes a quantum-like behaviour appears, i.e. one (or more) particular earthquake sizes become the most probable. This manifests a characteristic earthquake model which, although not adequately supported by observational data, is present in several applications of earthquake simulators.
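The abstract gives no model details; a minimal sketch in the spirit of a non-conservative (OFC-style) cellular automaton, with an assumed asperity patch carrying a higher failure threshold (all parameters hypothetical), could look like this:

```python
import random

def run_asperity_automaton(n=32, steps=20000, asperity=None,
                           thresh=1.0, asperity_thresh=3.0, seed=0):
    """Minimal sandpile-style cellular automaton on an n x n fault plane.

    Cells accumulate strain under slow loading; when a cell exceeds its
    threshold it fails, passing 20% of its strain to each of its 4
    neighbours (the 20% loss per failure guarantees that avalanches
    terminate).  Asperity cells carry a higher threshold, so they release
    much more strain when they finally fail.  Returns avalanche sizes."""
    rng = random.Random(seed)
    asperity = asperity or set()
    strain = [[0.0] * n for _ in range(n)]
    limit = lambda a, b: asperity_thresh if (a, b) in asperity else thresh
    sizes = []
    for _ in range(steps):
        # Slow tectonic loading at a random site
        i, j = rng.randrange(n), rng.randrange(n)
        strain[i][j] += 0.1
        # Relax all over-threshold cells: one avalanche (one "earthquake")
        size = 0
        unstable = [(i, j)] if strain[i][j] >= limit(i, j) else []
        while unstable:
            a, b = unstable.pop()
            if strain[a][b] < limit(a, b):
                continue
            share = 0.2 * strain[a][b]       # non-conservative transfer
            strain[a][b] = 0.0
            size += 1
            for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                x, y = a + da, b + db
                if 0 <= x < n and 0 <= y < n:
                    strain[x][y] += share
                    if strain[x][y] >= limit(x, y):
                        unstable.append((x, y))
        if size:
            sizes.append(size)
    return sizes

# Hypothetical 4x4 asperity patch in the middle of the fault plane
asp = {(i, j) for i in range(14, 18) for j in range(14, 18)}
sizes = run_asperity_automaton(asperity=asp)
```

Comparing the size distributions with and without the asperity patch (or with several well-separated patches) is the kind of numerical experiment the abstract describes, with the asperity threshold playing the role of the strain-unlocking fraction.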


Keywords: asperities, large earthquake quantification, characteristic model, cellular automata

This research is co-financed by the European Regional Development Fund (ERDF) in the context of the project «Telemachus – Innovative Seismic Risk Management Operational System of the Ionian Islands», which is part of the Operational Program «Ionian Islands 2014-2020» (National Strategic Reference Framework - NSRF 2014-20).

How to cite: Avlonitis, M., Kotinas, D., and Papadimitriou, E.: Quantization of large earthquakes driven by asperities strain concentration patterns, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-15193, https://doi.org/10.5194/egusphere-egu2020-15193, 2020.