Advancing Probabilistic Models in Earthquake Forecasting and Seismic Hazard
Recent catastrophic earthquakes have highlighted the importance of advancing seismic hazard models over a wide range of time frames, for example to support more reliable building codes and to track the short-term evolution of seismic sequences. In recent years, the exponential growth of ground-motion data, short- and long-term forecasting models, hazard-model test results, and new engineering needs, together with progress in research on earthquake predictability and ground-motion processes, has created strong motivation to explore and incorporate new concepts and methods into the next generation of probabilistic forecasts, both for long-term probabilistic seismic hazard assessment (PSHA) and for operational earthquake forecasting. Owing to their important societal impact, forecasting models must be scientifically reliable. Prospective modeling is the best way to test alternative hypotheses and models, and hence to advance our scientific understanding of the processes involved. Pragmatically, prospective testing makes an essential scientific contribution to improving our capacity to manage seismic hazard and risk across a wide range of forecasting time windows and for a broad range of stakeholders, including vulnerable societies. The development of new and innovative long- and short-term forecasting and hazard models is a necessary but insufficient step: major advances in forecasting and hazard assessment require a solid testing phase that allows for model evaluation and quantifies any increase in forecasting skill over a benchmark model.
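A common way to quantify an "increase in forecasting skill over a benchmark model" is the information gain per earthquake between two gridded rate forecasts, compared via Poisson log-likelihoods (in the spirit of CSEP-style prospective tests). The sketch below is illustrative only; the function names and rate values are hypothetical, not a specific testing-center implementation:

```python
import math
import numpy as np

def poisson_log_likelihood(rates, counts):
    """Joint log-likelihood of observed bin counts under independent
    Poisson rates, one rate per space-magnitude bin."""
    rates = np.maximum(np.asarray(rates, dtype=float), 1e-12)  # avoid log(0)
    counts = np.asarray(counts, dtype=float)
    log_fact = np.array([math.lgamma(n + 1) for n in counts])  # log(n!)
    return float(np.sum(-rates + counts * np.log(rates) - log_fact))

def information_gain_per_eq(model_rates, benchmark_rates, counts):
    """Average log-likelihood gain per observed earthquake of a
    candidate forecast over a benchmark; positive values indicate
    higher skill than the benchmark."""
    n_obs = float(np.sum(counts))
    return (poisson_log_likelihood(model_rates, counts)
            - poisson_log_likelihood(benchmark_rates, counts)) / n_obs
```

For example, a forecast that concentrates rate in the bins that actually host events scores a positive information gain over a spatially uniform benchmark.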
We solicit contributions related to new developments in all aspects of long- and short-term seismic hazard and earthquake forecasting models:
• Definition of earthquake sources and determination of activity rates and their uncertainty, including assessment of earthquake datasets, calibration of magnitude scales, representation of seismogenic sources and their geological constraints, and the emerging roles of strain and simulation-based earthquake-rupture forecasts.
• Development of innovative earthquake forecasting models with forecast horizons of days to decades.
• Estimation of strong ground motions and their uncertainty, development of new ground-motion models, assessment of site effects, the consideration of new parameters to characterize the intensity of shaking, and potential insights and uses of physics-based simulations of ground shaking.
• Testing and evaluation of hazard and earthquake forecasting models, including statistical tests of activity rates and earthquake occurrence, calibration of ground-motion models, hazard-model parameterization and implementation, sensitivity analyses of key parameters and results, as well as the development of innovative testing procedures.
• Case studies of PSHA from Europe and around the globe.
• Model building processes and related uncertainties, formal elicitation of expert opinion and its consequences for the levels of knowledge or belief, and comprehensive treatment of aleatory and epistemic uncertainties.
• Contributions related to the ongoing update of the Harmonized European Seismic Hazard model and the emerging EPOS infrastructure on hazard and risk.
Statistical analysis of spatio-temporal properties of earthquake occurrence
Earthquakes occur with great spatio-temporal variability, which emerges from the complex interactions between them. Significant progress is being made towards understanding spatio-temporal correlations, scaling laws and clustering, and the emergence of seismicity patterns. New models being developed in statistical seismology have direct implications for time-dependent seismic hazard assessment and probabilistic earthquake forecasting. In addition, the increasing amount of earthquake data available on local to global scales provides new opportunities for model testing.
This session focuses both on recent insights on the physical processes responsible for the distribution of earthquakes in space and time, and on new models and techniques for quantifying the seismotectonic process and its evolution. Particular emphasis will be placed on:
- physical and statistical models of earthquake occurrence;
- analysis of earthquake clustering;
- spatio-temporal properties of earthquake statistics;
- quantitative testing of earthquake occurrence models;
- implications for time-dependent hazard assessment;
- methods for earthquake forecasting;
- data analyses and requirements for model testing.
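As a concrete example of the clustering models in scope, a minimal temporal ETAS (Epidemic-Type Aftershock Sequence) conditional intensity can be sketched as follows; the parameter values are illustrative placeholders, not fitted estimates:

```python
import math

def etas_intensity(t, catalog, mu=0.1, K=0.02, alpha=1.0, c=0.01, p=1.1, m0=3.0):
    """Temporal ETAS conditional intensity (events/day) at time t:
    a constant background rate mu plus Omori-Utsu-decaying,
    magnitude-scaled contributions from every earlier event.
    catalog is a list of (time, magnitude) pairs; mu, K, alpha, c,
    p, m0 are illustrative ETAS parameters."""
    rate = mu
    for t_i, m_i in catalog:
        if t_i < t:  # only past events trigger
            rate += K * math.exp(alpha * (m_i - m0)) / (t - t_i + c) ** p
    return rate
```

The triggered rate spikes immediately after a large event and decays roughly as a power law in elapsed time, which is the basic ingredient of many time-dependent forecasting models.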
Confirmed solicited speaker: Danijel Schorlemmer (GFZ - German Research Center for Geosciences, Potsdam, Germany)
Seismic Hazard and Disaster Risk: Observations, Assessment, Testing, and Implementation
Our capability to provide timely and reliable seismic risk estimates is an essential element in building a resilient society through informed decisions for risk management. The scientific basis of seismic risk mitigation includes various seismic hazard models, developed at different time scales and by different methods, as well as information about past seismicity that is as complete and reliable as possible.
Some recent large earthquakes caused extensive damage in areas where some models indicated low seismic hazard, leading to an increased demand for criteria to objectively assess how well seismic hazard models are performing. This session aims to tackle theoretical and implementation issues, which are essential for the development of effective mitigation strategies and include:
⇒ methods for comparison of seismic hazard models and their performance evaluation;
⇒ hazard and risk assessment of extreme seismic events;
⇒ long-term evidence of past great earthquakes (including unconventional seismological observations, such as impacts on caves, ancient constructions, and other deformation evidence);
⇒ earthquake hazard assessment in terms of macro-seismic intensity;
⇒ seismic risk estimation at different time and space scales.
In particular, the session will address concepts, problems, and approaches in assessing hazard related to the earthquakes that “may cause loss of life, injury or other health impacts, property damage, loss of livelihoods and services, social and economic disruption, or environmental damage” (according to UNISDR terminology). The session will include discussions of the pros and cons of deterministic, neo-deterministic, probabilistic, and intensity-based seismic hazard assessments. The latter is of special importance for Europe because of the available large historical information on macro-seismic intensities.
We invite contributions related to: hazard and risk assessment methods and their performance in applications; critical observations and constraints for seismic hazard assessment; verification methods that are suitable to quantify seismic hazard estimates and that can be applied to limited and/or heterogeneous observations (ranging from recent records of ground shaking parameters to past intensity data); seismic hazard and risk monitoring and modeling; and risk communication and mitigation.
The session will provide an opportunity to discuss best practices and share experience gained with different testing methods, including their application in different fields. We hope to highlight both the existing gaps and future research directions that could strengthen the procedures for testing and comparing performance of seismic hazard models.
Methods and Tools for Natural Risk Management and Communications – Innovative ways of delivering information to end users and sharing data among the scientific community
In recent years an increasing number of research projects have focused on natural hazards (NH) and climate change impacts, providing a variety of information to end users and to scientists working on related topics.
The session aims at promoting new and innovative studies, experiences and models to improve risk management and communication about natural hazards to different end users.
End users, such as decision and policy makers or the general public, need information to be quickly and easily interpretable, properly contextualized, and therefore specifically tailored to their needs. On the other hand, scientists from different disciplines related to natural hazards and climate change (e.g., economists, sociologists) need more complete datasets to integrate into their analyses. By facilitating data access and evaluation, as well as promoting open access to create a level playing field for non-funded scientists, data can be more readily used for scientific discovery and societal benefit. However, new scientific advances are represented not only by big, comprehensive datasets, geo-information and earth-observation architectures and services, or new IT communication technologies (location-based tools, games, virtual and augmented reality, and so on), but also by methods for communicating risk and its uncertainty, conveying the associated spatio-temporal dynamics, and involving stakeholders in risk management processes.
However, data and approaches are often fragmented across the literature and among geospatial and natural hazard communities, with an evident lack of coherence. Furthermore, there is no single approach to communicating information to different audiences; rather, several interdisciplinary techniques and efforts can be applied to simplify data access, evaluation, and exploration.
This session encourages critical reflection on natural risk mitigation and communication practices and provides an opportunity for geoscience communicators to share best methods and tools in this field. Contributions – especially from Early Career Scientists – are solicited that address these issues, and which have a clear objective and research methodology. Case studies, and other experiences are also welcome as long as they are rigorously presented and evaluated.
New and innovative abstract contributions are particularly welcome, and their authors will be invited to submit a full paper to a special issue of a journal on related topics.
In cooperation with NhET (Natural hazard Early career scientists Team).
Short-term Earthquakes Forecast (StEF) and multi-parametric time-Dependent Assessment of Seismic Hazard (t-DASH)
The real-time integration of multi-parametric observations is expected to provide the major contribution to the development of operational t-DASH systems suitable for supporting decision makers with continuously updated seismic hazard scenarios. A very preliminary step in this direction is the identification of those parameters (seismological, chemical, physical, biological, etc.) whose space-time dynamics and/or anomalous variability can be, to some extent, associated with the complex preparation process of major earthquakes.
This session therefore encourages studies devoted to demonstrating the added value of introducing specific observations and/or data analysis methods within the t-DASH and StEF perspectives. Studies based on long-term data analyses, including different conditions of seismic activity, are particularly encouraged. Equally welcome are presentations of infrastructures devoted to maintaining and further developing our present observational capabilities for earthquake-related phenomena, thereby also contributing to building a global multi-parametric Earthquakes Observing System (EQuOS) to complement the existing GEOSS initiative.
To this end, this session is addressed not just to seismologists and natural hazards scientists but also to geologists and to researchers in atmospheric sciences and electromagnetism, whose collaboration is particularly important for fully understanding the mechanisms of earthquake preparation and their possible relation to other measurable quantities. For this reason, all contributions devoted to describing genetic models of earthquake precursory phenomena are equally welcome. Every two years, selected papers presented in this session will be proposed for publication in a dedicated Special Issue of an international (ISI) scientific journal.
Paleoseismicity, active faulting, surface deformation, and the implications on seismic hazard assessment (Fault2SHA)
The study of active faults and deformation of the Earth's surface has made, and continues to make, significant contributions to our understanding of earthquakes and to the assessment of seismic hazard.
Active faulting may form and deform the Earth's surface, leaving records in young sediments and in the landscape. Field studies of recent earthquake ruptures help not only to constrain earthquake source parameters but also to identify previously unknown active structures. The insights gleaned from recent earthquakes can be applied to the study of past earthquakes. Paleoseismology and related disciplines, such as paleogeodesy and paleotsunami investigations, remain the primary tools for establishing earthquake records that are long enough to determine recurrence intervals and long-term deformation rates for active faults. Multidisciplinary data sets accumulated over the years have brought unprecedented constraints on the size and timing of past earthquakes, and allow deciphering shorter-term variations in fault slip rates or seismic activity rates, as well as the interaction of single faults within fault systems. Based on this rich but very heterogeneous knowledge of seismogenic faults, a variety of approaches have been developed to transfer earthquake-fault geology into fault models suitable for probabilistic SHA. This session thus aims at linking field geologists, crustal deformation modellers, fault modellers, and seismic hazard practitioners.
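Once a paleoseismic record is long enough, a mean recurrence interval and its regularity (coefficient of variation, COV) follow directly from the dated events. The helper below is a hypothetical illustration using simple population statistics, assuming the event ages are given in years before present:

```python
import math

def recurrence_stats(event_ages):
    """Mean recurrence interval and coefficient of variation (COV)
    from dated paleoearthquakes (ages in years before present, any
    order). A COV well below 1 suggests quasi-periodic behavior;
    a COV near 1 is consistent with Poissonian occurrence."""
    ages = sorted(event_ages, reverse=True)  # oldest event first
    intervals = [a - b for a, b in zip(ages, ages[1:])]
    mean = sum(intervals) / len(intervals)
    var = sum((x - mean) ** 2 for x in intervals) / len(intervals)
    return mean, math.sqrt(var) / mean
```

In practice, dating uncertainties on each event age would be propagated into the interval statistics (e.g., by sampling the age distributions), which this sketch omits.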
In this session, we welcome contributions describing and critically discussing different approaches to studying active faults. We are particularly interested in studies applying new and innovative methodological or multidisciplinary approaches. We hope to assemble a broad program bringing together studies of on-land, lake, and offshore environments, applying a variety of methods such as traditional paleoseismic trenching, high-resolution coring, geophysical imaging, tectonic geomorphology, and remote sensing, as well as the application of earthquake geology in seismic hazard assessments. In addition, we encourage contributions describing how to translate fault data or catalogue data into fault models for SHA, and how to account for fault or catalogue issues.
Tsunami (NH Division Outstanding ECS Lecture by Jadranka Šepić) (co-sponsored by JpGU)
Tsunamis can produce catastrophic damage on vulnerable coastlines, mainly following major earthquakes, landslides, or atmospheric disturbances. After the disastrous tsunamis of 2004 and 2011, tsunami science has grown significantly, opening new fields of research in various domains, including regions where the tsunami hazard was previously underestimated.
Numerical modeling, complemented with laboratory experiments, is essential to quantify the tsunami hazard. To this end, it is essential to rely on complete databases of past tsunami observations, including both historical events and the results of paleotsunami investigations. Furthermore, a robust hazard analysis has to take into account uncertainties and probabilities with advanced approaches such as probabilistic tsunami hazard analysis (PTHA).
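Under the common PTHA simplification that source scenarios occur as independent Poisson processes, the probability of exceeding a hazard threshold at a site over a time horizon can be aggregated from scenario rates. This is a hedged sketch with hypothetical inputs, not a full PTHA workflow (which would also propagate epistemic uncertainty, e.g., via logic trees):

```python
import math

def exceedance_probability(annual_rates, exceeds_threshold, horizon_years=50.0):
    """Probability that at least one tsunami scenario exceeding the
    hazard threshold occurs within the horizon, assuming independent
    Poissonian scenario occurrence.

    annual_rates      -- mean annual rate of each source scenario
    exceeds_threshold -- True for scenarios whose modeled intensity
                         (e.g., wave height) exceeds the threshold
                         at the site of interest
    """
    total_rate = sum(r for r, e in zip(annual_rates, exceeds_threshold) if e)
    return 1.0 - math.exp(-total_rate * horizon_years)
```

For instance, a single exceeding scenario with an annual rate of 0.01 yields a 50-year exceedance probability of 1 - exp(-0.5), roughly 39%.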
Because the vulnerability of populations, infrastructures, and the built environment in coastal zones is increasing, integrated plans for tsunami risk prevention and mitigation should be encouraged for any exposed coastline, consistent with the procedures now in place in a growing number of Tsunami Warning Systems.
The NH5.1/OS2.22/SM3.11 Tsunami session welcomes contributions covering any of these aspects, encompassing field data, regional hazard studies, observation databases, numerical modeling, risk studies, real-time networks, and operational tools and procedures towards more efficient warning.
A focus on recent tsunami events around the globe is encouraged (including Palu, 28 September; Zakynthos, 26 October; and Tadine, New Caledonia, 5 December), as well as on the achievements of recent research projects.