NH 2021 Plinius Medal Lecture & 2020 Sergey Soloviev Medal Lecture & 2020 Division Outstanding ECS Award Lecture


Convener: Ira Didenkulova
Wed, 21 Apr, 15:00–17:00 (CEST)


Presentations: Wed, 21 Apr

Chairperson: Ira Didenkulova
Plinius Medal Lecture 2021
Giuliano Di Baldassarre

Plinius (23–79 AD) is known worldwide as the author of the encyclopedic Naturalis Historia. He died in Stabiae while trying to rescue his family from the eruption of Mount Vesuvius, one of the deadliest volcanic eruptions in European history, which also destroyed the cities of Herculaneum and Pompeii. At that time, natural hazards were mostly seen as “acts of God(s)”. In today’s Anthropocene, by contrast, extreme events coexist with two dichotomous (and rather simplistic) views: “disasters are natural” vs. “humans are to blame since they live in risky areas”. In this lecture, I present scientific and societal challenges associated with the increasing impact (from Plinius’ time to the Anthropocene) of humans on the spatial and temporal distribution of natural hazards. I also problematize and challenge myths, preconceptions and conventional wisdom related to uncertainty, behavioral heuristics, expert vs. local knowledge, social power and inequalities. To this end, I review recent studies in various socioeconomic contexts, and across multiple hazards, with a focus on five events that have significantly influenced my research work: the 1963 Vajont Dam landslide, the 2004 flooding in Haiti and the Dominican Republic, the 2009 L’Aquila earthquake, the water crisis (“Day Zero”) during the 2015–2017 drought in Cape Town, and the ongoing COVID-19 pandemic.

How to cite: Di Baldassarre, G.: Natural Hazards: from Plinius’ time to the Anthropocene, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-241, 2021.

NH Division Outstanding ECS Award Lecture 2020
Vitor Silva

The increase in the global population, climate change, growing urbanization and settlement in regions prone to natural hazards are some of the factors contributing to the increase in economic and human losses due to disasters. Earthquakes represent on average approximately one-fifth of annual disaster losses, but in some years this proportion can exceed 50% (e.g. 2010, 2011). This impact can affect the sustainable development of society, the creation of jobs and the availability of funds for poverty reduction. Furthermore, business disruption of large corporations can have negative impacts at the global scale. Earthquake risk information can be used to support decision-makers in the distribution of funds for effective risk mitigation. However, open and reliable probabilistic seismic risk models are available for fewer than a dozen countries, which hampers disaster risk management, particularly in the developing world. To mitigate this issue, the Global Earthquake Model Foundation and its partners have been supporting regional programmes and bilateral collaborations to develop an open global earthquake risk model. These efforts led to the development of a repository of probabilistic seismic hazard models, a global exposure dataset, and a comprehensive set of fragility and vulnerability functions for the most common building classes. These components were used to estimate relevant earthquake risk metrics, which are now publicly available to the community.

The development of the global seismic risk model also allowed the identification of several issues that affect the reliability and accuracy of existing risk models. These include the use of outdated exposure information, insufficient consideration of all sources of epistemic and aleatory uncertainty, a lack of results regarding indirect human and economic losses, and an inability to forecast detailed earthquake risk into the coming decades. These challenges may render the results from existing earthquake loss models inadequate for decision-making. It is thus urgent to re-evaluate the current practice in earthquake loss assessment, and to explore new technologies, knowledge and data that might mitigate some of these issues. A recent resource that can support the improvement of exposure datasets and the forecasting of exposure and risk into the coming decades is the Global Human Settlement Layer, a collection of datasets on the built environment between 1974 and 2010. The consideration of this type of information and the incorporation of large sources of uncertainty can now be supported by artificial intelligence technology, in particular open-source machine learning platforms. Such tools are currently being explored to predict earthquake aftershocks, to estimate damage shortly after destructive events, and to perform complex calculations involving billions of simulations. These are examples of recent resources that should be exploited to improve existing risk models and, consequently, enhance the likelihood that risk reduction measures will be effective.

This lecture presents the current practice in global seismic risk assessment and its limitations, discusses the areas where improvements are necessary, and outlines possible directions for risk assessment in the coming years.

How to cite: Silva, V.: Global Seismic Risk Assessment: the Wrong, the Right, and the Truth, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-16282, 2021.

Sergey Soloviev Medal Lecture 2020
John Clague

Frequency-magnitude relations derived from historic and prehistoric datasets underpin many natural hazard risk assessments. For example, probabilistic estimates of seismic risk rely on instrumented records of past earthquakes, in some cases supplemented by prehistoric seismicity inferred from proxy geologic evidence. Yet there are several problems in these datasets that compromise the reliability of derived frequency-magnitude relations. In this presentation, I briefly discuss these problems. First, historic records of past events are temporally biased. Using seismicity as an example, earthquake catalogues are complete only for the past several decades, the period during which seismic networks have been sufficiently extensive to capture all events. During the first half of the twentieth century, small and even moderate earthquakes went unrecorded, and farther back in time, knowledge of even large earthquakes is limited to eyewitness accounts. Prior to the last century, there is only limited knowledge of rare but large events with long average return periods. Yet low social and political tolerance for risk requires knowledge of events with return periods of hundreds to thousands of years. Temporal biases of this type result in huge uncertainties about the future occurrence of events with long return periods.

A second limitation, which applies particularly to prehistoric events, is the large uncertainty in the times and magnitudes of events inferred from geologic proxy data. The example I use in this talk is the large, debris-flow-prone Cheekye River fan in southwestern British Columbia. Relatively small debris flows have happened on the fan in the historic period, and there is geologic evidence for several much larger prehistoric events during the Holocene. A new residential subdivision has been proposed for the apex of the fan, requiring that geologists estimate the sizes of debris flows with return periods up to 10,000 years. The Cheekye fan has been studied more thoroughly than any other fan in western Canada, yet there are very large uncertainties in the sizes and times of events more than 100 years old. Event times are imprecise because radiocarbon ages carry inherent uncertainties of several decades to centuries. Furthermore, the geologic record of past events is incomplete. The frequency-magnitude curve for debris flows on the Cheekye fan is ‘better than nothing’, but the very low societal tolerance for risk in Canada means that decisions about development on the fan will likely be based on worst-case scenarios of long-return-period events that are poorly grounded in science.

A third limitation that I highlight in my presentation pertains to weather-related hazards (floods, severe storms, and many landslides). An assumption made when using frequency-magnitude relations to evaluate hazard and risk is that the past can be applied to the near future. This assumption is invalid for weather-related hazards because the climate is changing. Climate non-stationarity implies, for example, that the historic hydrometric data upon which flood frequency analyses were based in the past century may be of limited use in planning for future extreme floods.
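The temporal bias described above can be illustrated with a small sketch. Below, a frequency-magnitude relation of the Gutenberg-Richter form log10(N) = a − bM is fitted to a hypothetical, invented 50-year earthquake catalogue, and then extrapolated to a magnitude never observed in that window; the return period produced this way rests entirely on the extrapolation, which is exactly where the large uncertainties arise:

```python
# Illustrative sketch: fitting a Gutenberg-Richter frequency-magnitude
# relation, log10(rate) = a - b*M, to a hypothetical 50-year catalogue,
# then extrapolating a return period beyond the observation window.
# The catalogue values are invented for illustration.

import numpy as np

# Hypothetical catalogue: cumulative counts of events >= each magnitude over 50 years
magnitudes = np.array([4.0, 4.5, 5.0, 5.5, 6.0])
counts = np.array([200, 63, 20, 6, 2])
years_observed = 50.0

annual_rate = counts / years_observed  # events per year at or above each magnitude

# Least-squares fit of log10(rate) = a - b*M
slope, a = np.polyfit(magnitudes, np.log10(annual_rate), 1)
b = -slope  # convention: the b-value is reported as positive

def return_period(m):
    """Mean return period (years) for events >= magnitude m under the fitted relation."""
    return 1.0 / 10 ** (a - b * m)

# Extrapolating to a magnitude never seen in the 50-year window:
print(f"b-value: {b:.2f}")
print(f"Return period for M>=7: {return_period(7.0):.0f} years")
```

The fitted relation happily returns a number for M ≥ 7 even though no such event appears in the 50-year record; the abstract's point is that decisions demanding return periods of hundreds to thousands of years rest on precisely this kind of extrapolation.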

How to cite: Clague, J.: Limitations in the Use of Past Datasets for Future Hazard Analysis, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-1826, 2021.



  • Amir AghaKouchak, University of California, Irvine, United States of America
  • Marten Geertsema, BC Forest Service, Canada