Displays

NH9.1

The purpose of this session is to: (1) showcase the current state-of-the-art in global and continental scale natural hazard risk science, assessment, and application; (2) foster broader exchange of knowledge, datasets, methods, models, and good practice between scientists and practitioners working on different natural hazards and across disciplines globally; and (3) collaboratively identify future research avenues.
Reducing natural hazard risk is high on the global political agenda. For example, it is at the heart of the Sendai Framework for Disaster Risk Reduction and the Warsaw International Mechanism for Loss and Damage Associated with Climate Change Impacts. In response, the last five years have seen an explosion in the number of scientific datasets, methods, and models for assessing risk at the global and continental scale. More and more, these datasets, methods, and models are being applied together with stakeholders in the decision-making process.
We invite contributions related to all aspects of natural hazard risk assessment at the continental to global scale, including contributions focusing on single hazards, multiple hazards, or a combination or cascade of hazards. We also encourage contributions examining the use of scientific methods in practice, and the appropriate use of continental to global risk assessment data in efforts to reduce risks. Furthermore, we encourage contributions focusing on globally applicable methods, such as novel methods for using globally available datasets and models to force more local models or inform more local risk assessment.
At various scales from global to local, many efforts have been made in recent years to collect and use loss data related to natural hazards (e.g. cyclones, earthquakes, floods, wildfires) and to build open datasets. Integrating these socioeconomic loss databases and open datasets for loss and risk assessment allows their effective use in both science and policy, and helps create a community linking academia, government and insurance.
We also encourage you to submit a manuscript to the NHESS special issue on Global- and continental-scale risk assessment for natural hazards (https://www.nat-hazards-earth-syst-sci.net/special_issue966.html). The deadline for submissions to the special issue is 31 December 2019.

Public information:
The discussion of the displays in this session will be carried out in five blocks of 20 minutes. The authors who have indicated that they will present their Displays have been assigned to one of the blocks, and the time-schedule is as follows:
14:00-14:05: welcome and structure of the session
14:05-14:25: Finn Løvholt, Adrien Pothon, Krescencja Glapiak, Svetlana Stripajova
14:25-14:45: Jana Sillmann, Gaby Gründemann, Dominik Paprotny, Edwin Sutanudjaja
14:45-15:05: Oliver Wing (solicited), Jerom Aerts, Dirk Eilander, Viet Dung Nguyen
15:05-15:25: Robert McCall, Samuel Eberenz, John Hillier, Maria Chertova
15:25-15:45: Claudia Wolff, Jacopo Margutti, Paola Salvati, Sara Lindersson
15:45: Closing remarks

Co-organized by AS4/HS2.5
Convener: Philip Ward | Co-conveners: Hannah Cloke, James Daniell, Hessel Winsemius, Jeroen Aerts, John K. Hillier
Displays | Attendance Mon, 04 May, 14:00–15:45 (CEST)


Chat time: Monday, 4 May 2020, 14:00–15:45

Chairperson: Philip Ward, James Daniell, John Hillier
D2104 |
EGU2020-1999
Finn Løvholt, Jörn Behrens, Stefano Lorito, and Andrey Babeyko

The tsunami disasters of 2004 in the Indian Ocean and of 2011 along the Tohoku coast of Japan revealed severe gaps between the anticipated risk and consequences, with resulting loss of life and property. A similar observation is also relevant for the smaller, yet disastrous, tsunamis with unusual source characteristics such as the recent events in Palu Bay and Sunda Strait in 2018. The severe consequences were underestimated in part due to the lack of rigorous and accepted hazard analysis methods and large uncertainty in forecasting the tsunami sources. Population response to small recent tsunamis in the Mediterranean also revealed a lack of preparedness and awareness. While there is no absolute protection against large tsunamis, a more accurate analysis of the potential risk can help to minimize losses. The tsunami community has made significant progress in understanding tsunami hazard from seismic sources. However, this is only part of the inputs needed to effectively manage tsunami risk, which should be understood more holistically, including non-seismic sources, vulnerability in different dimensions and the overall societal effects, in addition to its interaction with other hazards and cascading effects. Moreover, higher standards need to be achieved to manage and quantify uncertainty, which governs our basis for tsunami risk decision making. Hence, a collective community effort is needed to effectively handle all these challenges across disciplines and trades, from researchers to stakeholders. To coordinate and streamline these activities and make progress towards implementing the Sendai Framework for Disaster Risk Reduction (SFDRR), the Global Tsunami Model network (GTM) was initiated in 2015 to enhance our understanding of tsunami hazard and risk from the local to the global scale. Here, we focus on coordinated European efforts, sharing the same goals as GTM, towards improving standards and best practices for tsunami risk reduction.
The networking initiative AGITHAR (Accelerating Global science In Tsunami HAzard and Risk Analysis) is a European COST Action that aims to assess, benchmark, improve, and document methods to analyse tsunami hazard and risk, understand and communicate the uncertainty involved, and interact with stakeholders in order to understand societal needs and thus contribute to their efforts to minimize losses. In this presentation, we provide an overview of the suite of methodologies used for tsunami hazard and risk analysis, review the state of the art in global tsunami hazard and risk analysis, dating back to results from the Global Risk Model in 2015, and highlight possible gaps and challenges. We further discuss how AGITHAR and GTM will tackle these challenges, and finally discuss how global and regional structures such as the European Plate Observing System (EPOS) and the UNDRR Global Risk Assessment Framework (GRAF) can facilitate, and mutually benefit from, an integrated framework of services aiding improved understanding of multiple hazards.

How to cite: Løvholt, F., Behrens, J., Lorito, S., and Babeyko, A.: Coordinated efforts in tsunami hazard and risk analyses in Europe and link to the Global Tsunami Model network initiative, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-1999, https://doi.org/10.5194/egusphere-egu2020-1999, 2020

D2105 |
EGU2020-7748
Adrien Pothon, Philippe Guéguen, Sylvain Buisine, and Pierre-Yves Bard

Despite California being a highly seismically active region, around 85% of people there are not covered against this risk. This situation results from more than 100 years of evolution since the first earthquake insurance cover was offered after the 1906 San Francisco earthquake. To understand this evolution, two analyses have been performed: the first at the market level and the second at the level of insured people.

At the market level, variables such as the premium amount, risk monitoring, the funding sources of prevention plans, insurance companies' solvency and the attractiveness of earthquake insurance solutions have been investigated. By cross-analysing the collected data and their evolution over time, three different phases in the history of the earthquake insurance market have been identified.

At the level of insured people, a database was built from 18 different data sources on earthquake insurance, gathering data since 1921. A new model was then developed to assess the rate of homeowners insured against this risk, according to their risk awareness and the average annual insurance premium amount.

These two analyses are finally used to investigate to what extent the California earthquake insurance market could again reach 40% of people insured, as it did in 1993 and 1996. Although the results show that a widespread belief that a devastating earthquake is imminent could bring about such a situation, only a new earthquake insurance model would allow this goal to be achieved in a sustainable way. In that respect, the efficiency of two current initiatives to bring more people to take out earthquake insurance, "Earthquake Brace and Bolt" and "JumpStart Recovery", is assessed in light of the analyses performed previously in this paper.

How to cite: Pothon, A., Guéguen, P., Buisine, S., and Bard, P.-Y.: An analysis of the California earthquake insurance market since its early stages, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-7748, https://doi.org/10.5194/egusphere-egu2020-7748, 2020

D2106 |
EGU2020-21287
Virginia Iglesias and William Travis

Combined, earthquakes, fires, floods, tornadoes and hurricanes are the most prominent natural disturbances in the United States that endanger human lives and result in substantial costs to society. Between 2006 and 2016, property and crop damage due to these hazards increased from ~5.4 to ~14.6 billion USD, with the record number of billion-dollar losses set in 2017. Unprecedented impacts and escalating costs highlight the imperative for a better understanding of risk, which emerges from the coupling of disturbance probability and the exposure and adaptive capabilities of local communities. This study harmonizes earthquake, fire, flood, tornado and hurricane hazard data with fine-resolution annual settlement information to assess how risk due to changes in exposure has varied over the past 40 years across the contiguous U.S. Natural hazard risk assessments have historically been hindered by scale mismatches, poor characterization of property exposure and spatially variable accuracy of the built environment. To overcome these limitations, we combined hazard occurrence data to create an integrated hazard map and employed gridded settlement layers from the Historical Settlement Data Compilation for the U.S. (HISDAC-US), derived from cadastral and housing data compiled in the Zillow Transaction and Assessment Dataset (ZTRAX), to map exposure. HISDAC-US describes the built environment of most of the country back to 1810 at fine temporal and spatial granularity. Trends in the density of structures and built-up land were estimated for hazardous and non-hazardous areas (i.e., the top and bottom 10% of probabilities of a given hazard, respectively), as well as the temporal dynamics of risk at the subregional, regional and continental scales. Results suggest a monotonic increase in risk to all hazards as well as pronounced and rising spatial variability in exposure, pointing to long-standing institutional issues around equity and social justice.
By assessing exposure at fine spatial resolution, with high temporal accuracy, and over long periods, we reliably identified populations at risk, and evaluated the development trajectories that lead to higher vulnerability to natural hazards.
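The classification of cells into hazardous and non-hazardous areas and the trend estimation described above can be sketched as follows; all numbers here are synthetic stand-ins, not HISDAC-US or ZTRAX values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins (not the study's data): an integrated hazard
# probability per grid cell and 40 years of cumulative structure counts.
n_cells, n_years = 1000, 40
hazard_prob = rng.random(n_cells)
structures = np.cumsum(rng.poisson(2, (n_years, n_cells)), axis=0)

# Hazardous vs non-hazardous cells: top and bottom 10% of hazard probability.
hazardous = hazard_prob >= np.quantile(hazard_prob, 0.9)
safe = hazard_prob <= np.quantile(hazard_prob, 0.1)

# Linear trend (structures per year) of mean density in each group.
years = np.arange(n_years)
trend_hazardous = np.polyfit(years, structures[:, hazardous].mean(axis=1), 1)[0]
trend_safe = np.polyfit(years, structures[:, safe].mean(axis=1), 1)[0]
```

Comparing the two slopes (here on synthetic data) is the kind of contrast that, with real settlement layers, reveals whether development is disproportionately accumulating in hazardous areas.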

How to cite: Iglesias, V. and Travis, W.: Escalating risk and pronounced inter-regional differences in exposure to natural hazards due to development patterns in the United States, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-21287, https://doi.org/10.5194/egusphere-egu2020-21287, 2020

D2107 |
EGU2020-211
Claudia Wolff, Theodore Nikoletopoulos, Jochen Hinkel, and Athanasios Vafeidis

The urban extent in the Low Elevation Coastal Zone (LECZ) is increasing faster than in the surrounding regions, which will lead to increased exposure to climate change-related hazards. Societies' risk from these hazards will, therefore, depend on the rate and pattern of urban expansion and on how decision-makers drive future urban development. One opportunity to investigate how urban development influences potential future coastal flood risk is to combine impact assessments with spatiotemporal urban land cover analysis and spatially explicit future urban projections. In this study, we have developed spatially explicit urban extent scenarios for 10 countries in the Mediterranean. The urban extent scenarios are quantitatively and qualitatively consistent with the assumptions of the global Shared Socioeconomic Pathways (SSPs). We employ a machine learning approach, namely Artificial Neural Networks (Multi-Layer Perceptron, MLP), to develop an Urban Change Model. The MLP model employs simple inputs as proxies for processes that drive urban development on a regional scale and estimates the likelihood of urban transformation for every grid cell between 2000 and 2012. Next, we calculate, for each SSP, the future urban land demand in 5-year time steps until 2100 and classify the ANN model outputs accordingly. These projections are then employed to calculate future exposure to coastal flooding.

The urban change models are able to reproduce the observed patterns of urban development with an overall accuracy of approximately 99% in all countries. The future projections indicate that accounting for the spatial patterns of coastal development can lead to significant differences in exposure. The increase in urban extent in the extended LECZ (below 20 m) until 2100 varies, for instance, by 67% (2075 km²) for Italy, 104% (2331 km²) for France (Mediterranean coast only) and 86% (691 km²) for Greece, depending on the urban development scenario chosen. This highlights that accounting for urban development in long-term adaptation planning, e.g. in the form of land-use planning, is a very effective measure for reducing future coastal flood risk on a regional scale.
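The MLP-based urban change modelling described above can be illustrated with a minimal sketch; the predictor names, toy data and land-demand figure are assumptions for illustration, not the study's actual inputs:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(42)

# Synthetic proxies for drivers of urban development (illustrative names
# only): distance to urban land, distance to coast, elevation, pop. density.
n = 5000
X = rng.random((n, 4))
# Toy rule: cells near existing urban land and the coast (small distances)
# are likelier to urbanize.
p = 1 / (1 + np.exp(6 * (X[:, 0] + X[:, 1]) - 4))
y = rng.random(n) < p  # synthetic observed urban transition, 2000-2012

mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
mlp.fit(X, y)

# Likelihood of transformation per grid cell; cells are then converted to
# urban in descending order of likelihood until the scenario's land
# demand is met (demand figure is hypothetical).
likelihood = mlp.predict_proba(X)[:, 1]
demand = 250
new_urban = np.argsort(likelihood)[::-1][:demand]
```

Repeating the thresholding step with a different `demand` per SSP and time step yields the scenario-dependent urban extents that feed the exposure calculation.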

How to cite: Wolff, C., Nikoletopoulos, T., Hinkel, J., and Vafeidis, A.: What plausible urban coastal futures may look like? Spatially explicit urbanization projections for 10 Mediterranean countries., EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-211, https://doi.org/10.5194/egusphere-egu2020-211, 2019

D2108 |
EGU2020-21955
Ana Laura Costa, Elco Koks, Kees van Ginkel, Frederique de Groen, Lorenzo Alfieri, Francesco Dottori, Luc Feyen, and Thomas Bles

River flooding is among the most profound climate hazards in Europe and poses a threat to its road transport infrastructure. Traditional continental-scale flood risk studies do not accurately capture these disruptions because they are typically grid-based, whereas roads are relatively narrow line elements which are therefore omitted. Moreover, these grid-approaches disregard the network properties of roads, whereas the costs of reduced mobility could largely exceed the costs of the physical damage to the infrastructure.

We address these issues by proposing and applying an improved physical damage assessment coupled with the assessment of mobility disruption for a comprehensive risk assessment at a continental level.

In this study, we introduce an object-based, continental scale flood risk assessment of the European road network. We improve the estimates of direct, physical damage, by drawing road network data from OpenStreetMap, while making optimal use of the available metadata. We also introduce a set of road-specific flood damage functions, which are validated for an observed flood event in Germany. The results of this approach are compared to the traditional, grid-based approach to modelling road transport damage.

Next, we showcase how the object-based approach can be used to study potential mobility disruptions. We present how the network data from OpenStreetMap and the available metadata can be used to assess flood impacts in terms of decreased connectivity, that is, increased distance, time and/or costs. The approach is flexible in physical scope, able to address national and continental resilience assessments and to provide advice on tipping points of service performance. Flexibility is also incorporated in terms of different resilience perspectives, including decision-making by the asset owner, or national or trans-national supply-chain disruption to a particular economic sector or company.
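The connectivity-loss idea (flooding closes road segments, and the cost shows up as extra distance or travel time rather than physical damage) can be sketched with a toy network and a plain Dijkstra search; the network, travel times and closed edge are hypothetical:

```python
import heapq

def shortest_time(graph, src, dst):
    """Dijkstra: minimum travel time between two junctions."""
    dist = {src: 0}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            return d
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")

def undirected(edges):
    """Adjacency list for an undirected weighted graph."""
    g = {}
    for a, b, w in edges:
        g.setdefault(a, []).append((b, w))
        g.setdefault(b, []).append((a, w))
    return g

# Toy road network: edge weight = travel time in minutes.
edges = [("A", "B", 10), ("B", "C", 10), ("A", "D", 25), ("D", "C", 25)]
baseline = shortest_time(undirected(edges), "A", "C")     # 20, via B

# A flood depth above the damage threshold closes edge B-C (hypothetical).
flooded = [e for e in edges if set(e[:2]) != {"B", "C"}]
disrupted = shortest_time(undirected(flooded), "A", "C")  # 50, via D

print(f"extra travel time due to flooding: {disrupted - baseline} min")
```

Aggregating such detour costs over many origin–destination pairs is one way the cost of reduced mobility can come to exceed the direct damage to the asset itself.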

Finally, the risk assessment is discussed based on applications to the impacts of floods on European roads, along with the potential to extend it to multi-hazard assessments (landslides, earthquakes, pluvial flooding) and other types of critical networks.

 

This paper has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 776479 for the project CO-designing the Assessment of Climate CHange costs. https://www.coacch.eu/

 

How to cite: Costa, A. L., Koks, E., van Ginkel, K., de Groen, F., Alfieri, L., Dottori, F., Feyen, L., and Bles, T.: Object-based flood risk modelling of the European road network: physical damage and mobility disruptions, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-21955, https://doi.org/10.5194/egusphere-egu2020-21955, 2020

D2109 |
EGU2020-2400
| NH Division Outstanding ECS Lecture
Vitor Silva

The increase in the global population, climate change, growing urbanization and settlement in regions prone to natural hazards are some of the factors contributing to the increase in economic and human losses due to disasters. Earthquakes represent on average approximately one-fifth of the annual losses, but in some years this proportion can be above 50% (e.g. 2010, 2011). This impact can affect the sustainable development of society, the creation of jobs and the availability of funds for poverty reduction. Furthermore, business disruption of large corporations can result in negative impacts at the global scale. Earthquake risk information can be used to support decision-makers in the distribution of funds for effective risk mitigation. However, open and reliable probabilistic seismic risk models are available for fewer than a dozen countries, which hampers disaster risk management, in particular in the developing world. To mitigate this issue, the Global Earthquake Model Foundation and its partners have been supporting regional programmes and bilateral collaborations to develop an open global earthquake risk model. These efforts led to the development of a repository of probabilistic seismic hazard models, a global exposure dataset, and a comprehensive set of fragility and vulnerability functions for the most common building classes. These components were used to estimate relevant earthquake risk metrics, which are now publicly available to the community.

The development of the global seismic risk model also allowed the identification of several issues that affect the reliability and accuracy of existing risk models. These include the use of outdated exposure information, insufficient consideration of all sources of epistemic and aleatory uncertainty, a lack of results regarding indirect human and economic losses, and an inability to forecast detailed earthquake risk into the upcoming decades. These challenges may render the results from existing earthquake loss models inadequate for decision-making. It is thus urgent to re-evaluate the current practice in earthquake loss assessment, and to explore new technologies, knowledge and data that might mitigate some of these issues. A recent resource that can support the improvement of exposure datasets and the forecasting of exposure and risk into the next decades is the Global Human Settlement Layer, a collection of datasets on the built environment between 1974 and 2010. The consideration of this type of information and the incorporation of large sources of uncertainty can now be supported by artificial intelligence technology, and in particular open-source machine learning platforms. Such tools are currently being explored to predict earthquake aftershocks, to estimate damage shortly after the occurrence of destructive events, and to perform complex calculations with billions of simulations. These are examples of recent resources that must be exploited for the benefit of improving existing risk models, and consequently to enhance the likelihood that risk reduction measures will be efficient.

This study presents the current practice in global seismic risk assessment with all of its limitations, discusses the areas where improvements are necessary, and presents possible directions for risk assessment in the upcoming years.

How to cite: Silva, V.: Global Seismic Risk Assessment: the Wrong, the Right, and the Truth, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-2400, https://doi.org/10.5194/egusphere-egu2020-2400, 2020

D2110 |
EGU2020-16040
Jana Sillmann, Simone Russo, Sebastian Sippel, Brian O'Neill, Monika Barcikowska, Claudia Ghisetti, and Marek Smid

Following the conceptual risk framing of the IPCC, which defines risk as a function of hazard, exposure and vulnerability, we estimate global heatwave risk using a statistical approach that combines the distribution of indicators for heatwave magnitude, population exposure and human development. We fit a generalized extreme value (GEV) distribution to the maxima of the heatwave magnitude index in 10-year blocks to estimate extreme heatwave conditions with a 500-year return period under the current climate (HW500yr). To consider the impact of changes in heatwaves in populated regions of the world, we use a set of global, spatially explicit population projections that are consistent with the new Shared Socioeconomic Pathways (SSPs). As a proxy for vulnerability we use the Human Development Index (HDI), based on the geometric average of three dimensions: health, education and standard of living. We derive an illustrative heatwave risk indicator (expressed in %) for each location (i.e. grid box) of the globe as the product of the probability of occurrence of HW500yr, normalized population density and (1 − HDI), with all components of the product normalized. Using this illustrative heatwave risk indicator on a global scale, we project heatwave risk for global warming of 1.5 and 2 °C, in accordance with the Paris Agreement, for two future pathways of societal development representing low and high vulnerability conditions (SSP1 and SSP4, respectively). This method demonstrates how including a measure of vulnerability can produce a distribution of risk that differs from the distribution of hazard or exposure under different scenarios. Our results show that the heatwave risk for low and very high development countries would be significantly reduced if global warming is stabilized below 1.5 °C and in the presence of rapid social development.
The latter is most important for low development countries in decreasing their vulnerability to heatwaves and other hazards amplified by climate change. The results illustrate how hazard-specific policies could be better informed by analyses that account for vulnerabilities to the respective hazard. However, we also discuss several caveats associated with using a normalized risk indicator on a global scale, with implications for the interpretation of risk on a local scale, including the need for better indicators to describe vulnerability to specific or multiple hazards.
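The risk indicator described in the abstract (normalized hazard probability × normalized population density × normalized (1 − HDI)) can be sketched as follows; the data are synthetic, and the fixed exceedance threshold is only a stand-in for the HW500yr occurrence probability used in the study:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def normalize(x):
    """Min-max normalization to [0, 1]."""
    return (x - x.min()) / (x.max() - x.min())

# Synthetic 10-year block maxima of a heatwave magnitude index
# for 100 grid boxes (20 blocks each).
n_boxes = 100
block_maxima = rng.gumbel(loc=3.0, scale=1.0, size=(20, n_boxes))

# Fit a GEV per grid box; as a stand-in for the probability of occurrence
# of HW500yr, take the fitted probability of exceeding a fixed magnitude.
hazard = np.empty(n_boxes)
for i in range(n_boxes):
    c, loc, scale = stats.genextreme.fit(block_maxima[:, i])
    hazard[i] = stats.genextreme.sf(6.0, c, loc, scale)

pop_density = rng.random(n_boxes) ** 2  # exposure proxy (synthetic)
hdi = rng.uniform(0.4, 0.95, n_boxes)   # vulnerability proxy (synthetic)

# Illustrative risk indicator in %, as the product of the three
# normalized components.
risk = 100 * normalize(hazard) * normalize(pop_density) * normalize(1 - hdi)
```

Because each factor is normalized to [0, 1] before multiplication, the indicator is bounded and comparable across grid boxes, which is what makes the resulting risk map differ from a map of hazard or exposure alone.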

How to cite: Sillmann, J., Russo, S., Sippel, S., O'Neill, B., Barcikowska, M., Ghisetti, C., and Smid, M.: A statistical approach to estimate global heatwave risk, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-16040, https://doi.org/10.5194/egusphere-egu2020-16040, 2020

D2111 |
EGU2020-19606
Viet Dung Nguyen, Ayse Duha Metin, Lorenzo Alfieri, Sergiy Vorogushyn, and Bruno Merz

Flooding is a major problem worldwide, causing many fatalities and economic losses. The quantification of flood risk can be difficult at large spatial scales due to its spatial variability. Traditional risk assessment approaches, which assume an unrealistically homogeneous flood return period across the entire catchment, are often used and hence in many cases lead to misleading results, especially in large-scale applications. In this study, we investigate the influence of spatial dependence on flood risk estimation at national and continental scales by comparing assessments under three spatial dependence assumptions: modelled dependence (MD), complete dependence (CD) and complete independence (CI) of flow return periods. To this end, we develop a copula-based model representing the dependence structure of annual maximum streamflow (AMS) at 507 stations (with basin area > 500 km²) across Europe and use it to generate long-term (10,000 years) spatially coherent AMS at these locations. The generated series at multiple sites are then used to estimate the associated flood loss considering two levels of flood protection (with and without). The flood risk is estimated and aggregated for three representative regions (England, Germany and Europe) and for the three dependence assumptions, considering also the role of the tail dependence of the copulas used. The results highlight that ignoring spatial dependence misestimates flood risk. The deviation from the modelled risk (under- or over-estimation) depends on the assumption of spatial dependence, the tail dependence, the flood protection level and the spatial scale. For example, under the CD assumption for a 200-year return period and considering flood protection, flood risk is overestimated approximately 2.5-, 3- and 3.5-fold in England, Germany and Europe, respectively.
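The copula-based generation of spatially coherent annual maxima can be illustrated with a Gaussian copula and GEV marginals; the three gauges, their parameters and the correlation matrix below are hypothetical, and the study's fitted copula model may well differ:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical setup: 3 gauges with GEV-distributed annual maxima
# (scipy shape/loc/scale) and an assumed spatial correlation structure.
gev_params = [(-0.1, 100.0, 30.0), (-0.05, 250.0, 60.0), (0.0, 80.0, 20.0)]
corr = np.array([[1.0, 0.7, 0.4],
                 [0.7, 1.0, 0.5],
                 [0.4, 0.5, 1.0]])

n_years = 10_000
# 1) correlated standard normals, 2) map to uniforms via the normal CDF,
# 3) invert the GEV marginals -> spatially coherent AMS at all sites.
z = rng.multivariate_normal(np.zeros(3), corr, size=n_years)
u = stats.norm.cdf(z)
ams = np.column_stack([
    stats.genextreme.ppf(u[:, j], c, loc, scale)
    for j, (c, loc, scale) in enumerate(gev_params)
])

# Complete dependence (CD) would reuse one uniform at every site;
# complete independence (CI) would draw independent uniforms per site.
```

Feeding each synthetic year's multi-site flows into a loss model and aggregating over regions is then what separates the MD, CD and CI risk estimates.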

How to cite: Nguyen, V. D., Metin, A. D., Alfieri, L., Vorogushyn, S., and Merz, B.: Ignoring spatial dependence misestimates flood risk at the European scale, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-19606, https://doi.org/10.5194/egusphere-egu2020-19606, 2020

D2112 |
EGU2020-20048
Jacopo Margutti and Marc van den Homberg

Structured datasets of loss data, i.e. data on the impact of past natural disasters, are of paramount importance for informing disaster preparedness programs and forecasting the impact of future disasters. Most existing initiatives aim to manually build such datasets from information provided by governments, humanitarian agencies and researchers. Unfortunately, the quality and completeness of such information is often insufficient, especially for small disasters and/or in areas where these organisations are not active. More often, it is local and national newspapers that report on small disasters. In this contribution, we present a series of algorithms to automatically extract structured loss data from online newspapers, even small ones that are not captured by common news aggregators (e.g. Google News). The algorithms are validated both in terms of accuracy of extraction and consistency with existing datasets; we argue that they provide a valuable tool for collecting loss data in data-poor regions.
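A minimal rule-based sketch of the kind of extraction described above (the snippet, patterns and field names are illustrative; the authors' actual algorithms are not shown here and would also need entity recognition, deduplication and geocoding):

```python
import re

# Toy news snippet with quantified losses.
text = ("Flash floods hit the district on Tuesday, killing 12 people, "
        "injuring 30 and destroying 450 houses.")

# Illustrative patterns mapping phrases to structured loss fields.
patterns = {
    "deaths": r"killing (\d[\d,]*) people|(\d[\d,]*) (?:people )?dead",
    "injured": r"injuring (\d[\d,]*)",
    "houses_destroyed": r"destroying (\d[\d,]*) houses",
}

record = {}
for field, pattern in patterns.items():
    m = re.search(pattern, text)
    if m:
        value = next(g for g in m.groups() if g)  # first non-empty group
        record[field] = int(value.replace(",", ""))

print(record)  # {'deaths': 12, 'injured': 30, 'houses_destroyed': 450}
```

Validation then means comparing such automatically built records against manually curated loss databases, both per field and per event.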

How to cite: Margutti, J. and van den Homberg, M.: Text Mining of Loss Data, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-20048, https://doi.org/10.5194/egusphere-egu2020-20048, 2020

D2113 |
EGU2020-20590
Paola Salvati, Ivan Marchesini, Carmela Vennari, Marco Donnini, Cinzia Bianchi, Alessandro Sarretta, and Domenico Casarano

Most commonly, geo-hydrological hazards (i.e., landslides, floods, sinkholes) occur in response to a single trigger such as an intense rainfall event, a prolonged rainfall period, a rapid snowmelt event, or an earthquake. Multiple damaging processes (phenomena) occurring in response to a single trigger can cause a cumulative socio-economic impact, which is often difficult to quantify and to attribute to each single damaging process (a landslide, a group of landslides, or a single inundation). As a consequence, after a geo-hydrological disaster occurs, media, insurance companies and international institutions publish numerous assessments of the cost of the disaster based on different methodologies and approaches, often reaching different results. At the European level, EC Directives related to natural hazards provide standards for the collection of data, focusing mainly on codifying the processes, their attributes and their spatial extent, while leaving out the important issue of rigorously classifying the damaged elements and the loss data. This lack of standards contributes to the paucity of damage information and cost data, which are fundamental for subsequent ex-post analyses aimed at quantitative risk evaluation. In Italy, despite the frequency of significant socio-economic impacts due to geo-hydrological hazards, few attempts have been made to estimate their economic cost. These loss estimations are mainly based on cost components of the public budget for post-event restorations and reimbursements, hampering the possibility of distinguishing between private and public sector losses. The loss estimates do not distinguish the costs (i) by the type of process (landslides, flash floods, floods and other damaging events) responsible for the damage, or (ii) by expenditure item (restoration actions or mitigation activities).
LAND-deFeND, a recently developed database structure, represents an effort to manage all the issues that can arise when storing, organizing and analysing information on losses related to geo-hydrological hazards with different levels of accuracy and at different geographical scales, from the national to the local scale.

How to cite: Salvati, P., Marchesini, I., Vennari, C., Donnini, M., Bianchi, C., Sarretta, A., and Casarano, D.: An approach to organize loss data related to geo-hydrological hazards, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-20590, https://doi.org/10.5194/egusphere-egu2020-20590, 2020

D2114 |
EGU2020-2429
Cyrielle Dollet and Philippe Guéguen

In moderate-to-low seismic hazard regions, estimating the socio-economic consequences of an earthquake at the urban scale is a costly and difficult task. This study analyses existing global earthquake databases to build a loss flat file of 445 earthquakes since 1967 with a magnitude greater than 4.5. The flat file includes information on the social consequences (fatalities, etc.) and economic losses (direct and indirect costs, number of buildings destroyed or damaged, etc.) of each earthquake. The flat file is completed with the exposed population and GDP at the date of each earthquake, estimated in relation to the ground motion footprint provided by USGS ShakeMap. The completeness of our catalog of social and economic losses is tested through the creation of a synthetic loss database of the 22,856 earthquakes between 1967 and 2015 affecting at least one country from the ISC-GEM database. From these data, we propose more realistic loss models. Occurrence models of human and direct economic losses relative to the exposed population and GDP per capita are then derived, showing that, although the number of casualties and the absolute magnitude of losses increase as a consequence of urban concentration, global losses relative to exposure decrease. Finally, the projection of future losses is discussed.

How to cite: Dollet, C. and Guéguen, P.: Modelling of the earthquake socio-economic and legal consequences at the urban scale, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-2429, https://doi.org/10.5194/egusphere-egu2020-2429, 2020

D2115 |
EGU2020-10797
Robert McCall, Ferdinand Diermanse, Daniel Twigt, Ellen Quataert, and Floortje Roelvink

The impact of extreme weather events on coastal areas around the world is set to increase in the future, through sea level rise, climate change (increasing storm intensity, rainfall and droughts) and continued development and investment in hazard-prone deltaic and coastal environments. Given the changing natural and socio-economic environment, accurate predictions of current and future risk are becoming increasingly important worldwide to mitigate risks. Recent advances in computational power (e.g., cloud computing) and data availability (e.g., the growth of satellite-derived products) are enabling, for the first time, the development of global-scale flood risk models for application in areas where local models are less well developed or prohibitively expensive, or for applications where synoptic global coverage is important. Despite the increasing granularity of these global models and datasets, however, they often still lack the resolution and accuracy to be “locally relevant”, especially where inundation and impact assessments are considered. While a solution to this problem is to downscale global models and datasets to the local scale, setting up local models is hampered by inconsistency between underlying datasets, and the manual effort required to generate downscaled integrated risk models inhibits their global application. To address these issues, we are developing a generalized risk assessment framework, called GHIRAF (Globally-applicable High-resolution Integrated Risk Assessment Framework), which couples data and models to quickly provide locally-actionable information on the impact of historic, current and future extreme weather events worldwide (e.g., storms, extreme rainfall, drought).
The framework is designed to support world-wide efforts to reduce and mitigate risks associated with extreme weather events by aiding prevention (scenario-testing, design) and preparation (Early Warning) for extreme events, as well as supporting response (targeted relief efforts) and recovery (build-back-better) efforts. In this work we discuss application of the framework to study hurricane impacts on the eastern coast of the USA, as well as in data-poor, small island state environments.

How to cite: McCall, R., Diermanse, F., Twigt, D., Quataert, E., and Roelvink, F.: GHIRAF: closing the gap between global models and locally-actionable information, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-10797, https://doi.org/10.5194/egusphere-egu2020-10797, 2020

D2116 |
EGU2020-5121
| solicited
| Highlight
Oliver Wing, Niall Quinn, Paul Bates, Jeff Neal, Chris Sampson, and Andy Smith

Hydraulic modelling at large spatial scales is a field of enquiry approaching maturity, with the flood maps produced beginning to inform wide-area planning decisions, insurance pricing and emergency response. These maps, however, are typically ‘static’; that is, they are a spatially homogeneous representation of a given probability flood. Actual floods vary in their extremity across space: if flooding at a given location is extreme, you may expect proximal locations to be similarly extreme and distal locations to be decreasingly extreme. Methods to account for this stochastically can, broadly speaking, be split into: (i) continuous simulation via a meteorological-hydrological-hydraulic model cascade and (ii) fitting statistical dependence models to samples of river gauges, generating a synthetic event set of streamflows and simulating the hydraulics from these. The former has the benefit of total spatial coverage, but the drawbacks of high computational cost and the low skill of large-scale hydrological models in simulating absolute river discharge. The latter enables higher-fidelity hydraulics by simulating the extremes only and with more accurately defined boundary conditions, yet it is only possible to execute (ii) in gauge-rich regions – excluding most of the planet.

In this work, we demonstrate that a hybrid approach of (i) & (ii) offers a promising path forward for stochastic flood modelling in gauge-poor areas. Inputting simulated streamflows from large-scale hydrological models to a conditional exceedance model which characterises the spatial dependence of discharge extremes produces a very different set of plausible flood events than when observed flows are used as boundary conditions. Yet, if the relative exceedance probabilities of simulated flows – internal to the hydrological model – are used in place of their absolute values (i.e. a return period instead of a value in m3s-1), the observation- and model-based dependence models produce similar events in terms of the spatial distribution of return periods. In the context of flood losses, when using Fathom-US CAT (a state-of-the-art large-scale stochastic flood loss model), the risk of an example portfolio is indistinguishable between the gauge- and model-driven frameworks given the uncertainty in vulnerability alone. This is provided that the model-based event return period is matched with a hydraulic model of the same return period, where the latter is characterised via a gauge-based approach.
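The transformation from simulated flows to model-internal return periods can be sketched as follows. This is an illustrative reconstruction using the Weibull plotting position, not the authors' implementation:

```python
import numpy as np

def to_return_periods(sim_annual_max):
    """Map simulated annual-maximum flows to model-internal return
    periods using the Weibull plotting position P_exc = rank / (n + 1)."""
    q = np.asarray(sim_annual_max, dtype=float)
    n = q.size
    # rank 1 = largest flow, i.e. the smallest exceedance probability
    ranks = np.empty(n)
    ranks[np.argsort(-q)] = np.arange(1, n + 1)
    p_exc = ranks / (n + 1.0)
    return 1.0 / p_exc  # return period in years

flows = [850.0, 1200.0, 640.0, 2100.0, 990.0]
rp = to_return_periods(flows)  # the largest simulated flow maps to a 6-year event
```

Because only ranks are used, any bias in the hydrological model's absolute discharge drops out, which is the key property exploited above.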

How to cite: Wing, O., Quinn, N., Bates, P., Neal, J., Sampson, C., and Smith, A.: Towards global stochastic flood modelling, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-5121, https://doi.org/10.5194/egusphere-egu2020-5121, 2020

D2117 |
EGU2020-1333
Konstantinos Karagiorgos, Daniel Knos, Jan Haas, Sven Halldin, Barbara Blumenthal, Andreas Pettersson, and Lars Nyberg

Pluvial floods are one of the most significant natural hazards in Europe, causing severe damage to urban areas. Following the projected increase in extreme precipitation and ongoing urbanization, these events play an important role in the flood risk management discussion and pose serious risk to the public as well as to the insurance sector. However, this type of flood remains a poorly documented phenomenon. To address this gap, the Swedish Pluvial Modelling Analysis and Safety Handling (SPLASH) project aims to develop new methods and types of data that improve the ability to value flood risk in Swedish municipalities through collaboration between different disciplines.

The SPLASH project investigates the impact of heavy precipitation along the entire risk modelling chain, which is ultimately needed for effective prevention. This study presents a pluvial flood catastrophe modelling framework to identify and assess hazard, exposure and vulnerability in an urban context. An integrated approach is adopted by incorporating ‘rainfall-damage’ patterns, flood inundation modelling, vulnerability tools and risk management. The project is developed on the ‘OASIS Loss Modelling Framework’ platform, jointly with end-users from the public sector and the insurance industry.

The Swedish case study indicates that the framework presented can serve as an important decision-making tool, establishing an arena for collaboration between academia, insurance businesses and rescue services to reduce long-term disaster risk in Sweden.

How to cite: Karagiorgos, K., Knos, D., Haas, J., Halldin, S., Blumenthal, B., Pettersson, A., and Nyberg, L.: Pluvial flood catastrophe modelling: a trans-disciplinary approach, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-1333, https://doi.org/10.5194/egusphere-egu2020-1333, 2020

D2118 |
EGU2020-5611
Samuel Eberenz, Samuel Lüthi, and David N. Bresch

Spatially explicit weather and climate risk assessments utilize impact functions (IFs) to translate hazard intensity into economic impact. However, global-scale risk assessments often lack regionally calibrated IFs. In the case of tropical cyclones (TCs), IFs calibrated for the US have thus far been applied for assessments of direct economic risk in other parts of the world. For example, in industrialized East Asian countries, where many TCs make landfall, these IFs lead to modelled damages orders of magnitude larger than reported. To improve the global representation of TC vulnerability, we calibrate regional IFs in a risk modelling framework with TC hazard intensity represented by high-resolution maximum sustained wind speed. The calibration is based on track data and reported economic damages for 424 TCs making landfall in 62 countries worldwide, accounting for 76% of normalized reported damages from 1980 to 2017.

With our calibration we identify idealized IFs for eight world regions. These eight idealized IFs are based on two complementary optimization functions, one focusing on the deviation per event and the other on total damage per region. The IFs from the two approaches agree well for North America but deviate for other world regions. This reflects large uncertainties in model setup and input data. Sources of these uncertainties can be (i) ocean basin specific TC characteristics, (ii) the simplified representation of TC hazard intensity by wind speed alone, (iii) the assumed inventory of asset exposure per country, (iv) adaptation and development of the built environment over time, and (v) large uncertainties and biases in the reported damage data. In addition to the optimization, we computed best fitting IF steepness for each single TC event. Best fitting IF steepness shows a wide spread within each region, illustrating the model setup’s limitations when it comes to simulating the precise impact of single events. The spread in IF steepness can thus be used to inform probabilistic TC impact modelling beyond the use of a single deterministic IF, implicitly accounting for the uncertainties in model and input data.
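As an illustration of the calibrated quantity, CLIMADA's TC module uses a sigmoidal impact function of the form proposed by Emanuel (2011), whose steepness is governed by the wind speed at which half of the exposed asset value is damaged. The threshold and half-damage values below are illustrative placeholders, not the paper's regional calibration results:

```python
import numpy as np

def impact_fraction(wind_ms, v_thresh=25.7, v_half=74.7):
    """Sigmoidal impact function after Emanuel (2011): fraction of asset
    value damaged as a function of maximum sustained wind speed (m/s).
    v_half sets the steepness and is the kind of parameter calibrated per
    region; the default values here are illustrative placeholders."""
    v = np.maximum(0.0, (np.asarray(wind_ms, dtype=float) - v_thresh)
                   / (v_half - v_thresh))
    return v**3 / (1.0 + v**3)

winds = np.array([20.0, 40.0, 60.0, 80.0])
damage = impact_fraction(winds)  # zero below the threshold, rising towards 1
```

Regional calibration then amounts to choosing the steepness parameter per region so that modelled damages best match the reported damage record under a chosen cost function.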

Despite their limitations, the regionally calibrated IFs represent an improvement compared to globally applying IFs calibrated for the US, allowing for global-scale TC risk assessments. The entire model setup is based on open data and made publicly available as part of the TC module of the free, open-source natural catastrophe impact modelling project CLIMADA (https://github.com/CLIMADA-project/climada_python).

Future research should focus on a more adequate representation of TC hazard as a combination of wind, storm surge and torrential rain. For this to succeed, it would be greatly beneficial for future damage reporting to contain information about this split, too. In addition, research on the reasons behind the large inter-regional differences in calibrated IF steepness would further improve our understanding of global TC vulnerabilities.

How to cite: Eberenz, S., Lüthi, S., and Bresch, D. N.: Global calibration of regional tropical cyclone impact functions, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-5611, https://doi.org/10.5194/egusphere-egu2020-5611, 2020

D2119 |
EGU2020-5732
Oliver Wing, Nicholas Pinter, Paul Bates, and Carolyn Kousky

Vulnerability functions are applied in flood risk models to calculate the losses incurred when a flood interacts with the built environment. Typically, these take the form of relative depth–damage relationships: flood depths at the location of a particular asset translate to damage expressed as a certain percentage of its value. Vulnerability functions are a core component of risk and insurance industry catastrophe (CAT) models, permitting physical models of flood inundation under different scenarios (e.g. certain probabilities) to be translated to more tangible and useful estimates of loss. Much attention is devoted to the physical hazard component of flood risk models, but the final vulnerability component has historically received less attention, despite quantifications of risk being highly sensitive to these uncertain depth–damage functions. For the case of US flood risk models, ‘off-the-shelf’ functions from the US Army Corps of Engineers (USACE) are commonly used. In an analysis of roughly 2 million flood claims under the US National Flood Insurance Program (NFIP), we find these ubiquitous USACE functions are not reflective of real damages at specified flood depths experienced by policy holders of the NFIP. Particularly for smaller flood depths (<1 m), the majority of structural damages are <10% of the building value, compared to the 30–50% stipulated by the USACE functions. A deterministic relationship between depth and damage is shown to be invalid, with the claims data indicating that damages at a certain depth form a beta distribution. Most reported damages are either <10% or >90% of building values, with the proportion of >90% damages increasing with water depth.
The NFIP data also reveal that newer buildings tend to be more resilient (lower damages for a given depth), that surface water flooding is more damaging than fluvial flooding for a given depth, that vulnerability varies dramatically across space, and that even the concept of a relative damage is untenable in its application to expensive properties (e.g. even for depths >1 m, properties worth >$250k rarely experience losses >20% of their value). The findings of this study have significant implications for developers of flood risk models, suggesting current estimates of US flood risk (in $ terms) may be substantial over-estimates.
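The distributional finding can be sketched by fitting a beta distribution to damage ratios on the unit interval. The data below are a synthetic stand-in, not the NFIP claims:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic stand-in for claimed damage ratios (damage / building value);
# the claims data are U-shaped: many small and many near-total losses
ratios = rng.beta(0.5, 0.5, size=2000)

# Fit a beta distribution with location and scale fixed to the unit
# interval (values clipped away from the endpoints for a finite likelihood)
a, b, loc, scale = stats.beta.fit(np.clip(ratios, 1e-6, 1 - 1e-6),
                                  floc=0.0, fscale=1.0)
```

Fitted shape parameters a, b < 1 correspond to the U-shape described in the abstract, with probability mass concentrated at both the <10% and >90% damage ends.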

How to cite: Wing, O., Pinter, N., Bates, P., and Kousky, C.: Patterns in US flood vulnerability revealed from flood insurance "big data", EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-5732, https://doi.org/10.5194/egusphere-egu2020-5732, 2020

D2120 |
EGU2020-6996
Gaby Gründemann, Ruud van der Ent, Hylke Beck, Marc Schleiss, Enrico Zorzetto, and Nick van de Giesen

Understanding the magnitude and frequency of extreme precipitation events is a core component of translating climate observations to planning and engineering design. This research aims to capture extreme precipitation return levels at the global scale. A benchmark of the current climate is created using the global Multi-Source Weighted-Ensemble Precipitation (MSWEP-V2, coverage 1979–2017 at 0.1° resolution) data, using both classical and novel extreme value distributions. Traditional extreme value distributions, such as the Generalized Extreme Value (GEV) distribution, use annual maxima to estimate precipitation extremes, whereas the novel Metastatistical Extreme Value (MEV) distribution also includes ordinary precipitation events. Due to this inclusion, the MEV is less sensitive to local extremes and thus provides a more reliable and smoother spatial pattern. The global-scale application of these methods allows analysis of the complete spatial patterns of the extremes. The generated database of precipitation extremes for high return periods is particularly relevant in otherwise data-sparse regions, providing a benchmark for local engineers and planners.
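The classical block-maxima approach can be sketched as follows, fitting a GEV to annual maxima and reading off return levels. The data here are synthetic illustrations, not MSWEP:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
# Illustrative stand-in for 39 years of annual-maximum daily
# precipitation (mm); the real analysis uses MSWEP-V2 grid cells
annual_max = rng.gumbel(loc=50.0, scale=12.0, size=39)

# Classical block-maxima approach: fit a GEV to the annual maxima
shape, loc, scale = genextreme.fit(annual_max)

def return_level(T):
    """Precipitation amount exceeded on average once every T years."""
    return genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)

rl10, rl100 = return_level(10), return_level(100)
```

Because only the handful of annual maxima enter the fit, the GEV estimate at high return periods is sensitive to individual extreme years, which is the sensitivity the MEV approach reduces by also using the ordinary events.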

How to cite: Gründemann, G., van der Ent, R., Beck, H., Schleiss, M., Zorzetto, E., and van de Giesen, N.: Globally estimated precipitation extremes, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-6996, https://doi.org/10.5194/egusphere-egu2020-6996, 2020

D2121 |
EGU2020-7301
Krescencja Glapiak

The use of Environmental Science for decision making in Insurance

Krescencja Podgorska (Glapiak)[1], Dr John K. Hillier[2], Dr Andreas Tsanakas[3], Dr Melanie King[4], Dr Boyka Simeonova[5], Prof Alistair Milne[6]

There is considerable global interest in evidence-based decision making. An example of this is the use of geoscience within (re)insurance for natural hazards (e.g. geophysical, meteorological), which cause economic losses averaging 120 billion USD per year. Modelling the risk of natural perils plays a vital part in decision-making in the global (re)insurance sector. Thus, a 'model' comprising a decision-making agenda/practices or software tools to form a 'view of risk' is a vital part of the (re)insurance sector’s decision-making strategy. Hence, the (re)insurance sector is of particular interest to environmental scientists seeking to engage with business, and it is relevant to ‘Operational Research’ studies as an example of a sophisticated user of complex models. Much remains poorly understood about how such models shape organisational decision-making behaviour and performance. Furthermore, the drivers for knowledge flow are distinct for each organization’s business model. Therefore, it is crucial to understand how environmental science propagates into key decision-making in the (re)insurance sector. Specifically, the relative strength of the various routes by which science flows into decision-making processes is not yet explicitly recorded. This study determines how geoscience is used in decision-making in (re)insurance (i.e. to form a ‘view of risk’), with the practical aim of providing evidence that academic geoscientists can use when commencing or developing their collaboration with this sector. Data include the views of 28 insurance practitioners collected at a dedicated session of the Oasis LMF conference 2018, a desk-based study of the scientific background of ‘C’-level decision makers, and insights gained through co-writing a briefing note of the observations with industry co-authors and a representative of the UK funding body UKRI.
We show that catastrophe models are the dominant means of scientific input into decision-making in organizations holding (re)insurance risk, but that larger organisations often augment this with in-house teams that include PhD-level scientists. Also, the strongest route for academic scientists to provide direct input is via the ‘Model Adjustment’ function and its technical specialists (e.g. ‘Catastrophe Risk Manager’), but a disconnect is observed in that key decisions are seen as being taken in the ‘Underwriting & Pricing’ function or by senior management, which requires a further step to propagate the environmental science internally.

 


[1] Doctoral Researcher, Geography and Environment, School of Social Sciences, Loughborough University

[2] Senior Lecturer in Physical Geography, Geography and Environment, School of Social Sciences, Loughborough University

[3] Reader in Actuarial Science, Cass Business School, Faculty of Actuarial Science and Insurance, CASS Business School, City, University of London

[4] Lecturer in Systems Engineering, School of Mechanical, Electrical and Manufacturing Engineering, Loughborough University

[5] Lecturer in Information Management, Deputy Director, Centre for Information Management Leader, KDE-RIG, Information Management at the School of Business and Economics, Loughborough University

[6] Professor of Financial Economics, School of Business and Economics, Loughborough University

How to cite: Glapiak, K.: The use of Environmental Science for decision making in Insurance , EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-7301, https://doi.org/10.5194/egusphere-egu2020-7301, 2020

D2122 |
EGU2020-7460
John Hillier, James Done, and Hamish Steptoe

Tropical cyclones (TCs) are among the most costly natural hazards on Earth, and there is a desire to mitigate this risk. It is well established that TC activity relates to ENSO in all oceanic basins (e.g. N. Atlantic). However, when a recent multi-basin review of correlation coefficients with ENSO was applied to a financial model of TC-related losses, there appeared to be no significant inter-relationship between losses in different regions (e.g. US, China). It is therefore of interest to examine the chain of environmental and anthropogenic processes from TC genesis to financial loss to establish how correlations degrade. A number of hypotheses are statistically investigated, primarily using Spearman's coefficient and ranks to decouple dependency structures from the marginal distributions, but also Poisson regression.
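The rank-based dependence measure can be sketched on synthetic loss series. The basin names, coefficients and the ENSO-like driver below are assumptions for illustration, not the study's data:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
# Illustrative annual loss proxies for two basins driven by a shared
# ENSO-like index with opposite sign, plus basin-specific noise
enso = rng.normal(size=40)
losses_us = np.exp(1.0 * enso + rng.normal(scale=1.0, size=40))
losses_china = np.exp(-0.8 * enso + rng.normal(scale=1.0, size=40))

# Spearman's rho operates on ranks, so it is unaffected by the
# heavy-tailed marginal distributions of the losses themselves
rho, pvalue = spearmanr(losses_us, losses_china)
```

Because ranks are invariant under any strictly monotone transform (e.g. rho between a series and its exponential is exactly 1), this isolates the dependency structure from the loss marginals, which is the decoupling the abstract describes.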

How to cite: Hillier, J., Done, J., and Steptoe, H.: Exploring Inter-Basin Correlations of Tropical Cyclones and Tropical Cyclone Losses, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-7460, https://doi.org/10.5194/egusphere-egu2020-7460, 2020

D2123 |
EGU2020-7899
Juho Kupila

In January 2019, with the support of the project partners, the Geological Survey of Finland launched a co-operation project, “HazArctic – Geo-Bio Hazards in the Arctic Region”. It addresses both natural and man-made environmental hazards and risks in Arctic and Subarctic areas, in the northern parts of Scandinavia and Russia.

The project studies, among other topics, the areal extent of harmful acid sulfate soils (ASS) and the role of microbes in geo-bio interactions in potentially hazardous environments. In addition, studies related to mine environments, the stability of closed and open mines, and the reuse of mine tailings will be carried out. The project will also focus on training as well as analyzing and sharing best practices. The results will provide good background knowledge for participant organizations, local stakeholders and actors, authorities and the public. Lessons learnt will guide future actions, for example farming in areas of ASS and safety issues at active and closed mines. Results will be publicly available and can be adapted for use anywhere with similar conditions in natural or man-made environments.

The first year of the project produced new data on the areal extent of ASS in Finland, Sweden and Norway. Studies related to geomicrobiology were also started, as were studies of the safety of an active open pit in Russia. Co-operation between project partners also created new networks in the field of natural hazard research. In 2020, ongoing activities will continue and new ones will begin, for example training related to mine environments.

“HazArctic” is a co-operation project between the Geological Surveys of Finland, Sweden and Norway, the Natural Resources Institute Finland, and the Geological Institute and Mining Institute of the Federal Research Centre “Kola Science Centre of the Russian Academy of Sciences”. It is funded by the Kolarctic CBC 2014–2020 programme (EU, Russia, Norway, Sweden and Finland) with contributions from the project partners. The total budget is approximately €1.29 million and the implementation period is 2019–2021.

How to cite: Kupila, J.: Cross-border co-operation as a baseline for management of natural and man-made hazards, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-7899, https://doi.org/10.5194/egusphere-egu2020-7899, 2020

D2124 |
EGU2020-8243
Philip Ward, Jerom Aerts, Steffi Uhlemann-Elmer, and Dirk Eilander

In this contribution we carried out a comprehensive comparison of flood hazard maps from 8 global flood models for China, based on a collective effort of the flood modelling community to participate in this study. In doing so, we assessed how differences in simulated flood extent between the models lead to differences in simulated exposed GDP and expected annual exposed GDP. This is carried out by addressing the variation in model structures and the variability between flood hazard maps. Our comparison uses both publicly available GFMs (GLOFRIS, ECMWF, CAMA-UT, JRC, and CIMA-UNEP) and industry models (Fathom, KatRisk, and JBA Risk Management) that are applied within the wider re-insurance industry. We expand upon existing global flood model inter-comparison studies (e.g. Trigg et al., 2016; Bernhofen et al., 2018) by including industry models, the pluvial flood component, and the effects of standards of protection on flood hazard and exposure.

China is selected as our case study area because it poses many challenges to flood modelling: data scarcity; a variety of flood mechanisms spanning many climatic zones; complex topography; strong anthropogenic influence on the flood regimes, for example through river training; and a very high concentration of exposure. Moreover, China is prone to severe flood events. For example, in June 2016 alone more than 60 million people were affected by floods, resulting in an estimated damage of $22 bn (CRED 2016).

This abstract is an adaptation of “Global flood hazard map and exposed GDP comparison: a China case study” (Aerts et al., under review) and parts of this work have been presented at EGU 2019.

How to cite: Ward, P., Aerts, J., Uhlemann-Elmer, S., and Eilander, D.: Global flood hazard map and exposed GDP inter-comparison for China, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-8243, https://doi.org/10.5194/egusphere-egu2020-8243, 2020

D2125 |
EGU2020-9812
Inga Sauer, Ronja Reese, Christian Otto, Sven Willner, Katja Frieler, David Bresch, and Tobias Geiger

Recent studies of past changes in precipitation patterns suggest regionally varying but clearly detectable effects of global warming on physical flood indicators such as river discharge. Whether these trends are also visible in economic flood losses has not yet been clearly answered, as changes in trends of damage records may be induced by either climatic or socio-economic drivers. In general, the socioeconomic impact of an extreme weather event is composed of three components: the hazard, the exposure of socioeconomic values to the event, and the vulnerabilities of those values, i.e., their propensity or predisposition to be adversely affected. In this work, we separate the historically observed trends in economic losses from river floods into these three contributions. We then quantify the effect of each driver on the overall change in economic losses from river floods between 1980 and 2010 for different world regions. In particular, this allows us to determine in which regions anthropogenic warming has already contributed to the observed trends in damages. We use flood depth as the biophysical hazard indicator, calculated by combining discharge simulations from 12 global hydrological models of the Inter-Sectoral Impact Model Intercomparison Project (ISIMIP) model ensemble with the river-routing and flood inundation model CaMa-Flood. The hydrological models are driven by observed weather data, and our simulations account for present-day protection standards from the FLOPROS database. Asset losses are estimated by combining gridded asset data with state-of-the-art flood damage functions translating flood depth to the fraction of affected assets, employing the open-source socioeconomic impact modelling framework CLIMADA. Trends in modeled historical flood damages are then compared to observed damages from Munich Re’s NatCatSERVICE database in order to explain residual differences in trends by the three types of drivers.

We first show that the method permits reproduction of the year-to-year variability of observed damages at the regional level. We identify changes in exposure as the main driver of rising damage trends, but also observe significant trends, rising as well as declining, in flood hazard in several regions. Thus, effects of anthropogenic climate change that have already been shown to unfold in discharge patterns partly manifest in economic damages, too. Residual trends in observed losses that cannot be explained by changes in hazard and exposure alone are caused by changes in vulnerability, which can be well explained by trends in GDP per capita. Mostly, rising regional income results in declining vulnerability to river floods, in particular in less developed world regions. However, we also find indications of maladaptation, i.e., in some regions vulnerability increases with GDP per capita.
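The separation of drivers can be illustrated with a schematic one-way counterfactual decomposition of damages D_t = H_t * E_t * V_t, holding two drivers at their first-year values. All trend values below are made-up illustrations, not the paper's results or method:

```python
import numpy as np

def attribute_trend(hazard, exposure, vulnerability):
    """Decompose the damage trend D_t = H_t * E_t * V_t into driver
    contributions by letting one driver vary at a time while the other
    two are held at their first-year (1980) values. A schematic one-way
    counterfactual, not the paper's exact attribution method."""
    H, E, V = (np.asarray(x, dtype=float)
               for x in (hazard, exposure, vulnerability))
    counterfactuals = {
        "hazard": H * E[0] * V[0],
        "exposure": H[0] * E * V[0],
        "vulnerability": H[0] * E[0] * V,
    }
    full = H * E * V
    # contribution = change over the record in each counterfactual series
    return ({k: d[-1] - d[0] for k, d in counterfactuals.items()},
            full[-1] - full[0])

years = np.arange(1980, 2011)
hazard = 1.0 + 0.002 * (years - 1980)   # weak climate-driven hazard trend
exposure = 1.03 ** (years - 1980)       # asset growth of ~3 %/yr
vulnerability = 0.99 ** (years - 1980)  # slowly declining vulnerability
drivers, total = attribute_trend(hazard, exposure, vulnerability)
```

With these illustrative inputs, exposure growth dominates the rising total while declining vulnerability partly offsets it, mirroring the qualitative pattern described above.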

How to cite: Sauer, I., Reese, R., Otto, C., Willner, S., Frieler, K., Bresch, D., and Geiger, T.: Disentangling drivers of historical river flood losses, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-9812, https://doi.org/10.5194/egusphere-egu2020-9812, 2020

D2126 |
EGU2020-12270
Edwin Sutanudjaja

Over-consumption of groundwater is one of the major drivers in the hydrology of many major cities in the world, particularly in delta regions. However, a global assessment to identify cities with declining groundwater tables has not yet been done. In this study we used the global hydrological model PCR-GLOBWB (10 km resolution) to do so. Using this model, we calculated groundwater recharge and river discharge/surface water levels, as well as groundwater abstraction. The output of the PCR-GLOBWB model was then used to force a MODFLOW-based groundwater model simulating spatio-temporal groundwater head dynamics, including groundwater head declines in major cities (mainly in delta regions) due to escalating abstraction of groundwater to meet increasing water demand. Using this approach, we identified a number of critical cities with groundwater tables falling at rates above 25 cm/year (averaged over the period 2000–2010), such as Jakarta, Barcelona, Houston, Los Angeles, Mexico City, Rome and many large cities in China, Libya, India and Pakistan, as well as in the Middle East and Central Asia. However, our results overestimate head declines in Tokyo and some other places where groundwater depletion has been aggressively managed (e.g. groundwater abstraction has been minimized and replaced by importing surface water from elsewhere).

Currently, we are expanding this modeling to the future period (i.e. until 2100), considering different climate scenarios (RCPs 4.5 and 8.5) and socio-economic conditions (SSPs 2 and 5). Our simulation results show that more cities would face falling groundwater heads in the future, not only due to climate change (i.e. in areas that become drier), but also due to increasing population, expansion of existing urban areas and development of new urban areas.

How to cite: Sutanudjaja, E.: Modeling falling groundwater head declines in major cities of the world: current situation and future projection, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-12270, https://doi.org/10.5194/egusphere-egu2020-12270, 2020

D2127 |
EGU2020-13135
Andreas Schaefer, James Daniell, and Jens Skapski

Public attention to natural disasters has increased significantly over the last couple of years. To some extent this can be linked to increased activity on social media networks, from both scientific and informal perspectives. For several years, CATnews has provided impact metrics for earthquakes and other disasters on social media.

The Rapid Impact Modelling framework used by CATnews employs small-scale high-performance computation methods to quickly provide impact results. Such impacts are measured using the modified Mercalli intensity in the case of earthquakes, quantifying the amount of exposed population for each intensity level. For tsunamis or cyclones, quantitative scales such as peak coastal wave heights and wind speeds are used. The resulting impact maps for earthquakes are usually hand-calibrated using felt reports and testimonies from the affected locations, combined with social media crowdsourcing and information from official agencies. In addition, smart algorithms support the calibration to increase overall accuracy. The rapid impact maps and metrics have been frequently cited by the media and used for rapid loss estimation and overall disaster analysis.

Starting as a hobby project, CATnews has become a well-received and frequently shared platform for natural disaster information on social media, especially for earthquakes. With more extensions to the platform on the way, it is hoped that it will become a primary source of information for public science on natural disasters.

In 2020, this will extend to being combined with CATDAT and the Earthquake Impact Database to develop a definitive socioeconomic loss database for earthquakes with a felt intensity archive (as well as tsunami archive) for the current year, and past years.

How to cite: Schaefer, A., Daniell, J., and Skapski, J.: CATnews - Rapid Impact Modelling of Earthquakes for Social Media in the 2020s, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-13135, https://doi.org/10.5194/egusphere-egu2020-13135, 2020

D2128 |
EGU2020-17831
Dirk Eilander, Anaïs Couasnon, Hiroaki Ikeuchi, Sanne Muis, Dai Yamazaki, Hessel Winsemius, and Philip Ward

Current global riverine flood risk studies assume a constant mean sea level boundary. In reality, high sea levels can propagate up a river leading to elevated water levels, and/or the drainage of high river discharge can be impeded by elevated sea levels. Riverine flood risk in deltas might therefore be underestimated if dynamic sea levels are ignored. This contribution presents the first global scale assessment of drivers of riverine flooding in deltas and underlines the importance of including dynamic downstream sea level boundaries in global riverine flood risk studies.

The assessment is based on extreme water levels at 3433 river mouth locations as modeled by the state-of-the-art global river routing model CaMa-Flood, forced with a multi-model runoff ensemble from the EartH2Observe project and bounded by dynamic sea level conditions from the global tide and surge model GTSM. Using this framework, we classified the drivers of riverine flooding at each location into four classes: surge dominant, discharge dominant, compound or insignificant. The classification is based on rank correlations between annual maximum riverine water levels and surge levels, and annual maximum riverine water levels and discharge. We developed a model experiment to quantify the effect of surge on flood levels and impacts.

We find that drivers of riverine flooding are compound at 19.7 % of the locations analyzed, discharge dominant at 69.2 %, and surge dominant at 7.8 %. Compared to locations with either surge or discharge dominant flood drivers, locations with compound flood drivers generally have larger surge extremes, are located in basins with a faster discharge response, and/or have flat topography. Globally, surge exacerbates 1-in-10-year flood levels at 64.0 % of the locations analyzed, with a mean increase of 13.5 cm. While this increase is largest at locations with compound or surge dominant flood drivers, surge also affects flood levels at locations with discharge dominant flood drivers. A small decrease in 1-in-10-year flood levels is observed at 12.2 % of locations analyzed, due to a negative seasonal surge component associated with dominant seasonal gyre circulations. Finally, we show that if surge is ignored, flood depths are underestimated for 38.2 million out of a total of 332.0 million (11.6 %) people expected to be exposed to riverine flooding annually.
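The rank-correlation classification can be sketched as follows; the significance level and decision rules here are simplified assumptions, not the paper's exact criteria, and the series are synthetic:

```python
import numpy as np
from scipy.stats import spearmanr

def classify_driver(level_annual_max, surge_at_max, discharge_at_max,
                    alpha=0.05):
    """Classify the flood driver at a river-mouth location from rank
    correlations between annual-maximum water levels and the coincident
    surge and discharge. Thresholds and rules are illustrative
    simplifications of the classification described in the abstract."""
    rho_s, p_s = spearmanr(level_annual_max, surge_at_max)
    rho_q, p_q = spearmanr(level_annual_max, discharge_at_max)
    surge_sig = p_s < alpha and rho_s > 0
    discharge_sig = p_q < alpha and rho_q > 0
    if surge_sig and discharge_sig:
        return "compound"
    if surge_sig:
        return "surge dominant"
    if discharge_sig:
        return "discharge dominant"
    return "insignificant"

rng = np.random.default_rng(3)
discharge = rng.normal(size=35)  # annual-max discharge proxy
surge = rng.normal(size=35)      # coincident surge proxy
# water levels built from both drivers should classify as compound
driver = classify_driver(discharge + surge, surge, discharge)
```

Rank correlations are used so that the classification does not depend on the (very different) marginal distributions of surge levels and river discharge.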

How to cite: Eilander, D., Couasnon, A., Ikeuchi, H., Muis, S., Yamazaki, D., Winsemius, H., and Ward, P.: The effect of surge on riverine flood hazard and impact in deltas globally, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-17831, https://doi.org/10.5194/egusphere-egu2020-17831, 2020

D2129 |
EGU2020-18584
Daniel Lincke

Global coastal impact and adaptation analysis in the context of climate-change-induced sea-level rise needs precise and standardized datasets. Here, such datasets and their construction are presented. Starting from a high-resolution global digital elevation model, the coastline is extracted, taking into account river mouths and lagoons from a global surface water dataset. The global low-elevation coastal zone (LECZ) is derived by determining all grid cells hydrologically connected to the coastline. Recent surge data are combined with sea-level rise scenarios to partition the global LECZ into local floodplains, and the latest socio-economic and land-use data are used to partition and classify these local floodplains. Local impacts and adaptation responses are not spatially uniform, but depend on a range of conditions, including: i) biophysical conditions such as natural boundaries between floodplains (e.g. hills, rocks) and coastal geomorphology (e.g. sandy versus rocky shores); ii) technical conditions such as existing flood protection infrastructure (e.g. dike rings in the Netherlands); and iii) socio-economic conditions such as administrative boundaries, land use and urban extent (e.g. rural versus urban areas). The latest land-use, beach and wetland datasets are therefore used to partition the coastline of each floodplain into a network of coastline segments which can be used for assessing local shoreline management options.
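The hydrological-connectivity step can be illustrated with a simple breadth-first flood fill over an elevation grid; the 4-neighbour connectivity rule and the 10 m elevation cut-off below are assumptions for illustration, not the dataset's actual parameters.

```python
from collections import deque

def lecz_mask(elev, coast_cells, z_max=10.0):
    """Low-elevation coastal zone: cells with elevation <= z_max that are
    hydrologically connected (4-neighbour) to a coastline seed cell.

    elev        : 2-D list of elevations (m)
    coast_cells : iterable of (row, col) coastline seed cells
    """
    nrow, ncol = len(elev), len(elev[0])
    mask = [[False] * ncol for _ in range(nrow)]
    queue = deque((r, c) for r, c in coast_cells if elev[r][c] <= z_max)
    for r, c in queue:
        mask[r][c] = True
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if (0 <= rr < nrow and 0 <= cc < ncol
                    and not mask[rr][cc] and elev[rr][cc] <= z_max):
                mask[rr][cc] = True
                queue.append((rr, cc))
    return mask
```

The connectivity requirement matters: an inland depression below the elevation cut-off but separated from the coast by higher ground is excluded from the LECZ.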

The generated datasets contain about 1.6 million km of coastline distributed over 87,600 islands. The LECZ comprises 3.14 million km² and partitioning this LECZ with surge and sea-level rise data into floodplains for coastal impact modelling finds about 221,800 floodplains with at least 0.05 km² area.

How to cite: Lincke, D.: Grid-based global coastal zone datasets for coastal impact analysis, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-18584, https://doi.org/10.5194/egusphere-egu2020-18584, 2020

D2130 |
EGU2020-18894
Luis Mediero, Enrique Soriano, David Santillán, Luis Cueto-Felgueroso, and Luis Garrote

Flood risk assessment studies require information about direct damages, which depend on several variables such as water depth, water velocity, flood duration, activity sector and type of building, among others. However, loss functions are usually simplified into flow depth-direct damage curves. Direct flood damages driven by a given flood event can then be estimated from such loss functions using either known or estimated water depths.

In this study, flow depth-direct damage curves are estimated for a set of activity sectors in the Pamplona metropolitan area, located in northern Spain, within the activities of the EIT Climate-KIC SAFERPLACES project. A dataset containing all flood direct damages in the Pamplona metropolitan area in the period 1996-2018 was supplied by the Spanish ‘Consorcio de Compensación de Seguros’ (CSC), benefiting from the fact that the CSC is the insurance company that covers all damages produced by natural hazards in Spain. Flood direct damages are classified by activity sector and postal code. In addition, observed streamflow data at a set of gauging sites on the Ulzama and Arga rivers were supplied by the Ebro River Basin Authority and the Regional Government of Navarre. A set of seven flood events with both streamflow and direct damage data available was selected. Flood hydrographs in the Arga River at Pamplona were obtained with a temporal resolution of 15 minutes. The Regional Government of Navarre supplied the observed flood extents for a set of flood events, with which the two-dimensional hydrodynamic IBER model was calibrated. Flood extents and water depths with a spatial resolution of 1 m were estimated with the calibrated hydrodynamic model for the seven flood events. Combining the dataset of direct damages with standard flow depth-direct damage curves and with the water depths simulated by the hydrodynamic model, flood depth-damage curves were estimated by municipality and postal code. Such curves were obtained for activity sectors, considering residential, commercial and industrial assets.
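One simple way to link insured sector totals, simulated depths and a standard depth-damage curve is to calibrate a sector-specific scale factor; the interpolation points, asset values and scaling approach below are hypothetical illustrations, not the study's actual procedure.

```python
def interp_curve(depth, curve):
    """Relative damage (0-1) at a given water depth, by linear interpolation
    along a standard depth-damage curve given as sorted (depth_m, damage) points."""
    if depth <= curve[0][0]:
        return curve[0][1]
    if depth >= curve[-1][0]:
        return curve[-1][1]
    for (d0, f0), (d1, f1) in zip(curve, curve[1:]):
        if d0 <= depth <= d1:
            return f0 + (f1 - f0) * (depth - d0) / (d1 - d0)

def calibrate_scale(depths, values, curve, observed_total_loss):
    """Scale factor that makes modelled losses (standard curve applied to
    simulated depths and asset values) match the insured total for a sector."""
    modelled = sum(v * interp_curve(d, curve)
                   for d, v in zip(depths, values))
    return observed_total_loss / modelled
```

A scale factor below 1 would indicate that the standard curve overestimates damages for that sector and area, and the rescaled curve becomes the locally calibrated loss function.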

Acknowledgments

This study was supported by the project SAFERPLACES funded by the EIT Climate-KIC. The authors also acknowledge the ‘Consorcio de Compensación de Seguros’ for providing the flood direct damage dataset and the Regional Government of Navarre for providing the observed flood extents for the selected flood events.

How to cite: Mediero, L., Soriano, E., Santillán, D., Cueto-Felgueroso, L., and Garrote, L.: Estimation of flood loss functions by activity sectors in the Pamplona metropolitan area in Spain by using insurance flood direct damage data, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-18894, https://doi.org/10.5194/egusphere-egu2020-18894, 2020

D2131 |
EGU2020-18947
Lulu Liu, Shaohong Wu, and Jiangbo Gao

Risk of climate-related impacts results from the interaction of climate-related hazards (including hazardous events and trends) with the vulnerability and exposure of human and natural systems. Despite the commitments of the Paris Agreement, integrated research on climate change risk that combines risk-causing factors and risk-bearing bodies, and accounts for regional differences in climate impacts, is still missing. In this paper we provide a quantitative assessment of the hazards and socioeconomic risks of extreme events, and of the risks to risk-bearing bodies in China, under global warming of 1.5 and 2.0°C, based on future climate scenarios and a quantitative evaluation theory for climate change risk. For severe heat waves, hazards might intensify significantly: the affected population under 2.0°C warming might increase by more than 60% compared to 1.5°C. Hazards of severe droughts and floods might strengthen under the Representative Concentration Pathway 8.5 scenario. Economic losses might double between warming levels of 1.5 and 2.0°C, and the population affected by severe floods might continuously increase. Under the integrated effects of multiple disasters, the regions with high population and economic risks would be concentrated in eastern China, with their scope gradually expanding westward with socioeconomic development and the intensification of extreme events. High ecological risks might be concentrated in the southern regions of the Yangtze River Basin, while the ecological risk in northern China would expand. High agricultural yield risks might be distributed mainly in the south of the North China Plain, the Sichuan Basin, the area south of the Yangtze River, and western Northwest China, and the risk levels might continuously increase.

How to cite: Liu, L., Wu, S., and Gao, J.: Integrate risk from climate change in China under global warming of 1.5°C and 2.0°C, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-18947, https://doi.org/10.5194/egusphere-egu2020-18947, 2020

D2132 |
EGU2020-19193
Shaohong Wu

Risk assessment of climate change is a basis for adaptation, and quantitative methods are the key step in such assessment. This study aims to establish a quantitative methodology for assessing risks from climate change through the dangerousness of risk-causing factors, the exposure and vulnerability of risk-bearing bodies, and their interrelations. Risks were divided into abrupt hazard risk and gradual hazard risk. Quantitative risk calculation models were established using damage degrees obtained from regression of historical hazard losses and from vulnerability curve construction. Risk assessment was then carried out for three main extreme events: heat wave, drought and flood. According to current status and needs, this study projects the future risks of climate change under different climate scenarios. The method was applied in China and West Africa.
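The hazard-exposure-vulnerability interaction described above is often operationalized as an expected annual loss; the following minimal sketch, with purely hypothetical numbers and a hypothetical vulnerability curve, illustrates the idea rather than the study's actual models.

```python
def expected_annual_loss(hazard_events, exposure_value, vulnerability):
    """Expected annual loss: sum over hazard scenarios of
    annual probability x exposed value x damage degree at that intensity."""
    return sum(p * exposure_value * vulnerability(intensity)
               for p, intensity in hazard_events)

# Hypothetical vulnerability curve: damage degree rises with intensity, capped at 1
vuln = lambda x: min(1.0, 0.2 * x)

# Hypothetical scenarios: (annual exceedance probability, hazard intensity)
events = [(0.1, 1.0),    # frequent, moderate event
          (0.01, 4.0)]   # rare, severe event
eal = expected_annual_loss(events, exposure_value=1000.0, vulnerability=vuln)
```

In practice the vulnerability function would come from the regression of historical losses described above, and the scenario probabilities from the climate projections.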

How to cite: Wu, S.: Quantitatively assessment and application of climate change risk, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-19193, https://doi.org/10.5194/egusphere-egu2020-19193, 2020

D2133 |
EGU2020-19311
Sylvie Parey and Paul-Antoine Michelangeli

Electricity generation and demand are highly dependent on weather conditions, and especially on temperature. Ongoing climate change has already modified the very hot extremes in Europe, and this is projected to continue in the future. Anticipating the necessary adaptations in the electricity sector requires information on the extreme levels that may occur in the coming decades or in more distant future periods. This study aims at comparing different ways of producing maps of extreme temperature levels for different future periods. Extreme temperatures are defined here, as an example, as 20-year return levels, i.e. temperatures reached or exceeded on average once in 20 years over the considered period. The computation of the return levels is based on the methodology described in Parey et al. (2019), which consists of applying statistical extreme value theory to a standardized variable whose extremes can be shown to be stationary. The changes in mean and variance of the summer temperature projected by different climate models from the CMIP5 archive can then be used to derive return levels for any selected future period.
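The retransformation of stationary standardized extremes into future return levels can be sketched as follows; for brevity this uses a Gumbel fit by the method of moments, which is a simplification of the extreme value analysis of Parey et al. (2019), and all inputs are illustrative.

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def gumbel_fit(maxima):
    """Method-of-moments Gumbel fit to a sample of (standardized) annual maxima."""
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((x - mean) ** 2 for x in maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi
    mu = mean - EULER_GAMMA * beta
    return mu, beta

def return_level(mu, beta, T):
    """Level exceeded on average once every T years under a Gumbel model."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

def future_return_level(std_maxima, mean_future, sd_future, T=20):
    """Fit the stationary standardized maxima, then rescale the standardized
    return level with the projected future mean and standard deviation."""
    mu, beta = gumbel_fit(std_maxima)
    return mean_future + sd_future * return_level(mu, beta, T)
```

The key property exploited is that the extreme value fit is done once on the stationary standardized variable, while climate-model changes enter only through the projected mean and variance.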

Producing maps necessitates a dataset with large geographical coverage over Europe. Such datasets are typically gridded, based either on spatial interpolation of station records or on reanalysis products. However, both spatial interpolation and model assimilation tend to smooth the local highest values. Thus, in order to analyze the impact of such smoothing, return levels are computed in the same way from different datasets (station data from the European Climate Assessment & Dataset, the gridded E-OBS database, and the ERA5-Land database) and compared for different future periods.

How to cite: Parey, S. and Michelangeli, P.-A.: Mapping extreme hot temperatures in Europe and their evolutions: sensitivity to data choices, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-19311, https://doi.org/10.5194/egusphere-egu2020-19311, 2020

D2134 |
EGU2020-21189
Maria Chertova, Sanne Muis, Inti Pelupessy, and Philip Ward

Coastal flooding due to tropical cyclones (TCs) is one of the world’s most threatening hazards. The potential increase in the probability of these events in the future, due to climate change, necessitates more accurate simulation of their potential hazard and resulting risks. This contribution is a step in the MOSAIC (MOdelling Sea level And Inundation for Cyclones) project, which aims to develop and validate a computationally efficient, scalable framework for large-scale flood risk assessment, combining cutting-edge disciplinary science and eScience technologies. As a first step, we develop a computationally efficient method for more accurately simulating current and future TC hazard and risk, by incorporating large datasets of tropical cyclones within the Global Tide and Surge Model (GTSM). The starting point is simulating spatially explicit extreme sea levels for a large number of synthetic TCs. The difficulty lies in the high computational cost of GTSM: a one-year simulation takes about 5 days on 24 cores. Until now, each TC has been simulated separately*, which is not feasible when modelling thousands of TC events. Here we present the development of an algorithm for the spatio-temporal optimization of the placement of TCs within GTSM, allowing optimal use of computational resources. This is possible because the region of influence of a particular TC in the model is limited in space and time (e.g. a TC making landfall in Florida will not materially affect water levels near New York). It enables running a large number of TCs in one simulation and significantly reduces the total computation time required. We investigate a large range of parameters, such as the distance between cyclones, time to landfall, and cyclone category, to optimize the distribution of TCs within a single model run.
We demonstrate a significant speedup relative to running the cyclones sequentially, achieved by combining them within a single simulation.

*Muis, S., Verlaan, M., Winsemius, H. C., Aerts, J. C. J. H., & Ward, P. J. (2016). A global reanalysis of storm surge and extreme sea levels. Nature Communications, 7(7:11969), 1–11.
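One simple way to realize such packing is a greedy first-fit assignment, where two TCs may share a run only if their active periods do not overlap or they are sufficiently far apart; the separation distance, time buffer, and the use of a single representative position per TC below are illustrative assumptions, not the project's actual criteria.

```python
import math

def haversine_km(p, q):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (p[0], p[1], q[0], q[1]))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2.0 * 6371.0 * math.asin(math.sqrt(a))

def conflict(a, b, min_sep_km=2000.0, buffer_days=2.0):
    """Two TCs conflict if active at overlapping times (plus a buffer)
    and closer than the minimum separation distance."""
    overlap = (a["start"] < b["end"] + buffer_days
               and b["start"] < a["end"] + buffer_days)
    return overlap and haversine_km(a["pos"], b["pos"]) < min_sep_km

def pack_into_runs(tcs, min_sep_km=2000.0):
    """Greedy first-fit: place each TC in the first run where it conflicts
    with no already-assigned TC, otherwise open a new run."""
    runs = []
    for tc in tcs:
        for run in runs:
            if not any(conflict(tc, other, min_sep_km) for other in run):
                run.append(tc)
                break
        else:
            runs.append([tc])
    return runs
```

Fewer runs means fewer expensive GTSM simulations, since all TCs within a run are forced simultaneously in one model instance.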

How to cite: Chertova, M., Muis, S., Pelupessy, I., and Ward, P.: Incorporating large datasets of synthetic tropical cyclones with Global Tide and Surge Model (GTSM) for global assessment of extreme sea levels, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-21189, https://doi.org/10.5194/egusphere-egu2020-21189, 2020

D2135 |
EGU2020-21355
Svetlana Stripajova, Peter Pazak, Jan Vodicka, and Goran Trendafiloski

The presence of thick, soft, alluvial sediment-filled basins, such as in river deltas, can significantly amplify and prolong earthquake ground motion. Moreover, the high water saturation of such soft sediments combined with cyclic earthquake loading can lead to liquefaction. Basin and liquefaction effects can thus substantially modify the seismic motion and increase the potential losses at a particular location. Well-known examples of high financial losses during earthquakes are the Mw 8.1 1985 Mexico City earthquake (basin effect) and the 2010-2011 Darfield and Christchurch earthquake series (liquefaction). The quantification of these effects is therefore particularly important for current underwriting products, and the industry requires their further detailed consideration in catastrophe models and pricing approaches. Impact Forecasting, Aon’s catastrophe model development center of excellence, has been committed to helping (re)insurers on that matter.

This paper presents a case study of the quantification of basin and liquefaction effects for the Vancouver region, Canada, for a specific scenario: a Mw 7.5 Strait of Georgia crustal earthquake. The southern part of the Vancouver region is located on a deep sedimentary basin created in the Fraser River delta. For such a deep basin, considering only the amplification from a shallow, Vs30-dependent site response term is not sufficient. Therefore, we derived (de)amplification functions for different periods to quantify the basin effect, using the NGA-West2 ground motion prediction equations (GMPEs) for crustal events, which include a basin depth term. The amplification function was derived with respect to standard GMPEs for crustal events in western Canada. Amplification considering site response, including the Vs30 and basin depth terms, can reach values as high as 3 at a period of 0.5 s over the softest and deepest sediments. The liquefaction potential was based on the HAZUS and Zhu et al. (2017) methodologies, calibrated to better reflect local geological conditions and liquefaction observations (Monahan et al. 2010, Clague 2002). We used USGS Vs30 data, enhanced by local seismic and geologic measurements, to characterize soil conditions, and topographical data and Impact Forecasting’s proprietary flow accumulation data to characterize water saturation. Liquefaction hazard is calculated in terms of the probability of liquefaction occurrence and permanent ground deformation. For the chosen scenario, the potential contribution to mean loss could be in the range of 15%-30% due to the basin effect and 35%-75% due to liquefaction, depending on the structural types of the buildings.

How to cite: Stripajova, S., Pazak, P., Vodicka, J., and Trendafiloski, G.: The basin effect and liquefaction in the catastrophe models: Case study – Vancouver region, Canada, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-21355, https://doi.org/10.5194/egusphere-egu2020-21355, 2020

D2136 |
EGU2020-4753
Dominik Paprotny, Heidi Kreibich, Oswaldo Morales-Nápoles, Dennis Wagenaar, Attilio Castellarin, Francesca Carisi, Xavier Bertin, Paweł Terefenko, Bruno Merz, and Kai Schröter

Floods affect many types of tangible assets in Europe. Here, we present novel methods of estimating the exposure and vulnerability of four important categories of assets: (1) residential buildings, (2) household contents, (3) non-residential buildings, (4) machinery and equipment. One set of models was developed for “residential” assets (1+2) and another for “commercial” assets (3+4), within the activities of the EIT Climate-KIC “SaferPLACES” project. All methods are applicable throughout Europe using only openly available datasets at the level of individual buildings. Residential exposure is calculated by combining the estimated useful floor space of buildings with average national replacement costs of buildings and household contents. We devised a method for estimating building height using OpenStreetMap and pan-European raster datasets and transforming it into floor space area. Combining economic and demographic data, we calculated replacement costs of residential assets per m² of floor space in 30 European countries, covering the years 2000–2017. For commercial assets, we created a method to disaggregate economic statistics to obtain detailed building-level estimates of replacement costs of non-residential buildings and machinery/equipment. Finally, we developed two Bayesian Network-based, probabilistic, multivariate models to estimate relative losses to residential and commercial assets. These combine post-disaster survey data from Germany with information on flood hazard and vulnerability; data from events in the Netherlands and Italy are also used in the residential damage model. The exposure and damage models were tested on a case study of the 2010 coastal flood in France, which is a challenge especially for damage models built on the basis of fluvial and pluvial floods.
Using water depths from hydrodynamic modelling of the event, we found that the models underestimated both residential and commercial losses in the two French departments most affected by the 2010 event, but showed a very close match to the average losses recorded in the whole affected area. Good performance of the models was achieved in all investigated subcategories (residential buildings’ structure, household contents, agricultural establishments, companies in services and industry). Our models were compared with alternative damage models and exposure estimates, showing the best accuracy in most aspects analyzed.
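The building-level exposure idea (estimated floor space times a national average replacement cost) can be sketched as follows; the storey height and unit cost below are hypothetical placeholders, not the paper's values.

```python
def residential_exposure(footprint_m2, building_height_m,
                         storey_height_m=2.8, replacement_cost_per_m2=1200.0):
    """Replacement value of one residential building: footprint area times the
    estimated number of storeys (from building height) times the national
    average replacement cost per m2 of floor space."""
    storeys = max(1, round(building_height_m / storey_height_m))
    floor_space_m2 = footprint_m2 * storeys
    return floor_space_m2 * replacement_cost_per_m2
```

In the actual method, building height would come from OpenStreetMap and pan-European raster datasets, and the per-m² cost would vary by country and year; here both are fixed constants for illustration.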

How to cite: Paprotny, D., Kreibich, H., Morales-Nápoles, O., Wagenaar, D., Castellarin, A., Carisi, F., Bertin, X., Terefenko, P., Merz, B., and Schröter, K.: Flood exposure and vulnerability estimation methods for residential and commercial assets in Europe, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-4753, https://doi.org/10.5194/egusphere-egu2020-4753, 2020

D2137 |
EGU2020-5117
Sara Lindersson, Luigia Brandimarte, Johanna Mård, and Giuliano Di Baldassarre

The availability of planetary-scale geospatial datasets that can support the study of water-related disasters in the Anthropocene is rapidly growing. We review over one hundred global and free datasets allowing spatiotemporal analyses of floods, droughts and their interactions with human societies. The purpose of structuring a data collection for a broad range of variables is to foster data exchange and illustrate research opportunities across scientific disciplines.

Our collection of datasets confirms that the availability of geospatial data capturing hydrological hazards and exposure is far more mature than that of data capturing vulnerability aspects. We not only highlight data applications by listing a selection of recent studies exploiting these global datasets, but also discuss the challenges of using them in comparative studies. Specific discussion points include spatiotemporal resolution and coverage, inequalities in geographic representation, the omission of detailed information at large scales, data consistency and accessibility, and dependencies between datasets that prohibit interaction studies.

How to cite: Lindersson, S., Brandimarte, L., Mård, J., and Di Baldassarre, G.: Global and free geospatial datasets for the study of floods, droughts and human societies, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-5117, https://doi.org/10.5194/egusphere-egu2020-5117, 2020