
Session 3

State-of-the-art in computational geosciences
Convener: Victor Vilarrasa | Co-conveners: Jorge Macías, Alice-Agnes Gabriel
Orals
| Thu, 25 May, 09:00–13:00
Poster
| Attendance Thu, 25 May, 17:00–18:00
This session will consider the current state and the transition of European computational geosciences: what are the emerging perspectives, bottlenecks, and challenges in geosciences, geohazards, and HPC services for a safer world (hazards and operational services)?

Orals: Thu, 25 May | Sala d'Actes

09:00–10:00 | GC11-solidearth-61 | solicited
Nicola Castelletto

Carbon capture and storage (CCS) is one of the most important technologies for achieving large-scale reductions in global carbon dioxide (CO2) emissions. The essence of CCS is to capture CO2 produced at power plants and industrial facilities and transport it to safe, permanent storage deep underground. Reducing CO2 emissions into the atmosphere is crucial to cut the carbon footprint of our society. The evaluation of candidate CO2 storage sites requires predictive simulation capabilities to assess site capacity and safety. We present an overview of the GEOS multiphysics simulation platform, an open-source simulator capable of serving as the computational engine for CCS evaluation workflows. We will discuss the development path of GEOS and the motivations for transitioning from a collection of smaller single-institution code development efforts to a multi-institution collaboration. We will describe the development of a discretization-data infrastructure, a standardized approach to solving single and coupled physics problems, and a strategy to achieve reasonable levels of performance portability across hardware platforms. We will outline the approach to documentation and the planned methods of user interaction as the growth of the user base accelerates.

 

Portions of this work were performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07-NA27344.

 

How to cite: Castelletto, N.: Simulation of Geological CO2 Storage with the GEOS Open-Source Multiphysics Simulator, Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-61, https://doi.org/10.5194/egusphere-gc11-solidearth-61, 2023.

10:00–10:15 | GC11-solidearth-22
Arnau Folch

The second phase (2023-2026) of the EuroHPC Center of Excellence for Exascale in Solid Earth (ChEESE-2P), funded by HORIZON-EUROHPC-JU-2021-COE-01 under the Grant Agreement No 101093038, will prepare 11 European flagship codes from different geoscience domains. Codes will be optimised in terms of performance on different types of accelerators, scalability, containerisation, and continuous deployment and portability across tier-0/tier-1 European systems as well as on novel hardware architectures emerging from the EuroHPC Pilots (EuPEX/OpenSequana and EuPilot/RISC-V) by co-designing with mini-apps. Flagship codes and workflows will be combined to form a new generation of 9 Pilot Demonstrators (PDs) and 15 related Simulation Cases (SCs) representing capability and capacity computational challenges selected based on their scientific importance, social relevance, or urgency. On the other hand, the first phase of ChEESE was pivotal in leveraging an ecosystem of European projects and initiatives tackling computational geohazards that will benefit from current and upcoming exascale EuroHPC infrastructures. In particular, Geo-INQUIRE (2022-2024, GA No 101058518) and DT-GEO (2022-2025, GA No 101058129) are two ongoing Horizon Europe projects relevant to the Solid Earth ecosystem. The former will provide virtual and trans-national service access to data and state-of-the-art numerical models and workflows for monitoring and simulating the dynamic processes in the geosphere at unprecedented levels of detail and precision. The latter will deploy a prototype Digital Twin (DT) on geophysical extremes, including 12 self-contained Digital Twin Components (DTCs) addressing specific hazardous phenomena from volcanoes, tsunamis, earthquakes, and anthropogenically-induced extremes, to enable precise data-informed early warning systems, forecasts, and hazard assessments across multiple time scales. All these initiatives liaise, align, and synergise with EPOS and longer-term mission-like initiatives like Destination Earth.

How to cite: Folch, A.: HPC projects in the Solid Earth ecosystem, Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-22, https://doi.org/10.5194/egusphere-gc11-solidearth-22, 2023.

10:15–10:30 | GC11-solidearth-14
Louise Cordrie, Jacopo Selva, Fabrizio Bernardi, Roberto Tonini, Jorge Macías Sánchez, Carlos Sánchez Linares, Steven Gibbons, and Finn Løvholt

Tsunami urgent computing procedures quantify the potential hazard due to a seismically-induced tsunami right after an earthquake, that is, within minutes to a few hours. The hazard is quantified by simulating the tsunami from source to shore, taking into account the uncertainty in the source parameters and the uncertainty associated with the wave generation, propagation, and inundation.

In the European eFlows4HPC project, an HPC workflow for urgent computing of tsunami hazard assessment is currently being developed, consisting of the following steps: 1) retrieval of parameters for the tsunamigenic earthquake (magnitude, hypocentre and associated uncertainties), 2) definition of a seismic source ensemble, 3) simulation of the tsunami generated by each scenario in the ensemble, 4) aggregation of the results to produce an estimate of tsunami hazard, which also incorporates a basic treatment of uncertainty modelling and 5) update of the ensemble based on incoming data.

Initially implemented on the Power-9 machine at BSC, the workflow has been fully embedded into a PyCOMPSs framework that enables parallel task execution and integrates full tsunami simulations for the first time. The tsunami numerical model (Tsunami-HySEA) computes the tsunami from the source to coastal impact using nested grids with resolution from kilometres to meters.
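A minimal sketch of how the five steps above might be expressed as PyCOMPSs tasks (an illustration under assumed names and placeholder physics, not the eFlows4HPC implementation):

```python
import numpy as np
from pycompss.api.task import task
from pycompss.api.api import compss_wait_on

@task(returns=1)
def build_ensemble(event, n=100):
    # Step 2: sample a source ensemble around the reported parameters
    rng = np.random.default_rng(0)
    return [dict(event, Mw=event["Mw"] + rng.normal(0.0, 0.2)) for _ in range(n)]

@task(returns=1)
def run_tsunami_hysea(scenario):
    # Step 3: placeholder for one GPU Tsunami-HySEA run (returns a max height)
    return 0.4 * 10 ** (scenario["Mw"] - 7.0)

event = {"Mw": 7.0, "lon": 27.4, "lat": 36.9, "depth_km": 10.0}      # step 1 (from alert)
scenarios = compss_wait_on(build_ensemble(event))
heights = compss_wait_on([run_tsunami_hysea(s) for s in scenarios])  # tasks run in parallel
hazard = np.percentile(heights, [50, 98])   # step 4: aggregate into a hazard estimate
# step 5: rebuild the ensemble and repeat as new observations arrive
```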

To limit the number of simulations and converge faster towards stable hazard estimates, new methods for defining the seismic source ensembles have been developed. When applied to several past earthquakes and tsunamis (e.g., the 2003 Boumerdes and the 2017 Kos-Bodrum earthquakes), our new sampling strategy yielded a reduction of 1 to 2 orders of magnitude in ensemble size, allowing a drastic reduction in the computational effort. This reduction may be exploited to improve tsunami simulation accuracy, increasing the computational effort available for each simulation at the same overall cost. The workflow also allows the integration of new incoming data (focal mechanism, seismic or tsunami records) for an “on the fly” update of the probabilistic tsunami forecast (PTF) based on this new information.

The improvement of the workflow through a well-defined ensemble of scenarios, realistic simulations, and the integration of incoming data strongly reduces the uncertainty and yields an update of the probabilistic forecasts without compromising their accuracy. This can be crucial in mitigating the risk far from the seismic source, and in improving risk management by better informing decision-making in an emergency framework.

How to cite: Cordrie, L., Selva, J., Bernardi, F., Tonini, R., Macías Sánchez, J., Sánchez Linares, C., Gibbons, S., and Løvholt, F.: Complete workflow for tsunami simulation and hazard calculation in urgent computing using HPC services, Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-14, https://doi.org/10.5194/egusphere-gc11-solidearth-14, 2023.

Coffee break
11:00–11:15 | GC11-solidearth-35
Carlos Paredes, Marcos David Márquez, and Miguel Llorente

Remote sensing data and numerical simulation models of lava flows have been combined to assess the possibility of rapid, real-time response during the effusive crisis of the recent Cumbre Vieja eruptive episode (19 September – 13 December 2021). Here, we use the monitoring products openly distributed by COPERNICUS through the Emergency Management Service (EMSR546) and by the Cabildo Insular de La Palma (daily-updated polygons of the extent of the lava flow), together with the lava flow emplacement as simulated with VolcFlow during the first seven days of the eruption, supported by the location of the effusive foci provided by the IGN. The morphometric analysis of the satellite data has allowed us to estimate, assuming a non-Newtonian behaviour of the lava, the emitted flows and their viscosities, using expressions based on the morphological dimensions, the advance speed, and the density of the flows. The morphometric values thus obtained have been used as initial conditions for the numerical calibration, performed by minimising the Jaccard coefficient as the objective function, although other geometric measures can be used as functionals to be minimised. Although we have designed and presented a task flow as a procedure to perform a semiautomatic, dynamic calibration over time of the rheological parameters needed to simulate the day-to-day evolution of the lava flooding zone from the recorded remote information, for its validation we have searched for the solution of the optimisation problem manually.
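A minimal sketch of the calibration idea (my illustration, not the authors' code): minimize one minus the Jaccard coefficient between observed and simulated lava footprints by varying a rheological parameter; `simulate_lava_extent` is a hypothetical wrapper around a VolcFlow run.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def jaccard(a, b):
    """Jaccard coefficient |A ∩ B| / |A ∪ B| of two boolean rasters."""
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 0.0

def misfit(viscosity, observed):
    simulated = simulate_lava_extent(viscosity)  # hypothetical VolcFlow wrapper
    return 1.0 - jaccard(observed, simulated)

# observed = boolean raster of the EMSR546 lava polygon for the calibrated day
# best = minimize_scalar(misfit, bounds=(1e6, 1e8), args=(observed,), method="bounded")
```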

The results have allowed us to obtain a Jaccard coefficient between 85% and 60%, with a computation time, including calibration, shorter than the 7 days of simulated lava flow. Also, an emission rate of about 140 m3/s of lava has been estimated during the first 24 h of the eruptive process, which decreased asymptotically to 60 m3/s. This value can be cross-checked with information from other remote sources using the time-averaged discharge rate (TADR). Viscosity varied from 8 × 10^6 Pa s, or a yield strength of 42 × 10^3 Pa, in the first hours, to 4 × 10^7 Pa s and 35 × 10^3 Pa, respectively, during the remainder of the seven days. In addition, the modelling allowed mapping the likely evolution of the lava flow fields until 27 September, with an acceptable lava height distribution for the highest values and a Jaccard coefficient of 85%, in order to determine the behaviour of the available response time, according to the computational cost, for the numerical estimation of the rheology and the generation of real-time forecasts of the evolution.

This integration of satellite data with numerical model calibration for parametric estimation of the La Palma 2021 eruption holds great promise for providing a real-time response system to other future volcanic eruptions and providing the necessary information, mainly in the form of dynamic evolution maps, for efficient emergency preparedness and management.

How to cite: Paredes, C., Márquez, M. D., and Llorente, M.: Can we model lava flows faster than real-time to assist on a first volcanic emergency response?, Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-35, https://doi.org/10.5194/egusphere-gc11-solidearth-35, 2023.

11:15–11:30 | GC11-solidearth-23 | ECS
Iva Tojčić, Cléa Denamiel, and Ivica Vilibić

Meteotsunami events – tsunami-like ocean waves driven by atmospheric disturbances – are, by nature, rare, specific to certain geographical regions, and highly variable in time. Consequently, the coastal hazards due to these types of events are known to be difficult to forecast with the state-of-the-art numerical models presently applied around the world.

In order to help local communities better prepare for these destructive events (e.g., set temporary protection against flooding and waves, avoid swimming, etc.) and minimize the losses, the Croatian Meteotsunami Early Warning System (CMeEWS) has recently been implemented in the Adriatic Sea in testing mode. The CMeEWS is mostly based on the Adriatic Sea and Coast (AdriSC) modelling suite and uses an innovative deterministic-stochastic approach for extreme sea-level event predictions. It provides meteotsunami hazard forecasts depending on (1) daily deterministic forecasts by coupled kilometer-scale atmosphere-ocean models, (2) atmospheric observations, and (3) stochastic forecasts of extreme sea-level distributions at endangered locations derived with a surrogate model approach. Some of these steps require substantial computational resources and need an optimized data flow, which ultimately defines the operability of the service.

Here, the advantages but also the drawbacks of such an approach will be presented through several applications of the AdriSC modelling suite during meteotsunami events in the Adriatic Sea. The future challenges concerning meteotsunami extreme sea-level modelling will be discussed and some potential avenues to further develop the model skills will be considered.

How to cite: Tojčić, I., Denamiel, C., and Vilibić, I.: Modelling of extreme sea-level hazards: state-of-the-art and future challenges, Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-23, https://doi.org/10.5194/egusphere-gc11-solidearth-23, 2023.

11:30–12:30
12:30–13:00

Poster: Thu, 25 May, 17:00–18:00 | Poster area (V217)

P1 | GC11-solidearth-1 | ECS
Thomas Y. Chen

Due to anthropogenic climate change, the frequency and intensity of natural disasters are only increasing. As supercomputing capabilities increase in an era of computational geosciences, artificial intelligence has emerged as a key tool in assessing the progression and impact of these disasters. Recovery from extreme weather events is aided by machine learning-based systems trained on multitemporal satellite imagery data. We work on shifting paradigms by seeking to understand the inner decision-making process (interpretability) of convolutional neural networks (CNNs) for damage assessment in buildings after natural disasters, as these deep learning algorithms are typically black boxes. We compare the efficacy of models trained on different input modalities, including combinations of the pre-disaster image, the post-disaster image, the disaster type, and the ground truth of neighboring buildings. Furthermore, we experiment with different loss functions and find that ordinal cross-entropy loss is the most effective criterion for optimization. Finally, we visualize inputs by creating gradient-weighted class activation mappings (Grad-CAM) on the data, with the end goal of deployment. Earth observation data harnessed by deep learning and computer vision is useful not only for disaster assessment but also for understanding the other impacts of our changing climate, from marine ecology to agriculture in the Global South.
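One common way to realize an ordinal cross-entropy-style loss for ordered damage classes (a hedged sketch of the general technique; the authors' exact criterion may differ) is to encode a damage level k out of K classes as K-1 cumulative binary targets, so that confusing distant classes costs more than confusing neighbours:

```python
import torch
import torch.nn.functional as F

def ordinal_cross_entropy(logits, levels):
    """logits: (batch, K-1) threshold scores; levels: (batch,) integer classes."""
    k_minus_1 = logits.shape[1]
    # target[i, j] = 1 if the damage level of sample i exceeds threshold j
    targets = (levels.unsqueeze(1) > torch.arange(k_minus_1)).float()
    return F.binary_cross_entropy_with_logits(logits, targets)

loss = ordinal_cross_entropy(torch.randn(8, 3), torch.randint(0, 4, (8,)))  # K = 4
```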

How to cite: Chen, T. Y.: Climate Adaptation and Disaster Assessment using Deep Learning and Earth Observation, Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-1, https://doi.org/10.5194/egusphere-gc11-solidearth-1, 2023.

P2 | GC11-solidearth-20
Thomas Y. Chen and Luke Houtz

As climate change advances, machine learning approaches have been developed to detect and assess the impacts of natural hazards and extreme weather events on infrastructure and communities. One especially useful source of data is Earth observation and satellite imagery. For example, deep neural networks are trained on multitemporal Earth observation data to perform inference on pairs of high-resolution pre-earthquake and post-earthquake imagery and output a prediction of the damage severity for any given area. Different modalities of imagery, such as infrared imagery, can be particularly useful for detecting damage. As state-of-the-art models are trained and published in the scientific literature, it is important to consider the deployability of the algorithms. Key bottlenecks to successful deployment in the exascale era include the interpretability of neural networks, computer visualization of outputs, as well as runtime dependencies of the model and memory consumption. An additional consideration is the climate impact of the significant computing resources required by large, complex models that are supposed to aid in climate adaptation. We discuss various methods of real-world deployment, including the use of drones that analyze multitemporal change in real time. Finally, we emphasize the importance of bias mitigation in machine learning and computer vision models, examining recent cutting-edge techniques like the REVISE tool, which thoroughly probes geographical biases in big data. This is necessary because AI requires large quantities of data, and its predictions on unseen data are contingent on the data it has already seen.

How to cite: Chen, T. Y. and Houtz, L.: Operationalizing deployable hazard detection technology based on machine learning, Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-20, https://doi.org/10.5194/egusphere-gc11-solidearth-20, 2023.

P3 | GC11-solidearth-2 | ECS
Ayyoub Sbihi, Hajar El talibi, Hasnaa Harmouzi, and Mohamed Mastére

Morocco is a country with a long geological history, tracing several orogenies. The most recent, the Alpine orogeny, gave rise to the Rif chain through the collision of the African and Eurasian tectonic plates. This activity continues to predominate because of the ongoing convergence of the plates and the punching of the Alboran microplate. This results, among other things, in the decompression of rock masses and the reopening of inherited discontinuities. Together with other soil-geological, climatic, topographical, and anthropogenic factors, these make the Rif unquestionably the area most exposed to natural hazards, including land instability phenomena. The effects of this hydro-gravity hazard are all the more important when they affect more or less vulnerable inhabited areas.

The Al Hoceima region, part of the Rif, presents several indicators of instability. While some areas remain relatively stable, others are subject to factors that may generate ground movement.

The objective of this work is to analyze the relationship between the mapped ground movements of Al Hoceima province and the key geological parameters, namely lithology and fracturing. Using GIS tools, we analyzed the spatial distribution of ground movements with respect to the different classes of these two parameters, through a two-stage geostatistical analysis.

 

Keywords: Risk, Cartography, GIS, Remote sensing, ground movements, Geostatistics, Al Hoceima.

How to cite: Sbihi, A., El talibi, H., Harmouzi, H., and Mastére, M.: Contribution of geomatic tools for the study of geological control of ground movements in the province of Al Hoceima - Northern Morocco, Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-2, https://doi.org/10.5194/egusphere-gc11-solidearth-2, 2023.

P4 | GC11-solidearth-3
Tomaso Esposti Ongaro and Mattia de' Michieli Vitturi

Explosive volcanic eruptions are characterized by the ejection into the atmosphere of volcanic gases and fragments of magma and/or lithics at high temperature, pressure, and velocity. They encompass a broad range of magnitudes, with volumes of ejecta spanning from less than 10^6 m^3, to 10^9-10^11 m^3 for Plinian eruptions, up to the largest known volcanic events, able to erupt thousands of km^3 of magma. Phreatic eruptions are among the smallest in this range; they do not involve the eruption of fresh magma, but are instead triggered by a sudden rise of pressure and temperature in a shallow magmatic-hydrothermal system. Despite their relatively small size, phreatic eruptions are frequent on Earth and difficult to anticipate, and therefore represent a significant hazard, as testified by the recent eruptions in Tongariro’s Te-Maari crater (NZ, 2012) and the tragic development of events at Ontake (JP, 2014) and Whakaari/White Island (NZ, 2019).

The main challenges of the numerical simulation of explosive volcanic phenomena have traditionally been identified in the complex fluid dynamics of polydisperse multiphase mixtures (with particle grains ranging from a few microns to metres) and in the extremely broad range of relevant dynamical scales characterizing compressible turbulent flows of gas and particles in the atmosphere. Three-dimensional, high-performance computer models based on different approximations of the multiphase flow theory have been designed to simulate the fluid dynamics of explosive eruptions, and to define hazard and impact scenarios. However, until now, it was difficult to quantify the uncertainty associated with numerical predictions.

We here discuss the present bottlenecks and challenges of the 3D modelling of phreatic volcanic eruptions in the quest for the urgent definition of impact scenarios and probabilistic hazard assessment at Vulcano island (Aeolian archipelago, Italy). Exascale computing in these applications offers the opportunity to increase the complexity of the physical model (including new key processes such as the flashing of liquid water), to describe the wide range of lithic fragments ejected during the eruption, to achieve unprecedentedly high spatial resolution at the source and close to the terrain, and to perform large ensembles of numerical simulations to quantify the epistemic uncertainty associated with the model initial and boundary conditions.

Challenges associated with the development, maintenance, and porting of numerical models to new HPC architectures are finally discussed.

How to cite: Esposti Ongaro, T. and de' Michieli Vitturi, M.: Three-dimensional, multiphase flow numerical models of phreatic volcanic eruptions., Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-3, https://doi.org/10.5194/egusphere-gc11-solidearth-3, 2023.

P5 | GC11-solidearth-5 | ECS
Auregan Boyet, Silvia De Simone, and Víctor Vilarrasa

Fluid injection in subsurface reservoirs often induces seismicity, which is a limiting factor in the deployment of geo-energies, as is the case for Enhanced Geothermal Systems (EGS). EGS are commonly deep granitic reservoirs subject to hydraulic stimulation in order to enhance the fracture permeability and, consequently, the heat production. Injection-induced seismicity also occurs after injection stops, and in many cases the largest earthquakes occur after the shut-in. This counterintuitive post-injection large-magnitude seismicity is still not well understood, and modelling it is necessary to improve the understanding of the processes triggering the seismicity. Pressure-driven processes, such as pore pressure increase and poroelastic stress/strain variations, have been identified as triggers of seismicity, together with stress interactions, thermal disparities, and geomechanical interactions. We design a coupled hydro-mechanical 2D model of the well-known case of post-injection induced seismicity at the Basel EGS (Deep Heat Mining Project at Basel, Switzerland, 2006). We use CODE_BRIGHT, a finite element method software able to perform monolithic coupled thermo-hydro-mechanical analysis in geological media. The faults respond to a Mohr-Coulomb failure criterion with strain weakening and dilatancy, which allows simulating fault reactivation and its aperture variation. The model is able to reproduce the pressure and stress variations, and the consequent fault reactivations, through the simulations. The Basel EGS has been well documented and its characteristics are available. We are able to reproduce the spatio-temporal induced seismicity. Yet, our current numerical method requires long computation times. To speed up simulations, we simplify the model geometry by grouping faults that yield similar static stress transfer, computed with the code Coulomb3, which uses an analytical solution to compute stress changes caused by fault slip. The combination of numerical and analytical solutions is an effective way of obtaining faster computing models. Simultaneously assimilating monitoring data in real time with an efficient computing model would enable a better understanding of the effects of fluid injection on the stability of the reservoir and, potentially, the mitigation of the induced seismicity.
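A minimal sketch of a Mohr-Coulomb strength law with linear strain weakening, the kind of fault behaviour referred to above (parameter values are invented placeholders):

```python
import numpy as np

def shear_strength(sigma_n_eff, plastic_strain, cohesion=5e6,
                   phi_peak=np.radians(31.0), phi_res=np.radians(25.0),
                   strain_crit=1e-3):
    """tau_f = c + sigma'_n tan(phi), with phi degrading from peak to residual."""
    w = np.clip(plastic_strain / strain_crit, 0.0, 1.0)  # weakening fraction
    phi = (1.0 - w) * phi_peak + w * phi_res
    return cohesion + sigma_n_eff * np.tan(phi)

# A fault patch reactivates when the acting shear stress exceeds tau_f:
tau_f = shear_strength(sigma_n_eff=20e6, plastic_strain=5e-4)  # in Pa
```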

How to cite: Boyet, A., De Simone, S., and Vilarrasa, V.: Hydro-mechanical modeling of injection-induced seismicity at the Deep Heat Mining Project of Basel, Switzerland, Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-5, https://doi.org/10.5194/egusphere-gc11-solidearth-5, 2023.

P6 | GC11-solidearth-6 | ECS
Iman R. Kivi, Roman Y. Makhnenko, Curtis M. Oldenburg, Jonny Rutqvist, and Victor Vilarrasa

The majority of available climate change mitigation pathways, targeting net-zero CO2 emissions by 2050, rely heavily on the permanent storage of CO2 in deep geologic formations at the gigatonne scale. The spatial and temporal scales of interest to geologic carbon storage (GCS) raise concerns about CO2 leakage to shallow sediments or back into the atmosphere. The assessment of CO2 storage performance is subject to the huge computational costs of numerically simulating CO2 migration across geologic layers at the basin scale, and is therefore restricted in practice to multi-century periods. Here, we present a computationally affordable and yet physically sound model to understand the likelihood of CO2 leakage over geologic time scales (millions of years) (Kivi et al., 2022). The model accounts for vertical two-phase flow and transport of CO2 and brine in a multi-layered system, comprising a sequence of aquifers and sealing rocks from the crystalline basement up to the surface (a total thickness of 1600 m), representative of a sedimentary basin. We argue that the model is capable of capturing the dynamics of CO2 leakage during basin-wide storage because the lateral advancement of the CO2 plume injected from a dense grid of wellbores transforms into buoyant vertical rise within a short period after shut-in. A critical step in the proposed model is its initialization, which should reproduce the average CO2 saturation column and pressure profiles. We initialize the model by injecting CO2 at a constant overpressure into an upper lateral portion of the target aquifer while the bottom boundary is permeable to brine, resembling brine displacement by the CO2 plume or leakage at basin margins. The optimum model setting can be achieved by adjusting the brine leakage parameter through calibration. We solve the governing equations using the finite element code CODE_BRIGHT. Discretizing the model with 7,100 quadrilateral elements and using an adaptive time-stepping scheme, the CPU time for the simulation of CO2 containment in the subsurface for 1 million years is around 140 hours on a 2.5 GHz Xeon CPU. The obtained results suggest that the upward flow of CO2 in free phase is strongly hindered by the sequence of caprocks, even if they are pervasively fractured. CO2 leakage towards the surface is governed by the intrinsically slow molecular diffusion process, featuring aqueous CO2 transport rates as low as 1 meter per several thousand years. The model shows that GCS in multi-layered geologic settings is extremely unlikely to be associated with leakage, implying that GCS is a secure carbon removal technology.
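A back-of-envelope check of the diffusion-limited transport rate quoted above (my numbers; the effective diffusivity is an assumed order of magnitude for tight, water-saturated caprock): the characteristic diffusion time over a distance L scales as t ≈ L²/D_eff.

```python
SECONDS_PER_YEAR = 3.156e7
D_eff = 1e-11   # m^2/s, assumed effective diffusivity of aqueous CO2 in caprock
L = 1.0         # m
t_years = L**2 / D_eff / SECONDS_PER_YEAR
print(f"~{t_years:,.0f} years to diffuse {L} m")  # ~3,200 years, i.e. several millennia per meter
```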

 

Reference

Kivi, I. R., Makhnenko, R. Y., Oldenburg, C. M., Rutqvist, J., & Vilarrasa, V. (2022). Multi-layered systems for permanent geologic storage of CO2 at the gigatonne scale. Geophysical Research Letters, 49, e2022GL100443. https://doi.org/10.1029/2022GL100443

How to cite: Kivi, I. R., Makhnenko, R. Y., Oldenburg, C. M., Rutqvist, J., and Vilarrasa, V.: A computationally efficient numerical model to understand potential CO2 leakage risk within gigatonne scale geologic storage, Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-6, https://doi.org/10.5194/egusphere-gc11-solidearth-6, 2023.

P7 | GC11-solidearth-24 | ECS
Cláudia Reis, André R. Barbosa, Denis Istrati, Stephane Clain, Rui Ferreira, Luis Matias, and Erin Wirth

Regional and local tsunami sources are a cliché of scientific disaggregation. From the physical perspective, despite emerging studies on cascading hazard and risk, hazard characterization often treats the tsunami as an individual event, without addressing the effects of the primary hazard (typically a high-magnitude earthquake) that triggered it. Moreover, tsunami effects are partitioned into single processes, such as hydraulic effects or induced effects like debris transport, a representative approach often assumed when treating complex phenomena. From a technical perspective, describing cascading hazards and translating them into a composite loading pattern for natural and built environments is challenging, and the difficulty increases exponentially when fluid-soil interactions are considered. From a modeling perspective, physical and numerical simulations are employed to complement scarce databases of extreme tsunami events. However, the level of modeling sophistication deemed necessary to reproduce such complex phenomena is elevated, and there are uncertainties associated with natural phenomena and their modelling, ranging from the genesis of the tsunami to the structural and community response. The number and influencing potential of these uncertainties pose an extraordinary concern when developing mitigation measures. From a risk management perspective, cascading natural and anthropogenic hazards constitute a challenge for combining safety requirements with financial, social, and ecological concerns. Risk management can benefit from strengthening the ties between natural hazards and engineering practitioners, linking science and industry, and promoting dialogue between risk analysts and policy-makers.

Ultimately, risk management requires heterogeneous data and information from real and synthetic origins. Yet, the quality of data used for risk management may often depend on the computational resources (in terms of performance, energy, and storage capacity) needed to simulate complex multi-scale and multi-physics phenomena, as well as to analyze large data sets. For example, the quality of the numerical solutions is often dependent on the amount of data used to calibrate the models, and the runtime of the models needs to be aligned with time constraints (e.g., faster-than-real-time tsunami simulations for early warning systems). The North American platform Hazus is capable of producing risk maps. In European risk assessment, there is a lack of integration and interaction of results from the GEM and SERA, and TSUMAPS-NEAM projects, intended to develop seismic and tsunami hazard studies, respectively. Computational modeling aids in the advancement of scientific knowledge by aggregating the numerous factors involved and translating them into tsunami risk management policies.

A global trend in geosciences and engineering is to develop sophisticated numerical schemes and to build computational facilities that can solve them, thereby aiming to reduce uncertainty levels and preparing the scientific (r)-evolution for the so-called Exascale Era. The present work aims to gather multidisciplinary perspectives on a discussion about: 1) challenges to overcome on tsunami risk management, such as sophistication of earthquake and tsunami numerical schemes; 2) uncertainty-awareness and future needs to develop unanimous and systematic measures to reduce uncertainties associated with geophysical and engineering processes; 3) pros and cons of using HPC resources towards safety and operational performance levels; and 4) applicability to critical infrastructures.

How to cite: Reis, C., Barbosa, A. R., Istrati, D., Clain, S., Ferreira, R., Matias, L., and Wirth, E.: Tsunami risk management in the Exascale Era: Global advances and the European standpoint, Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-24, https://doi.org/10.5194/egusphere-gc11-solidearth-24, 2023.

P8 | GC11-solidearth-11 | ECS
Juan Francisco Rodríguez Gálvez, Jorge Macías Sánchez, Manuel Jesús Castro Díaz, Marc de la Asunción, and Carlos Sánchez-Linares

Operational Tsunami Early Warning Systems (TEWS) are crucial for mitigating and greatly reducing the impact of tsunamis on coastal communities worldwide. In the North-East Atlantic, the Mediterranean, and connected Seas (NEAM) region, these systems have historically utilized Decision Matrices for this purpose. The very short time between tsunami generation and landfall in this region makes it extremely challenging to use real-time simulations to produce more precise alert levels, and the only way to include a computational component in the alert was to use precomputed databases. Nevertheless, in recent years, computing times for a single scenario have been progressively reduced to a few minutes or even seconds, depending on the computational resources available. In particular, the EDANYA group at the University of Málaga, Spain, has focused on this topic and developed the GPU code Tsunami-HySEA for Faster Than Real Time (FTRT) tsunami simulations. This code has been implemented and tested in the TEWS of several countries (such as Spain, Italy, and Chile) and has undergone extensive testing, verification, and validation.

In this study, we propose the use of neural networks (NN) to predict the maximum height and arrival time of tsunamis in the context of TEWS. The advantage of this approach is that the inference time required is negligible (less than one second) and that inference can be done on a simple laptop. This makes it possible to account for uncertain input information in the data and still provide results within seconds. As tsunamis are rare events, numerical simulations using Tsunami-HySEA are used to train the NN model. This part of the workflow requires producing a large number of simulations, for which large HPC computational resources must be used. We utilized the Tsunami-HySEA code and the Spanish Network for Supercomputing (RES) to generate the numerical results used to train the neural networks.

Machine learning (ML) techniques have gained widespread adoption and are being applied in all areas of research, including tsunami modeling. In this work, we employ Multi-Layer Perceptron (MLP) neural networks to forecast the maximum height and arrival time of tsunamis at specific locations along the Chipiona-Cádiz coast in Southwestern Spain. Initially, several individual models are trained, and we show that they provide accurate results. Then, ensemble techniques, which combine multiple single models in order to reduce variance, are explored. The ensemble models often produce improved predictions.

The proposed methodology is tested for tsunamis generated by earthquakes on the Horseshoe fault. The goal is to develop a neural network (NN) model for predicting the maximum height and arrival time of such tsunamis at multiple coastal locations simultaneously. The results of our analysis show that deep learning is a promising approach for this task. The proposed NN models produce errors of less than 6 cm for the maximum wave height and less than 212 s for the arrival time for tsunamis generated on the Horseshoe fault in the Northeastern Atlantic.
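A minimal sketch of the ensemble-averaging idea (illustrative only; the architecture, features, and data below are placeholders, not the models trained in the study):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(size=(2000, 5))                  # source parameters (Mw, location, ...)
y = X @ rng.uniform(size=5) + 0.05 * rng.normal(size=2000)  # stand-in for max height

# Train several MLPs that differ only in their random initialization ...
ensemble = [MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                         random_state=seed).fit(X, y) for seed in range(5)]
# ... and average their outputs to reduce prediction variance
y_pred = np.mean([m.predict(X) for m in ensemble], axis=0)
```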

How to cite: Rodríguez Gálvez, J. F., Macías Sánchez, J., Castro Díaz, M. J., de la Asunción, M., and Sánchez-Linares, C.: Combining High-Performance Computing and Neural Networks for Tsunami Maximum Height and Arrival Time Forecasts, Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-11, https://doi.org/10.5194/egusphere-gc11-solidearth-11, 2023.

P9 | GC11-solidearth-13
Manuel Stocchi, Silvia Massaro, Beatriz Martínez Montesinos, Laura Sandri, Jacopo Selva, Roberto Sulpizio, Biagio Giaccio, Massimiliano Moscatelli, Edoardo Peronace, Marco Nocentini, Roberto Isaia, Manuel Titos Luzón, Pierfrancesco Dellino, Giuseppe Naso, and Antonio Costa

The creation of hazard maps for volcanic phenomena requires taking into account the intrinsic complexity and variability of eruptions. Here we present an example of how HPC can enable a high-resolution, multi-source probabilistic hazard assessment for tephra fallout over a domain covering Southern Italy.

The three active volcanoes in the Neapolitan area, Somma-Vesuvius, Campi Flegrei, and Ischia, were considered as volcanic sources. We explored three explosive size classes (Small, Medium, and Large) for Somma-Vesuvius and Campi Flegrei, and one explosive size class (Large) for Ischia. For each size class, we performed 1500 numerical simulations of ash dispersion (10,500 in total) using the Fall3D (V8.0) model over a computational domain covering Southern Italy with a 0.03° × 0.03° (~3 km × 3 km) resolution. Within each size class, the eruptive parameters were randomly sampled from well-suited probability distributions, under different meteorological conditions obtained by randomly sampling a day between 1990 and 2020 and retrieving the corresponding data from the ECMWF ERA5 database. This allowed us to explore the intra-class variability and to quantify the aleatoric uncertainty. The results of these simulations were post-processed with a statistical approach by assigning a weight to each eruption (based on its eruption magnitude) and using the annual eruption rate of each size class. For the case of Campi Flegrei, the variability in the eruptive vent position was also explored by constructing a grid of possible vent locations with different spatial probabilities. By merging the results obtained for each source and size class, we produced a portfolio of hazard maps showing the expected mean annual frequency of exceeding selected thresholds in ground tephra load. A disaggregation analysis was also performed in order to understand which particular source and/or size class had the greatest impact on a particular area.
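A sketch of the aggregation step as I read it (weights, rates, and the toy grid are placeholders): the mean annual frequency of exceeding a tephra-load threshold is the rate-weighted probability of exceedance, summed over sources and size classes.

```python
import numpy as np

thresholds = np.array([1.0, 10.0, 100.0])        # kPa of ground tephra load
annual_freq = np.zeros((3, 50, 50))              # one map per threshold (toy 50x50 grid)

size_classes = [("Small", 1e-2), ("Medium", 3e-3), ("Large", 3e-4)]  # assumed rates (1/yr)
rng = np.random.default_rng(1)
for name, rate in size_classes:
    loads = rng.lognormal(0, 2, size=(1500, 50, 50))   # stand-in for 1500 Fall3D runs
    weights = np.full(1500, 1.0 / 1500)                # could depend on eruption magnitude
    for i, thr in enumerate(thresholds):
        prob_exceed = np.tensordot(weights, loads > thr, axes=1)  # P(load > thr) per cell
        annual_freq[i] += rate * prob_exceed
```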

The completion of this work, considering both the numerical simulations and the statistical elaboration of the results, required a total of more than 5000 core hours and the processing of more than 2 TB of data, an effort that would not have been possible without access to high-level HPC resources.

How to cite: Stocchi, M., Massaro, S., Martínez Montesinos, B., Sandri, L., Selva, J., Sulpizio, R., Giaccio, B., Moscatelli, M., Peronace, E., Nocentini, M., Isaia, R., Titos Luzón, M., Dellino, P., Naso, G., and Costa, A.: Ash fallout long term probabilistic volcanic hazard assessment for Neapolitan volcanoes: an example of what Earth Scientists can do with HPC resources, Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-13, https://doi.org/10.5194/egusphere-gc11-solidearth-13, 2023.

P10 | GC11-solidearth-17 | ECS
Deepak Garg, Paolo Papale, Antonella Longo, and Chiara Montagna

We present GALES, a versatile, open-source, FEM-based multi-physics numerical code for volcanic and general-purpose problems. The code is developed for and applied to a suite of problems in magma and volcano dynamics. The software is written in modern C++ and is parallelized using OpenMPI and the Trilinos libraries. GALES comprises several advanced solvers for 2D and 3D problems dealing with heat transfer; compressible to incompressible mono- and multi-fluid flows in Eulerian and Arbitrary Lagrangian-Eulerian (ALE) formulations; elastic (static and dynamic) deformation of solids; and fluid-solid interaction. The fluid solvers account for both Newtonian and non-Newtonian rheologies, and all solvers handle transient as well as steady problems. Non-linear problems are linearized using Newton's method. All solvers have been thoroughly verified and validated on standard benchmarks. The software is regularly used for high-performance computing (HPC) on our local cluster machines at INGV, Pisa, Italy. Recently, we analyzed the performance of the code in a series of strong-scaling tests conducted on the MareNostrum supercomputer at the Barcelona Supercomputing Centre (BSC) on up to 12,288 cores. The results revealed a computational speedup close to ideal, and above satisfactory levels as long as the element/core ratio is sufficiently large, making GALES an excellent choice for utilizing HPC resources efficiently for complex magma flow and rock dynamics problems.
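For reference, strong-scaling results like these are typically summarized as speedup and parallel efficiency relative to the smallest run; a toy illustration (the timings below are invented, not the MareNostrum measurements):

```python
cores   = [192, 768, 3072, 12288]
runtime = [1000.0, 260.0, 70.0, 22.0]      # seconds per run, hypothetical
for n, t in zip(cores, runtime):
    speedup = runtime[0] / t               # S(N) = T(N0) / T(N)
    efficiency = speedup * cores[0] / n    # E(N) = S(N) * N0 / N
    print(f"{n:6d} cores: speedup {speedup:6.1f}, efficiency {efficiency:4.2f}")
```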

How to cite: Garg, D., Papale, P., Longo, A., and Montagna, C.: GALES: a general-purpose multi-physics FEM code, Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-17, https://doi.org/10.5194/egusphere-gc11-solidearth-17, 2023.

P11 | GC11-solidearth-18
Carlos Sánchez-Linares, Jorge Macías, and Manuel J. Castro Díaz
In this work, we demonstrate the importance of High Performance Computing (HPC) and Urgent Computing (UC) in rapid disaster response to man-made catastrophes. Using the specific case study of a potential dam breach at the Kyiv cistern reservoir in Ukraine, we show how these technologies can be used to assess the potential hazards and impacts of such an event.
 
The Copernicus Emergency Management Service Risk & Recovery Mapping service was activated to derive hazard assessment mapping for the protection of citizens and infrastructure in the event of a potential dam breach. The Kyiv Hydroelectric Power Plant, located upstream of Kyiv, has been the target of a previous rocket attack, and the threat of another, successful attack cannot be ruled out. The potential consequences of a dam breach are severe and include flooding and the erosion and transport of radioactive sediment.
 
To determine the impact of a potential dam breach and predict its effects on citizens and infrastructure, we used the Dambreak-HySEA model, which is part of the HySEA suite. This model is specifically designed to accurately reproduce the evolution of wet/dry fronts in geophysical flows such as river flooding, flooding in rural and urban areas, and dam failures. Developed by the research group EDANYA of the University of Malaga, the model was implemented using Graphics Processing Units (GPUs) with CUDA, resulting in a significant speed-up compared to a traditional CPU implementation.
 
The use of HPC resources, specifically those provided by the Spanish Network for Supercomputing (RES) programme Urgent computing for citizen problems, was crucial in obtaining the results within the 10-day delivery time frame for the four scenarios required by the activation: partial and complete breach of the Kyiv dam, and the same scenarios for the Irpin dam. These simulations were computationally complex, requiring nearly 700 million computational cells per scenario.
 
Overall, this study highlights the importance of utilizing HPC and UC in disaster response and risk management. By utilizing advanced numerical models and computational resources, we can more accurately predict the potential hazards and impacts of man-made catastrophes such as dam breaches, and take necessary measures to protect citizens and infrastructure.

How to cite: Sánchez-Linares, C., Macías, J., and Castro Díaz, M. J.: HPC in Rapid Disaster Response: Numerical simulations for hazard assessment of a potential dam breach of the Kyiv cistern reservoir, Ukraine, Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-18, https://doi.org/10.5194/egusphere-gc11-solidearth-18, 2023.

P12 | GC11-solidearth-19 | ECS
Linus Walter, Francesco Parisio, Qingkai Kong, Sara Hanson-Hedgecock, and Víctor Vilarrasa

Reliable reservoir characterization of the strata, fractures, and hydraulic properties is needed to determine the energy storage capacity of geothermal systems. We apply state-of-the-art Physics-Informed Neural Networks (PINNs) to model subsurface flow in a geothermal reservoir. A PINN can incorporate any physical law that can be described by partial differential equations. We obtain a ground-truth dataset by running a virtual pumping-well test in the numerical code CODE_BRIGHT. This model consists of a low-permeability rock matrix intersected by high-permeability fractures. We approximate the reservoir permeability field with an Artificial Neural Network (ANN). Secondly, we model the fluid pressure evolution with the PINN by informing it with the experimental well-testing data. Since observation wells are sparse in space (only the injection well in our case), we feed the permeability ANN into a hydraulic mass balance equation; the residual of this equation enforces the loss function at random collocation points inside the domain. Our results indicate that the ANN is able to approximate the permeability field even for a high permeability contrast. In addition, the successful interpolation proves that the PINN is a promising method for matching field data with physical laws. In contrast to numerical models, PINNs shift the computational effort toward the training stage, while reducing the resources needed for the forward evaluation. Nevertheless, training a 3D reservoir model can hardly be achieved on an ordinary workstation, since the training data may include several million entries. In addition, computational costs increase with the inclusion of multiphysics processes in the PINN. We plan to prepare the PINN model for training on parallelized GPUs to significantly increase the training speed of the ANNs.
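A minimal PINN sketch of the approach described above (my illustration under simplifying assumptions — a 1D steady Darcy problem with invented data — not the authors' implementation): a pressure network is fitted to sparse well data, while the residual of a mass balance equation, evaluated by automatic differentiation at random collocation points, constrains both the pressure and the permeability networks.

```python
import torch

p_net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(),
                            torch.nn.Linear(32, 32), torch.nn.Tanh(),
                            torch.nn.Linear(32, 1))        # pressure p(x)
k_net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(),
                            torch.nn.Linear(32, 1))        # log-permeability(x)

x_obs = torch.tensor([[0.0]])      # the (single) observation well location
p_obs = torch.tensor([[1.0]])      # measured overpressure, normalized
opt = torch.optim.Adam(list(p_net.parameters()) + list(k_net.parameters()), lr=1e-3)

for step in range(5000):
    opt.zero_grad()
    loss_data = ((p_net(x_obs) - p_obs) ** 2).mean()       # misfit at sparse wells
    # PDE residual of d/dx( k dp/dx ) = 0 at random collocation points
    x = torch.rand(128, 1, requires_grad=True)
    p = p_net(x)
    dp = torch.autograd.grad(p, x, torch.ones_like(p), create_graph=True)[0]
    flux = torch.exp(k_net(x)) * dp
    dflux = torch.autograd.grad(flux, x, torch.ones_like(flux), create_graph=True)[0]
    loss_pde = (dflux ** 2).mean()
    (loss_data + loss_pde).backward()
    opt.step()
```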

How to cite: Walter, L., Parisio, F., Kong, Q., Hanson-Hedgecock, S., and Vilarrasa, V.: Physics-informed Neural Networks to Simulate Subsurface Fluid Flow in Fractured Media, Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-19, https://doi.org/10.5194/egusphere-gc11-solidearth-19, 2023.

P13 | GC11-solidearth-26
Thomas Zwinger, Denis Cohen, and Lasse Koskinen

The Olkiluoto spent nuclear fuel repository in Eurajoki, Finland, is the first one on the planet that will become operational in the foreseeable future. The long-term safety of this repository with respect to future ice-age conditions, and the consequently occurring permafrost and altered groundwater flow conditions, needs to be evaluated. To this end, a Darcy model for saturated aquifer groundwater flow, combined with a heat transfer module accounting for phase change (i.e., freezing) as well as solute transport and bedrock deformation models, has been implemented in the multi-physics Finite Element Method code Elmer. The set of equations is based on continuum thermo-mechanical principles. The application of this newly developed model to Olkiluoto aims to simulate the evolution of permafrost thickness, talik development, and groundwater flow and salinity changes at and around the repository during the next 120,000 years. This is achieved by solving the aforementioned model components in a coupled way in three dimensions on a mesh that discretizes a rectangular block of 8.8 km by 6.8 km, stretching from the surface of Olkiluoto down to a depth of 10 km, where a geothermal heat flux is applied. The horizontal resolution of 30 m by 30 m, in combination with vertical resolutions of down to 10 cm imposed by the thickness of the different, temporally varying soil and rock layers imported from high-resolution data, results in a mesh containing 5 million nodes/elements, on which the system of equations is solved using CSC's HPC cluster Mahti. The high spatial gradients in permeability (e.g., from soil to granitic bedrock) impose numerical challenges for the simulations, which are forced by the RCP 4.5 climate scenario. The investigated time span contains cold periods between AD 47,000 and AD 110,000. Surface conditions are provided using freezing/thawing n-factors based on monthly temperature variations and a wetness index defining varying vegetation conditions. Our scenario run is able to project permafrost development at high spatial resolution and shows a clear impact of permeable soil layers and faults in the bedrock that focus groundwater flow and solute transport.

How to cite: Zwinger, T., Cohen, D., and Koskinen, L.: Coupled permafrost-groundwater simulation applied to a spent fuel nuclear waste repository, Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-26, https://doi.org/10.5194/egusphere-gc11-solidearth-26, 2023.

P14 | GC11-solidearth-27 | ECS
Haiqing Wu, Jian Xie, and Victor Vilarrasa

Fluid injection-induced earthquakes involve a series of complex physical processes. Evaluating these processes at the basin scale requires a large amount of input data and enormous computational capacity to perform risk analysis in near-real time, which remains the most critical challenge in geo-energy related applications. Although current computational tools can achieve good simulations of field-scale problems, they are far from meeting the requirements of basin-scale analysis. Alternatively, we can apply verified analytical solutions of certain processes to speed up the calculations when moving from the field to the basin scale. With this in mind, we adopt analytical solutions for pore pressure diffusion and stress variations due to fluid injection into the reservoir. With the superposition principle, the analytical solutions can address the coupled problem of multiple injection wells at the basin scale. We then assess fault stability and the associated induced seismicity potential using the hydro-mechanical perturbations computed analytically throughout the basin.
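As a hedged sketch of the superposition idea (my illustration; parameter values are placeholders), the overpressure from several injection wells can be obtained by summing the classical line-source (Theis) solution, Δp = Qμ/(4πkh) E1(r²/4Dt), over the wells:

```python
import numpy as np
from scipy.special import exp1   # exponential integral E1 (the Theis well function)

def theis_dp(r, t, Q, k=1e-13, h=100.0, mu=5e-4, D=1.0):
    """Overpressure (Pa) at distance r (m) and time t (s) for injection rate Q (m^3/s)."""
    u = r**2 / (4.0 * D * t)               # D: hydraulic diffusivity (m^2/s)
    return Q * mu / (4.0 * np.pi * k * h) * exp1(u)

wells = [((0.0, 0.0), 0.05), ((3000.0, 0.0), 0.08)]   # (x, y) in m, rate in m^3/s
x_f, y_f, t = 1500.0, 500.0, 3.15e7                   # a fault point, after ~1 year
dp = sum(theis_dp(np.hypot(x_f - x, y_f - y), t, Q) for (x, y), Q in wells)
print(f"Overpressure at the fault: {dp / 1e6:.2f} MPa")
```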

To handle the uncertainty of the geological properties, including the fault and reservoir geometries and the hydraulic and mechanical properties, we perform Monte Carlo simulations to analyze their effects on the induced seismicity potential. Such a comprehensive parametric-space analysis currently represents an insurmountable obstacle for numerical solution, even when computing the problem in parallel. We propose a feasible methodology to mitigate the magnitude of induced seismicity, and even to avoid large earthquakes in subsurface energy-related projects, based on the results obtained both at the field and basin scales. This development will represent a great tool for the risk evaluation of induced earthquakes, not only during site selection, but also throughout the whole lifetime of geo-energy projects.

How to cite: Wu, H., Xie, J., and Vilarrasa, V.: Risk assessment and mitigation of induced seismicity for geo-energy related applications at the basin scale, Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-27, https://doi.org/10.5194/egusphere-gc11-solidearth-27, 2023.

P15 | GC11-solidearth-28 | ECS
Alice Abbate, Manuela Volpe, Fabrizio Romano, Roberto Tonini, Gareth Davies, Roberto Basili, and Stefano Lorito

Local hazard models for evacuation planning should accurately describe the probability of exceeding a certain intensity (e.g., flow depth or current velocity) over a period of years.

Computation-based probabilistic forecasting for earthquake-generated tsunamis deals with tens of thousands to millions of scenarios to be simulated over very large domains and with sufficient spatial resolution of the bathymetry model. The associated high computational cost can be tackled by means of workflows that take advantage of HPC facilities and numerical models specifically designed for multi-GPU architectures.

For the sake of feasibility, Seismic Probabilistic Tsunami Hazard Assessment (S-PTHA) at local scale exploits some approximations in both source and tsunami modeling, but uncertainty quantification is still lacking in the estimates. Here, we propose a possible approach to reduce the computational cost of local-scale S-PTHA, while providing uncertainty quantification.

The algorithm performs an efficient selection of scenarios based on the tsunami impact at the site.

The workflow is designed to take advantage of parallel execution on HPC clusters. Hence, as a first step, the whole ensemble of scenarios is split into a finite number of regions defined by the tectonic regionalization; then, the procedure selects the scenarios contributing most to the hazard at an offshore point (in front of the target site) and for specific intensity levels. Finally, for each intensity level, the totality of synthetic tsunamigenic earthquakes is optimally sampled with replacement in a Monte Carlo Importance Sampling scheme.
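A hedged sketch of the sampling-with-replacement step (illustrative; the rates, the proposal weights, and the stand-in "simulations" are invented):

```python
import numpy as np

rng = np.random.default_rng(42)
n_all, n_sim = 100_000, 500                  # full ensemble vs. affordable simulations
rate = rng.lognormal(-14, 1.5, n_all)        # annual rate of each source scenario
offshore_amp = rng.lognormal(0, 1, n_all)    # precomputed offshore amplitude per scenario

q = rate * offshore_amp                      # proposal favouring impactful scenarios
q /= q.sum()
idx = rng.choice(n_all, size=n_sim, replace=True, p=q)

w = rate[idx] / (n_sim * q[idx])             # importance weight of each sampled scenario
inundation = offshore_amp[idx] * rng.lognormal(0, 0.3, n_sim)  # stand-in for HySEA runs

exceed = inundation > 2.0                    # e.g., a 2 m flow-depth threshold
haz = np.sum(w * exceed)                     # annual exceedance-rate estimate
mc_err = np.sqrt(n_sim) * np.std(w * exceed) # Monte Carlo standard error of the estimate
print(f"rate = {haz:.2e} /yr (MC error {mc_err:.1e})")
```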

The tsunamis potentially triggered by the selected scenarios are explicitly simulated with the GPU-based Tsunami-HySEA nonlinear shallow water code on high-spatial-resolution grids (as fine as 10 m), and the Monte Carlo errors are then propagated to the onshore estimates.

This procedure lessens the computational cost of local S-PTHA by reducing the number of simulations to be conducted, while quantifying the epistemic uncertainties associated with the inundation modeling without appreciable loss of information content.

How to cite: Abbate, A., Volpe, M., Romano, F., Tonini, R., Davies, G., Basili, R., and Lorito, S.: Optimal source selection for local probabilistic tsunami hazard analysis, Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-28, https://doi.org/10.5194/egusphere-gc11-solidearth-28, 2023.

P16 | GC11-solidearth-29
Natalia Zamora, Jorge León, Alejandra Gubler, and Patricio Catalán

Tsunami evacuation planning can be crucial to mitigating the impact on human lives. During evacuation procedures, vertical evacuation can be an effective way to protect people when horizontal evacuation is not feasible. However, it can imply an associated risk if the different scenarios, related not only to the uncertainties of the tsunami phenomena but also to the behavior of people during the evacuation phase, are not considered. For this reason, in recent years tsunami risk management in Chile has incorporated the propagation of uncertainties in each phase of the study of tsunami impacts and the design of evacuation routes. Agent-based models allow coupling tsunami inundation scenarios with people's interactions and decision-making. In this research, thousands of tsunami scenarios are considered to establish tsunami hazard mapping based on flow depths and tsunami arrival times. We chose a worst-case scenario from this database and coupled it with an agent-based model to assess tsunami evacuation in Viña del Mar, Chile. Moreover, we examined an improved situation with the same characteristics but including 11 tsunami vertical-evacuation (TVE) facilities. Our findings show that the tsunami flood might lead to significant human casualties in the worst-case scenario (above 50% of the agents). Nevertheless, including the TVE structures could reduce this number by roughly 10%. Future work will include the propagation of uncertainties in all phases of the evacuation, where HPC will aid in the simulation of agent-based models that require intense computational resources.

How to cite: Zamora, N., León, J., Gubler, A., and Catalán, P.: Tsunami evacuation using an agent-based model in Chile, Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-29, https://doi.org/10.5194/egusphere-gc11-solidearth-29, 2023.

P17 | GC11-solidearth-30 | ECS
Alejandro Gonzalez del Pino, Marta Fernández, Miguel Llorente, Jorge Macías, Julián García-Mayordomo, and Carlos Paredes

Tsunamis are low-probability phenomena with high-risk potential. The lack of field data emphasizes the need to use simulation software to model the potentially devastating effects of a tsunami and to use this information to develop safety measures, sustainable actions, and social resilience for the future. These measures may include, among many others, spatial planning, the design of evacuation routes, or the allocation of economic resources through insurance or other instruments to mitigate tsunami impacts. Our work introduces a Monte Carlo-like method for simulating the potential impact of tsunamis on the Spanish coastlines, specifically in the provinces of Huelva and Cádiz for the Atlantic region, and the Balearic Islands, Ceuta, Melilla, and the eastern Iberian coast for the Mediterranean region. The method introduces a pseudo-probabilistic seismic-triggered tsunami simulation approach, considering a particular selection of active faults with associated probability distributions for some of the source parameters, and a Sobol-sequence-based sampling strategy to generate a synthetic seismic catalogue. All of the roughly 4000 crafted seismic events are simulated over the areas of interest on high-resolution grids (five-meter pixel resolution) using a two-way nested-mesh approach, retrieving the maximum water height, maximum mass flow, and maximum modulus of the velocity at each grid cell. These numerical simulations are computed in a GPU environment, harnessing resources allocated in several high-performance computing (HPC) centres. HPC infrastructures play a crucial role in the computing aspect of the project, as the computational power required to complete full-fledged high-resolution tsunami simulations in a reasonable time is expensive. The numerical database of retrieved variables generated throughout this study offers an excellent foundation for evaluating various tsunami-related hazards and risks.
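A hedged sketch of the Sobol-based catalogue generation (parameter names, ranges, and distributions are illustrative assumptions, not the project's actual fault parametrization):

```python
from scipy.stats import qmc

sobol = qmc.Sobol(d=3, scramble=True, seed=7)
u = sobol.random_base2(m=12)            # 2^12 = 4096 low-discrepancy points in [0,1)^3

# Map unit samples to physical ranges: magnitude, rake (deg), slip scaling factor
l_bounds, u_bounds = [6.0, 60.0, 0.5], [8.7, 120.0, 1.5]
catalogue = qmc.scale(u, l_bounds, u_bounds)
print(catalogue[:3])                    # each row is one synthetic earthquake scenario
```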

The final product focuses on generating frequency distributions of the economic impacts for the Spanish insurance sector (Consorcio de Compensación de Seguros, CCS). The CCS is a public-private entity insuring most natural catastrophic events in Spain. A consistent, spatially distributed economic database of insured building-related values has been constructed and aggregated in conjunction with the numerical tsunami simulations. The proposed procedure makes it possible to associate an economic impact indicator with each source. Further statistical analysis of the economic impact estimators yields varied conclusions, such as an improved definition of the worst-case scenario (effect-based rather than worst-triggered), the most and least likely economic impacts, and the most hazardous fault sources both overall and locally, among many others.
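As a hedged sketch of how such frequency distributions can be assembled (the losses and annual rates below are synthetic placeholders, not CCS data): once each simulated source carries an economic impact indicator and an occurrence rate, a loss exceedance curve follows from sorting and accumulating.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder inputs: per-event economic loss (EUR) and annual occurrence
# rate, both synthetic stand-ins for catalogue-derived values.
losses = rng.lognormal(mean=16, sigma=1.5, size=4096)
rates = rng.uniform(1e-5, 1e-3, size=4096)

# Exceedance curve: sort events by loss (descending) and accumulate their
# rates, giving the annual frequency of exceeding each loss level.
order = np.argsort(losses)[::-1]
curve_loss = losses[order]
curve_rate = np.cumsum(rates[order])

# Example queries on the curve.
threshold = 1e8
print(f"annual rate of losses > 100 MEUR: {rates[losses > threshold].sum():.2e}")
idx = np.searchsorted(curve_rate, 1e-2)   # 100-year return period
print(f"loss exceeded about once per century: {curve_loss[idx]:,.0f} EUR")
```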

How to cite: Gonzalez del Pino, A., Fernández, M., Llorente, M., Macías, J., García-Mayordomo, J., and Paredes, C.: Exhaustive High-Performance Computing utilization in the estimation of the economic impact of tsunamis on Spanish coastlines, Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-30, https://doi.org/10.5194/egusphere-gc11-solidearth-30, 2023.

P18
|
GC11-solidearth-32
Victor Vilarrasa, Haiqing Wu, Iman Vaezi, Sri Kalyan Tangirala, Auregan Boyet, Silvia De Simone, and Iman R. Kivi

Geo-energies, such as geothermal energy, geologic carbon storage, and subsurface energy storage, will play a relevant role in reaching carbon neutrality and enabling net carbon removal towards mid-century. Geo-energies imply fluid injection into and/or production from the subsurface, which alters the initial effective stress state and may destabilize fractures and faults, thereby inducing seismicity. Understanding the processes that control induced seismicity is paramount to developing reliable forecasting tools to manage induced earthquakes and keep them below undesired levels. Accurately modeling the processes that occur during fracture/fault slip leading to induced seismicity is challenging because coupled thermo-hydro-mechanical-chemical (THMC) processes interact with each other: (1) fluid injection causes pore pressure buildup that changes the total stress and deforms the rock; (2) deformation leads to permeability changes that affect pore pressure diffusion; (3) fluids reach the injection formation at a colder temperature than that of the rock, which cools down the vicinity of the well, causing changes in the fluid properties (density, viscosity, enthalpy, heat capacity) and cooling-induced stress reduction; and (4) the injected fluids are not in chemical equilibrium with the host rock, leading to geochemical reactions of mineral dissolution/precipitation that may alter rock properties, in particular the shear strength. In the framework of GEoREST (www.georest.eu), a Starting Grant from the European Research Council (ERC), we aim to develop forecasting tools for injection-induced seismicity through methodologies that efficiently simulate the coupled THMC processes occurring as a result of fluid injection, which allows us to improve our understanding of the mechanisms that trigger induced seismicity. To this end, we use CODE_BRIGHT, a fully coupled finite element software whose capabilities include friction following the Mohr-Coulomb failure criterion with strain weakening and dilatancy, enabling simulations of fracture/fault reactivation. Our investigations have already contributed to understanding the processes that induced the seismicity at the Enhanced Geothermal System (EGS) at Basel, Switzerland, and at the Castor Underground Gas Storage, Spain, as well as the reservoir-induced seismicity at Nova Ponte, Brazil. To achieve scalability and speed up the calculations so as to eventually manage induced seismicity in real time, we intend to incorporate efficient state-of-the-art linear solvers, such as HYPRE and PETSc, into CODE_BRIGHT.
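The destabilising role of pore pressure in the Mohr-Coulomb criterion referred to above can be illustrated in a few lines; the friction coefficient, cohesion, and stress values below are invented for the example and are not CODE_BRIGHT output. Injection raises the pore pressure p, reducing the effective normal stress (σn − p) on a fault and moving the stress state toward failure.

```python
# Minimal Mohr-Coulomb stability check on a fault plane (illustrative values).
MU = 0.6          # friction coefficient (assumed)
COHESION = 1.0e6  # cohesion (Pa, assumed)

def coulomb_failure_stress(tau, sigma_n, p):
    """CFS = tau - c - mu * (sigma_n - p); slip is possible when CFS >= 0."""
    return tau - COHESION - MU * (sigma_n - p)

sigma_n = 40e6   # total normal stress on the fault (Pa)
tau = 18e6       # shear stress on the fault (Pa)

for p in (10e6, 15e6, 20e6):   # pore pressure rising during injection
    cfs = coulomb_failure_stress(tau, sigma_n, p)
    status = "unstable (slip)" if cfs >= 0 else "stable"
    print(f"p = {p/1e6:4.0f} MPa -> CFS = {cfs/1e6:6.2f} MPa ({status})")
```

In the coupled THMC setting, thermal stresses and chemically altered shear strength enter the same criterion, which is why the interactions listed above matter for forecasting.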

How to cite: Vilarrasa, V., Wu, H., Vaezi, I., Tangirala, S. K., Boyet, A., De Simone, S., and Kivi, I. R.: Numerical simulation of injection-induced seismicity, Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-32, https://doi.org/10.5194/egusphere-gc11-solidearth-32, 2023.

P19
|
GC11-solidearth-33
Leonardo Mingari, Arnau Folch, Alejandra Guerrero, Sara Barsotti, Talfan Barnie, Giovanni Macedonio, and Antonio Costa

A Digital Twin Component (DTC) provides users with digital replicas of different components of the Earth system through unified frameworks integrating real-time observations and state-of-the-art numerical models. Scenarios of extreme events for natural hazards can be studied from genesis to propagation and impact using a single DTC or multiple coupled DTCs. The EU DT-GEO project (2022-2025) is implementing a prototype digital twin on geophysical extremes consisting of 12 interrelated Digital Twin Components, intended as self-contained and containerised software entities embedding numerical model codes, management of real-time data streams, and data assimilation methodologies. DTCs can be deployed and executed in centralized High Performance Computing (HPC) and cloud computing Research Infrastructures (RIs). In particular, DTC-V2 is implementing an ensemble-based automated operational system for deterministic and probabilistic forecasts of long-range ash dispersal and local-scale tephra fallout. The system continuously screens different ground-based and satellite-based data sources, and a workflow is automatically triggered by a volcanic eruption to stream and pre-process data, ingest them into the FALL3D dispersal model, execute the model on centralized or distributed HPC resources, and post-process the results. The DTCs will provide capabilities for analyses, forecasts, uncertainty quantification, and "what if" scenarios for natural and anthropogenic hazards, with a long-term ambition towards the Destination Earth mission-like initiative.
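A minimal sketch of such an event-triggered workflow is shown below; the function names and three-stage decomposition are hypothetical simplifications of the operational DTC-V2 pipeline, intended only to show the control flow from monitoring to triggered execution.

```python
import time

def poll_alert_sources():
    """Placeholder: query ground-based and satellite-based streams for eruption alerts."""
    # e.g. parse volcano observatory notices or satellite hotspot feeds
    return None  # no alert raised in this sketch

def run_stage(name, payload):
    """Placeholder for one containerised workflow stage."""
    print(f"running stage: {name}")
    return payload

def on_eruption(alert):
    """Workflow launched when a volcanic eruption alert is raised."""
    data = run_stage("stream-and-preprocess", alert)
    data = run_stage("fall3d-ensemble-run", data)   # centralized/distributed HPC execution
    run_stage("postprocess-and-publish", data)

# Bounded polling loop for the sketch; an operational system runs continuously.
for _ in range(3):
    alert = poll_alert_sources()
    if alert is not None:
        on_eruption(alert)
    time.sleep(1)   # short interval for the sketch; minutes in practice
```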

How to cite: Mingari, L., Folch, A., Guerrero, A., Barsotti, S., Barnie, T., Macedonio, G., and Costa, A.: A digital twin component for volcanic dispersal and fallout, Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-33, https://doi.org/10.5194/egusphere-gc11-solidearth-33, 2023.

P20
|
GC11-solidearth-36
|
ECS
Andrea C. Riaño, Juan C. Reyes, Jacobo Bielak, Doriam Restrepo, Ricardo Taborda, and Luis E. Yamin

The basin beneath the greater metropolitan area of Bogotá, Colombia, consists of soft material deposits with shear wave velocity Vs ≤ 400 m/s that reach depths of up to 425 m. Located on a high plateau in the eastern cordillera of the Colombian Andes, this highly populated urban area is subject to significant seismic hazards from local and regional fault systems. The potential ground motion amplification during earthquakes due to the presence of soft soil deposits and the surface and sub-surface topography is of great importance for better understanding and estimating the seismic risk of the city. Given the scarcity of seismic data from large-magnitude events, and in an effort to advance modern seismic hazard mapping for the region, this study aimed to develop a physics-based framework to generate synthetic ground records that can help to better understand basin and other amplification effects during strong earthquake shaking in the region, and then to incorporate these effects into the estimation of seismic risk. To this end, a set of simulations was first conducted with Hercules, the wave propagation octree-based finite element simulator developed by the Quake Group at Carnegie Mellon University, to reproduce conditions similar to those observed in Bogotá during past seismic events (e.g., the 2008 Quetame earthquake) and to identify the impacts of hypothetical strong-earthquake scenarios. The results from these simulations were then integrated into a new software package for post-processing and assessing the seismic risk in the Bogotá region for the different scenarios selected.
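One common way to quantify basin amplification of the kind targeted here is the spectral ratio between a soft-soil site and a reference rock site. The sketch below applies this to synthetic records, not Hercules output; the record lengths, noise model, and 3x amplification are placeholders.

```python
import numpy as np

rng = np.random.default_rng(7)
dt = 0.01                                  # sampling interval (s)
t = np.arange(0, 60, dt)

# Synthetic stand-ins for simulated ground motions (m/s^2).
rock = rng.normal(0, 0.05, t.size)         # reference rock-site record
basin = 3.0 * rng.normal(0, 0.05, t.size)  # amplified soft-soil record

def smoothed_amplitude_spectrum(a, dt, width=20):
    """Fourier amplitude spectrum, lightly smoothed with a moving average."""
    spec = np.abs(np.fft.rfft(a)) * dt
    kernel = np.ones(width) / width
    return np.convolve(spec, kernel, mode="same")

freq = np.fft.rfftfreq(t.size, dt)
ratio = smoothed_amplitude_spectrum(basin, dt) / smoothed_amplitude_spectrum(rock, dt)

# Report amplification in the 0.5-2 Hz band, typical of deep soft basins.
band = (freq >= 0.5) & (freq <= 2.0)
print(f"mean spectral amplification (0.5-2 Hz): {ratio[band].mean():.1f}")
```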

How to cite: Riaño, A. C., Reyes, J. C., Bielak, J., Restrepo, D., Taborda, R., and Yamin, L. E.: Integrating 3D physics-based earthquake simulations to seismic risk assessment: The case of Bogotá, Colombia., Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-36, https://doi.org/10.5194/egusphere-gc11-solidearth-36, 2023.

P21
|
GC11-solidearth-46
|
ECS
Ebissa Kedir, Chandra Ojha, and Hari Prasad

In the present study, the depth-averaged velocity and boundary shear stress in non-prismatic compound channels with three different converging floodplain angles, ranging from 1.43° to 7.59°, have been investigated. Analytical solutions were derived by considering the forces acting on the channel bed and walls. Five key parameters were considered and discussed: the non-dimensional coefficient, the secondary flow term, the secondary flow coefficient, the friction factor, and the dimensionless eddy viscosity. A new expression for the non-dimensional coefficient and the integration constants were derived based on novel boundary conditions. The model was applied to data sets from the present experiments and from experiments reported by other sources to examine and analyse the influence of floodplain converging angles on the depth-averaged velocity and boundary shear stress distributions. The results show that the non-dimensional parameter plays an important role in portraying the variation of the depth-averaged velocity and boundary shear stress distributions with different floodplain converging angles. Thus, the variation of the non-dimensional coefficient needs attention, since it affects the secondary flow term and secondary flow coefficient in both the main channel and the floodplains. The analysis shows that the depth-averaged velocities are sensitive to the non-dimensional coefficient, a shear-stress-dependent model parameter, and the analytical solutions agree well with the experimental data when all five parameters are included. The developed model may facilitate further work on complex flow modeling.
Keywords: depth-averaged velocity, converging floodplain angles, non-dimensional coefficient, non-prismatic compound channels
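For context, depth-averaged formulations of this family (e.g., Shiono-Knight-type models) commonly relate the boundary shear stress to the local depth-averaged velocity through the friction factor, τ_b = (f/8) ρ U_d². A minimal sketch with assumed parameter values, not the authors' calibrated coefficients:

```python
import numpy as np

RHO = 1000.0    # water density (kg/m^3)
F = 0.02        # Darcy-Weisbach friction factor (assumed)

def boundary_shear_stress(u_d):
    """tau_b = (f/8) * rho * U_d**2 for a depth-averaged velocity U_d (m/s)."""
    return (F / 8.0) * RHO * u_d**2

# Assumed lateral profile of depth-averaged velocity across a compound
# section: faster in the main channel, slower on the converging floodplain.
u_profile = np.array([0.4, 0.7, 1.2, 1.5, 1.4, 0.9, 0.5])   # m/s
for u, tb in zip(u_profile, boundary_shear_stress(u_profile)):
    print(f"U_d = {u:.2f} m/s -> tau_b = {tb:.2f} Pa")
```

The quadratic dependence on U_d explains why the lateral velocity distribution, and hence the parameters governing it, dominates the predicted boundary shear stress pattern.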

How to cite: Kedir, E., Ojha, C., and Prasad, H.: Modeling Depth averaged velocity and Boundary Shear Stress distribution with complex flows, Galileo Conference: Solid Earth and Geohazards in the Exascale Era, Barcelona, Spain, 23–26 May 2023, GC11-solidearth-46, https://doi.org/10.5194/egusphere-gc11-solidearth-46, 2023.