EGU22-5739
https://doi.org/10.5194/egusphere-egu22-5739
EGU General Assembly 2022
© Author(s) 2022. This work is distributed under
the Creative Commons Attribution 4.0 License.

Deep learning for surrogate modeling of two-dimensional mantle convection

Siddhant Agarwal1,2,3, Nicola Tosi1, Pan Kessel2, Doris Breuer1, and Grégoire Montavon2
  • 1German Aerospace Center (DLR), Planetary Physics, Berlin, Germany
  • 2Berlin Institute of Technology, Machine Learning Group, Berlin, Germany
  • 3Helmholtz Einstein International Berlin Research School in Data Science (HEIBRiDS)

Mantle convection plays a fundamental role in the long-term thermal evolution of terrestrial planets like Earth, Mars, Mercury and Venus. The buoyancy-driven creeping flow of silicate rocks in the mantle is modeled as a highly viscous fluid over geological time scales and quantified using partial differential equations (PDEs) for the conservation of mass, momentum and energy. Yet, key parameters and initial conditions of these PDEs are poorly constrained, and constraining them against observational data often requires extensive sampling of the parameter space. Since running hundreds of thousands of forward models in 2D or 3D is computationally infeasible, several alternatives have been proposed.

The traditional alternative to high-fidelity simulations has been to use 1D models based on scaling laws. While computationally efficient, these are limited in the physics they can capture (e.g., depth-dependent material properties) and predict only mean quantities such as the mean mantle temperature. Hence, there has been growing interest in using machine learning to build more advanced surrogate models. For example, Agarwal et al. (2020) used feedforward neural networks (FNNs) to reliably predict the evolution in time of the entire 1D laterally averaged temperature profile from five parameters: reference viscosity, enrichment factor of the crust in heat-producing elements, initial mantle temperature, and activation energy and activation volume of diffusion creep.
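Schematically, such a parameterized surrogate is a function mapping the five scalar parameters plus time to a radial temperature profile. The following NumPy sketch illustrates the idea only; the layer sizes, the two-layer architecture, and the random weights are assumptions for illustration, not the trained network from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative shapes: 5 physical parameters + 1 time input -> 100-point radial profile.
N_IN, N_HIDDEN, N_OUT = 6, 64, 100

# Random weights stand in for a trained network.
W1 = rng.standard_normal((N_IN, N_HIDDEN)) * 0.1
b1 = np.zeros(N_HIDDEN)
W2 = rng.standard_normal((N_HIDDEN, N_OUT)) * 0.1
b2 = np.zeros(N_OUT)

def surrogate_profile(params, t):
    """Feedforward pass: (5 parameters, time) -> 1D laterally averaged temperature profile."""
    x = np.concatenate([params, [t]])
    h = np.tanh(x @ W1 + b1)   # hidden layer with tanh activation
    return h @ W2 + b2         # linear output: temperature at 100 depth points

# Example inputs (normalized, hypothetical values): reference viscosity, crustal
# enrichment factor, initial temperature, activation energy, activation volume, time.
profile = surrogate_profile(np.array([0.2, 0.5, 0.8, 0.3, 0.6]), 0.5)
print(profile.shape)  # (100,)
```

Once trained, evaluating such a network takes microseconds, which is what makes large parameter-space sampling tractable.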

We extend that study to predict the full 2D temperature field, which contains more information in the form of convection structures such as hot plumes and cold downwellings. This is achieved by training deep learning algorithms on a data set of 10,525 2D simulations of the thermal evolution of the mantle of a Mars-like planet. First, we use convolutional autoencoders to compress each temperature field by a factor of 142. Second, we compare two algorithms for predicting the compressed (latent) temperature fields: FNNs and long short-term memory networks (LSTMs). On the one hand, the FNN predictions are slightly more accurate on unseen simulations (99.30% vs. 99.22% for the LSTM). On the other hand, proper orthogonal decomposition (POD) of the LSTM and FNN predictions shows that, despite a lower mean relative accuracy, LSTMs capture the flow dynamics better than FNNs: the POD coefficients of the FNN predictions sum up to 96.51% of the coefficients of the original simulations, while for the LSTMs this metric rises to 97.66%.
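The POD comparison can be sketched as follows: take the singular value decomposition of the reference snapshot matrix, project both the original and the surrogate fields onto the resulting modes, and compare the total magnitudes of the coefficients. The snapshot data below is synthetic and the exact normalization is an assumption; the study computes this metric on the actual simulation outputs:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins: rows are time snapshots, columns are flattened 2D fields.
n_snapshots, n_cells = 50, 400
truth = rng.standard_normal((n_snapshots, n_cells))
# A surrogate prediction modeled as the truth plus a small error.
pred = truth + 0.05 * rng.standard_normal((n_snapshots, n_cells))

# POD modes of the reference data via thin SVD of the snapshot matrix.
_, _, modes = np.linalg.svd(truth, full_matrices=False)  # rows of `modes` are POD modes

# Project both data sets onto the reference modes to obtain POD coefficients.
coeff_truth = truth @ modes.T
coeff_pred = pred @ modes.T

# Total coefficient magnitude of the prediction relative to the truth,
# analogous to the percentage metric quoted above.
ratio = np.abs(coeff_pred).sum() / np.abs(coeff_truth).sum()
print(f"{100 * ratio:.2f}%")
```

A ratio close to 100% indicates that the surrogate distributes energy across the flow modes much like the original simulations do, which is why this metric complements a plain pointwise accuracy.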

We conclude the talk by discussing strengths and weaknesses of this approach, and by highlighting ongoing research in the broader field of fluid dynamics that could help increase the accuracy and efficiency of such parameterized surrogate models.

How to cite: Agarwal, S., Tosi, N., Kessel, P., Breuer, D., and Montavon, G.: Deep learning for surrogate modeling of two-dimensional mantle convection, EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-5739, https://doi.org/10.5194/egusphere-egu22-5739, 2022.