EGU22-8499
https://doi.org/10.5194/egusphere-egu22-8499
EGU General Assembly 2022
© Author(s) 2022. This work is distributed under
the Creative Commons Attribution 4.0 License.

Matryoshka Neural Operators: Learning Fast PDE Solvers for Multiscale Physics

Björn Lütjens1, Catherine H. Crawford2, Campbell Watson2, Chris Hill3, and Dava Newman1,4
  • 1Human Systems Laboratory, Department of Aeronautics and Astronautics, MIT (lutjens@mit.edu)
  • 2Future of Climate, IBM Research
  • 3Department of Earth, Atmospheric and Planetary Sciences, MIT
  • 4MIT Media Lab

Running a high-resolution global climate model can take multiple days on the world's largest supercomputers. Due to the long runtimes caused by solving the underlying partial differential equations (PDEs), climate researchers struggle to generate the ensemble runs necessary for uncertainty quantification or for exploring climate policy decisions.

Physics-informed neural networks (PINNs) promise a solution: they can solve single instances of PDEs up to three orders of magnitude faster than traditional finite-difference numerical solvers. However, most approaches in physics-informed machine learning learn the solution of PDEs over the full spatio-temporal domain, which requires infeasible amounts of training data, does not exploit knowledge of the underlying large-scale physics, and reduces model trust. Our philosophy is to limit learning to the hard-to-model parts. Hence, we propose a novel method, the matryoshka neural operator, which leverages super-parametrization, an established scheme from geophysical fluid dynamics. Using this scheme, our physics-informed architecture exploits knowledge of the approximate large-scale dynamics and only learns the influence of the small-scale dynamics on the large-scale dynamics, also called a subgrid parametrization.
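
The step below is a minimal sketch of this decomposition: the known large-scale dynamics are advanced by a numerical solver, and a learned model supplies only the subgrid coupling. The function names, the explicit Euler integrator, and the subgrid_model placeholder are illustrative assumptions, not our exact implementation.

```python
# Minimal sketch of a hybrid solver step: the large-scale Lorenz96
# tendency is computed numerically; a learned model supplies only the
# subgrid coupling. Names, parameters, and the explicit Euler
# integrator are illustrative assumptions.
import numpy as np

def large_scale_tendency(X, F=10.0):
    # Standard single-scale Lorenz96 dynamics dX_k/dt
    return np.roll(X, 1) * (np.roll(X, -1) - np.roll(X, 2)) - X + F

def hybrid_step(X, subgrid_model, dt=0.005):
    # Known physics (numerical) + learned subgrid parametrization
    return X + dt * (large_scale_tendency(X) + subgrid_model(X))

# Placeholder standing in for the trained neural operator:
X_next = hybrid_step(np.random.randn(36), subgrid_model=lambda x: -0.5 * x)
```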

Some work in geophysical fluid dynamics is conceptually similar, but relies fully on neural networks, which can only operate on fixed grids (Gentine et al., 2018). We are the first to learn grid-independent subgrid parametrizations by leveraging neural operators, which learn the dynamics in a grid-independent latent space. Neural operators can be seen as an extension of neural networks to infinite dimensions: they encode infinite-dimensional inputs into finite-dimensional representations, such as eigenmodes or Fourier modes, and learn the nonlinear temporal dynamics in the encoded state.
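
The layer below is a minimal sketch of this encode-process-decode idea in the spirit of Fourier neural operators: the input is encoded into its lowest Fourier modes, a learned complex-valued linear map acts per mode, and the result is decoded back onto an arbitrary grid. Class and parameter names are illustrative assumptions, not our exact architecture.

```python
import torch

class SpectralConv1d(torch.nn.Module):
    """Sketch of a Fourier-space operator layer: because the learned
    weights act on Fourier modes rather than grid points, the same
    layer applies to any input resolution."""

    def __init__(self, channels, n_modes):
        super().__init__()
        self.n_modes = n_modes  # number of low-frequency modes retained
        scale = 1.0 / channels
        self.weights = torch.nn.Parameter(
            scale * torch.randn(channels, channels, n_modes,
                                dtype=torch.cfloat))

    def forward(self, x):  # x: (batch, channels, grid)
        x_hat = torch.fft.rfft(x)            # encode: grid -> Fourier modes
        out_hat = torch.zeros_like(x_hat)
        # learned dynamics act only on the retained low modes
        out_hat[:, :, :self.n_modes] = torch.einsum(
            "bim,iom->bom", x_hat[:, :, :self.n_modes], self.weights)
        return torch.fft.irfft(out_hat, n=x.size(-1))  # decode to the grid

layer = SpectralConv1d(channels=4, n_modes=12)
y_coarse = layer(torch.randn(8, 4, 64))   # works on a coarse grid...
y_fine = layer(torch.randn(8, 4, 256))    # ...and, unchanged, on a fine one
```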

We demonstrate the proposed neural operators by learning non-local subgrid parametrizations over the full large-scale domain of the two-scale Lorenz96 equation. We show that the proposed learning-based PDE solver is grid-independent, has quasilinear complexity instead of the quadratic complexity of a fully resolving numerical solver, is more accurate than current neural-network- or polynomial-based parametrizations, and offers interpretability through its Fourier modes.
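
For reference, the two-scale Lorenz96 system in its standard formulation couples each of K slow variables X_k to J fast variables Y_{j,k}; the final term of each equation is the coupling that a subgrid parametrization must approximate (h, c, b, and the forcing F are the usual parameters):

```latex
\begin{aligned}
\frac{dX_k}{dt} &= X_{k-1}\left(X_{k+1} - X_{k-2}\right) - X_k + F
                   - \frac{hc}{b}\sum_{j=1}^{J} Y_{j,k},\\
\frac{dY_{j,k}}{dt} &= -\,cb\,Y_{j+1,k}\left(Y_{j+2,k} - Y_{j-1,k}\right)
                       - c\,Y_{j,k} + \frac{hc}{b}\,X_k.
\end{aligned}
```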

Gentine, P., Pritchard, M., Rasp, S., Reinaudi, G., and Yacalis, G. (2018). Could machine learning break the convection parameterization deadlock? Geophysical Research Letters, 45, 5742–5751. https://doi.org/10.1029/2018GL078202

How to cite: Lütjens, B., Crawford, C. H., Watson, C., Hill, C., and Newman, D.: Matryoshka Neural Operators: Learning Fast PDE Solvers for Multiscale Physics, EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-8499, https://doi.org/10.5194/egusphere-egu22-8499, 2022.