- 1University of Lausanne, IDYST, Lausanne, Switzerland (guillaume.jouvet@unil.ch)
- 2Friedrich-Alexander-University
- 3University of Bristol
Modeling future glacier evolution is traditionally divided into two steps. The first step, known as inverse modeling, involves estimating "hidden" variables (such as distributed ice thickness) so that the model aligns with available observations, thereby reconstructing the full present-day subglacial and supraglacial state. The second step, forward modeling, takes the initial state derived from the inverse model and simulates its evolution under a given climate forcing. Despite advancements in the data available to constrain inverse modeling and improvements in the representation of physical processes, several challenges persist. These include artificial biases and uncertainties, such as the "initial shock" problem, which arises when ice flow and surface mass balance are not physically balanced. These issues stem largely from the difficulty of reconciling hypothesized physical equations with observational data, especially as the volume of available data increases.
Generalized automatic differentiation (the core tool underpinning deep learning) and advances in parallel computing present unprecedented opportunities to address these challenges. By enabling the exploration of large spaces of glacier states, these technologies make it possible to identify states consistent with both ice physics and observations, an approach that is computationally infeasible with traditional data assimilation methods. More specifically, physics-informed deep learning is a powerful tool capable of merging the two essential steps (data assimilation by inverse modeling, and forward modeling) into a single framework. In this framework, observational data and physics are both treated as constraints that can be enforced in the same manner, allowing for the discovery of composite, consistent solutions. Most importantly, the intrinsic structure of neural networks makes them highly efficient for computation on GPUs, as required for global-scale modeling, where traditional CPU-based solvers suffer from computational bottlenecks.
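As a minimal illustration of this idea (not IGM's actual implementation), the sketch below combines a physics residual and an observational misfit into a single loss that is minimized jointly via TensorFlow's automatic differentiation. The grid size, field names, weights, and the toy "physics" residual are all assumptions made for illustration.

```python
import tensorflow as tf

# Hypothetical toy problem: infer a distributed ice thickness field h that both
# fits sparse thickness observations and satisfies a placeholder physics
# constraint. Everything here is illustrative, not IGM's code.

ny, nx = 64, 64
h = tf.Variable(tf.ones((ny, nx)) * 100.0)                            # thickness to infer (m)
obs_mask = tf.cast(tf.random.uniform((ny, nx)) < 0.05, tf.float32)    # sparse observation mask
h_obs = tf.ones((ny, nx)) * 120.0                                     # "observed" thickness where mask == 1

def physics_residual(h):
    # Placeholder standing in for the ice-flow / mass-conservation equations:
    # penalizes rough thickness fields via a discrete Laplacian.
    lap = (tf.roll(h, 1, 0) + tf.roll(h, -1, 0) +
           tf.roll(h, 1, 1) + tf.roll(h, -1, 1) - 4.0 * h)
    return tf.reduce_mean(lap ** 2)

def data_misfit(h):
    # Mean squared misfit to the sparse thickness observations.
    return tf.reduce_sum(obs_mask * (h - h_obs) ** 2) / tf.reduce_sum(obs_mask)

opt = tf.keras.optimizers.Adam(learning_rate=1.0)

for it in range(500):
    with tf.GradientTape() as tape:
        # Physics and data enter the same objective as weighted penalty terms.
        loss = physics_residual(h) + 0.1 * data_misfit(h)
    grads = tape.gradient(loss, [h])
    opt.apply_gradients(zip(grads, [h]))
```

In the same spirit, a more complex physics residual (e.g. an ice-flow emulator) and additional data terms (surface elevation, velocities) could be added to the same objective without changing the optimization machinery.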
This work reviews the latest advancements in the Instructed Glacier Model (IGM), a next-generation glacier evolution model that leverages automatic differentiation and physics-informed deep learning to simulate ice flow and topographic change. IGM, a Python-based framework, integrates ice thermomechanics, surface mass balance, and mass conservation while emphasizing user accessibility, modularity, and reproducibility. The model builds on recent libraries and tools, namely: i) TensorFlow for high computational efficiency on GPUs and effective data assimilation, ii) OGGM for enhanced data accessibility, and iii) Hydra for streamlined configuration management. We demonstrate IGM’s versatility through applications in paleo and contemporary glacier modeling. Finally, we illustrate the added value of merging inverse and forward modeling into a unified framework. This approach reduces uncertainties by bridging the gap between data and physics, addressing the persistent challenge of inferring spatially distributed ice thickness while ensuring consistency with observed ice flow.
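For context on the configuration side, the following is a generic Hydra entry-point sketch; the module names, configuration keys, and overall structure are hypothetical and do not describe IGM's actual interface, only the kind of hierarchical, reproducible configuration that Hydra enables.

```python
import hydra
from omegaconf import DictConfig, OmegaConf

# Generic Hydra entry point: configuration groups (which modules to enable,
# data sources, numerical parameters, ...) live in YAML files under ./conf and
# can be composed or overridden from the command line. The keys referenced in
# the comments below are hypothetical. Expects a conf/config.yaml next to this
# script.

@hydra.main(version_base=None, config_path="conf", config_name="config")
def main(cfg: DictConfig) -> None:
    # Print the fully resolved configuration, which aids reproducibility.
    print(OmegaConf.to_yaml(cfg))
    # A model's module pipeline would then consume entries such as
    # cfg.modules, cfg.climate.forcing, or cfg.numerics.dt (hypothetical names).

if __name__ == "__main__":
    main()
```

With such a setup, command-line overrides (e.g. `python run.py numerics.dt=0.5`) would change individual parameters without editing the configuration files, which is the kind of streamlined configuration management the abstract refers to.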
How to cite: Jouvet, G., Cook, S., Finley, B., Leger, T., and Maussion, F.: Unified forward and inverse glacier modeling with IGM, EGU General Assembly 2025, Vienna, Austria, 27 Apr–2 May 2025, EGU25-7262, https://doi.org/10.5194/egusphere-egu25-7262, 2025.