EGU26-15780, updated on 14 Mar 2026
https://doi.org/10.5194/egusphere-egu26-15780
EGU General Assembly 2026
© Author(s) 2026. This work is distributed under
the Creative Commons Attribution 4.0 License.
Oral | Wednesday, 06 May, 12:10–12:20 (CEST)
Room 2.31
A Structure-Preserving Neural Flux Surrogate for Efficient Shallow-Water Modelling
Jingxiao Wu, Qiuhua Liang, and Huili Chen
  • School of Architecture, Building and Civil Engineering, Loughborough University, Loughborough, United Kingdom (j.wu@lboro.ac.uk)

High-resolution shallow-water (SW) models are critical for flood and inundation forecasting, yet their operational efficiency is often bottlenecked by the computational cost of repeatedly solving intercell Riemann problems. While emerging machine-learning surrogates (e.g., PINNs and neural operators) can accelerate PDE prediction, they often struggle to meet the rigorous requirements of hydrodynamic modelling. Specifically, these end-to-end models generally enforce physics only via soft constraints, leading to non-physical mass leakage and error accumulation over long durations. They also suffer from spectral bias, which hinders the sharp capture of discontinuities and wet–dry fronts. Furthermore, they typically lack cross-geometry generalizability, requiring costly retraining when boundary conditions or mesh resolutions change.

This study proposes a structure-preserving hybrid strategy that integrates deep learning into a classical Godunov-type finite-volume (FV) solver. Rather than approximating the global solution map, we employ a neural network as a local, plug-in surrogate specifically for intercell flux evaluation. This network learns a discretization-aware operator, mapping local reconstructed interface states to normal-aligned numerical fluxes. Crucially, by embedding this learned surrogate within the standard FV backbone—retaining CFL-controlled time marching and wetting–drying treatments—the hybrid solver strictly enforces mass conservation through rigorous flux-difference assembly.

Because the model learns local interface physics rather than global flow patterns, it exhibits strong cross-resolution generalization: a model trained on a specific grid can be deployed directly on different mesh densities and unseen initial conditions without retraining.
This work establishes a scalable pathway for integrating deep learning into hydrodynamic solvers, combining the computational speed of machine learning with the reliability and conservation properties of classical numerical methods.
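The conservation guarantee described above follows from the flux-difference assembly itself: each interface flux is added to one cell and subtracted from its neighbour, so total mass is preserved regardless of how the flux is computed. A minimal 1-D sketch of this hybrid structure is given below; the `surrogate_flux` function here uses a simple Rusanov flux as a stand-in for the trained network (all names and parameters are illustrative, not from the paper):

```python
import numpy as np

def surrogate_flux(uL, uR, g=9.81):
    # Stand-in for the learned flux operator: maps reconstructed
    # interface states (h, hu) on each side to a numerical flux.
    # Here a Rusanov (local Lax-Friedrichs) flux plays that role;
    # the trained network would share this signature.
    def phys(u):
        h, hu = u
        v = hu / max(h, 1e-8)
        return np.array([hu, hu * v + 0.5 * g * h**2])
    hL, huL = uL
    hR, huR = uR
    cL = abs(huL / max(hL, 1e-8)) + np.sqrt(g * hL)
    cR = abs(huR / max(hR, 1e-8)) + np.sqrt(g * hR)
    s = max(cL, cR)
    return 0.5 * (phys(uL) + phys(uR)) - 0.5 * s * (uR - uL)

def step(U, dx, cfl=0.4, g=9.81):
    # CFL-controlled time step from the fastest local wave speed.
    h, hu = U[:, 0], U[:, 1]
    c = np.abs(hu / np.maximum(h, 1e-8)) + np.sqrt(g * h)
    dt = cfl * dx / c.max()
    # One flux per interface (periodic grid). In the update each
    # flux appears with opposite signs in the two adjacent cells,
    # so the sum over all cells (total mass) is unchanged no matter
    # what surrogate_flux returns.
    n = len(U)
    F = np.array([surrogate_flux(U[i], U[(i + 1) % n]) for i in range(n)])
    return U - dt / dx * (F - np.roll(F, 1, axis=0)), dt

# Dam-break-like initial state on a periodic 1-D grid.
N, dx = 200, 0.01
U = np.zeros((N, 2))
U[:, 0] = np.where(np.arange(N) < N // 2, 2.0, 1.0)  # depth h
mass0 = U[:, 0].sum() * dx
for _ in range(50):
    U, dt = step(U, dx)
print(abs(U[:, 0].sum() * dx - mass0))  # machine-epsilon level
```

Swapping the placeholder for a trained network leaves the conservation property intact, since it depends only on the assembly, not on the flux model; this is the structural point the abstract emphasises.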

How to cite: Wu, J., Liang, Q., and Chen, H.: A Structure-Preserving Neural Flux Surrogate for Efficient Shallow-Water Modelling, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-15780, https://doi.org/10.5194/egusphere-egu26-15780, 2026.