- ¹Peking University, Shenzhen Graduate School, School of Environment and Energy (caoxiaoyan@stu.pku.edu.cn)
- ²Tsinghua University, Tsinghua-Berkeley Shenzhen Institute
Over the past thirty years, floods have been the most frequent natural disaster globally, affecting billions of people and causing trillions of dollars in losses. Fast, accurate, high-resolution forward and inverse flood modeling is urgently needed to mitigate these socioeconomic impacts. Although machine learning offers fast flood simulations, high-resolution spatiotemporal modeling is fundamentally constrained by data scarcity and physical inconsistency. Here we present a generative physical distillation neural network (GPDNN) that distills physical laws into neural networks via multi-path parallel generation, and we theoretically prove that it can approximate universal flood dynamic systems without observations. GPDNN is the first unified flood modeling network that supports both forward forecasting (i.e., extrapolation of seen events and generalization to unseen events) and inverse parameter estimation for common flood types. Specifically, GPDNN is the first observation-free machine learning method to provide near-instantaneous, globally physically consistent forward forecasts of flood dynamics up to 24 h ahead at high spatiotemporal resolution. Simulations of dam-break floods, riverine floods, and urban inundation incur errors one to two orders of magnitude lower than those of existing methods based on dense observations. Furthermore, GPDNN is accurate and robust in inverse analyses of hydraulic parameters under both spatially homogeneous and heterogeneous conditions, even with sparse or noisy observations. Our work has knock-on benefits for flood risk assessment, forecasting, and management.
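The abstract does not specify GPDNN's architecture or loss, but the core idea of "observation-free" physical distillation can be illustrated with a minimal, hypothetical sketch: a surrogate network is trained against the squared residual of a governing flow equation evaluated at sampled collocation points, so no measured flood data enters the loss. All names here (`depth`, `physics_loss`, the toy network, and the 1-D advection form of mass conservation used as a stand-in PDE) are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy MLP surrogate h(x, t) for water depth; weights are illustrative only.
W1 = rng.normal(size=(2, 16)); b1 = np.zeros(16)
W2 = rng.normal(size=(16, 1)); b2 = np.zeros(1)

def depth(x, t):
    """Predicted water depth at space-time points (x, t)."""
    z = np.tanh(np.stack([x, t], axis=-1) @ W1 + b1)
    return (z @ W2 + b2).squeeze(-1)

def physics_loss(x, t, u=1.0, eps=1e-3):
    """Observation-free loss: squared residual of a 1-D advection form of
    mass conservation, dh/dt + u * dh/dx = 0, estimated with central finite
    differences. No observed flood data appears anywhere in this loss."""
    dh_dt = (depth(x, t + eps) - depth(x, t - eps)) / (2 * eps)
    dh_dx = (depth(x + eps, t) - depth(x - eps, t)) / (2 * eps)
    return np.mean((dh_dt + u * dh_dx) ** 2)

# Collocation points sampled over the space-time domain, not measurements.
x = rng.uniform(0.0, 1.0, size=256)
t = rng.uniform(0.0, 1.0, size=256)
loss = physics_loss(x, t)
print(f"physics residual loss: {loss:.6f}")
```

Minimizing such a residual over the network weights is what lets a physics-distilled model train without observations; in practice the governing equations would be the shallow-water equations and the derivatives would come from automatic differentiation rather than finite differences.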
How to cite: Cao, X., Yao, Y., and Qin, H.: Unified flood modeling with generative physical distillation neural network under data scarcity, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-21913, https://doi.org/10.5194/egusphere-egu26-21913, 2026.