Two Methods for Constraining Neural Differential Equations
Alistair White, Niki Kilbertus, Maximilian Gelbrecht, and Niklas Boers
- (1) Technical University of Munich, Munich, Germany
- (2) Potsdam Institute for Climate Impact Research, Potsdam, Germany
- (3) Helmholtz AI, Helmholtz Munich, Munich, Germany
- (4) Department of Mathematics and Global Systems Institute, University of Exeter, Exeter, UK
Neural differential equations (NDEs) provide a powerful and general framework for interfacing machine learning with numerical modeling. However, constraining NDE solutions to obey known physical priors, such as conservation laws or restrictions on the admissible states of the system, has remained a challenging open problem. We present stabilized NDEs (SNDEs) [1], the first method for imposing arbitrary explicit constraints in NDE models. Alongside robust theoretical guarantees, we demonstrate the effectiveness of SNDEs across a variety of settings and with diverse classes of constraints. In particular, SNDEs exhibit vastly improved generalization and stability compared to unconstrained baselines. Building on this work, we also present constrained NDEs (CNDEs), a novel and complementary method with fewer hyperparameters and stricter constraint enforcement. We compare and contrast the two methods, highlighting their relative merits and offering an intuitive guide to choosing the best method for a given application.
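To make the two flavors of constraint handling concrete, below is a minimal JAX sketch. The stabilized right-hand side follows the construction described in [1]: the learned vector field is augmented with a term that drives the constraint residual g(u) back toward zero at rate gamma. The projected variant illustrates a generic tangent-space projection that keeps g exactly conserved along the flow; it is shown only for contrast and is not claimed to be the CNDE construction. All names here (f_theta, g, the specific constraint) are illustrative assumptions, not a reference implementation.

```python
import jax
import jax.numpy as jnp

def f_theta(params, u):
    # Stand-in for a learned vector field; here a simple linear model.
    return params @ u

def g(u):
    # Illustrative constraint: conserve the quadratic invariant u.u = 1,
    # expressed as g(u) = 0 on the constraint manifold.
    return jnp.atleast_1d(u @ u - 1.0)

def stabilized_rhs(params, u, gamma=10.0):
    # Stabilization in the spirit of SNDEs [1]: add a term that pulls the
    # state back toward {g(u) = 0} at rate gamma, built from the
    # Moore-Penrose pseudoinverse of the constraint Jacobian.
    J = jax.jacobian(g)(u)                    # shape (n_constraints, dim)
    correction = jnp.linalg.pinv(J) @ g(u)    # shape (dim,)
    return f_theta(params, u) - gamma * correction

def projected_rhs(params, u):
    # Generic alternative shown for contrast: project f_theta onto the
    # tangent space of {g(u) = 0}, so that dg/dt = 0 exactly along the flow.
    J = jax.jacobian(g)(u)
    f = f_theta(params, u)
    return f - jnp.linalg.pinv(J) @ (J @ f)

def rollout(params, u0, dt=1e-2, steps=1000):
    # Simple explicit-Euler rollout; in practice one would use an adaptive
    # ODE solver (e.g., diffrax) and differentiate through it for training.
    def step(u, _):
        u_next = u + dt * stabilized_rhs(params, u)
        return u_next, u_next
    _, traj = jax.lax.scan(step, u0, jnp.arange(steps))
    return traj

# Example usage (illustrative):
key = jax.random.PRNGKey(0)
params = jax.random.normal(key, (3, 3)) * 0.1
u0 = jnp.array([1.0, 0.0, 0.0])   # starts on the manifold u.u = 1
traj = rollout(params, u0)
```

The sketch highlights the trade-off the abstract alludes to: the stabilization term introduces a hyperparameter (gamma) but leaves the learned dynamics otherwise untouched, whereas the projection enforces the constraint exactly with no tunable rate.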
[1] Alistair White, Niki Kilbertus, Maximilian Gelbrecht, Niklas Boers. Stabilized neural differential equations for learning dynamics with explicit constraints. In Advances in Neural Information Processing Systems, 2023.
How to cite: White, A., Kilbertus, N., Gelbrecht, M., and Boers, N.: Two Methods for Constraining Neural Differential Equations, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-16149, https://doi.org/10.5194/egusphere-egu24-16149, 2024.