EGU General Assembly 2023
© Author(s) 2023. This work is distributed under
the Creative Commons Attribution 4.0 License.

Physics-Constrained Deep Learning for Downscaling

Paula Harder1,2,3, Venkatesh Ramesh2,4, Alex Hernandez-Garcia2,4, Qidong Yang2,5, Prasanna Sattigeri6, Daniela Szwarcman7, Campbell Watson6, and David Rolnick2,8
  • 1Fraunhofer ITWM, Kaiserslautern, Germany
  • 2Mila Quebec AI Institute, Montreal, Canada
  • 3University of Kaiserslautern, Kaiserslautern, Germany
  • 4University of Montreal, Montreal, Canada
  • 5New York University, New York, US
  • 6IBM Research US, New York, US
  • 7IBM Research Brazil, Brazil
  • 8McGill University, Montreal, Canada

The availability of reliable, high-resolution climate and weather data is important both to inform long-term decisions on climate adaptation and mitigation and to guide rapid responses to extreme events. Forecasting models are limited by computational costs and therefore often generate coarse-resolution predictions. Statistical downscaling can provide an efficient method of upsampling low-resolution data. In this field, deep learning has been applied successfully, often using image super-resolution methods from computer vision. However, despite achieving visually compelling results in some cases, such models frequently violate conservation laws when predicting physical variables. In order to conserve physical quantities, we develop methods that guarantee physical constraints are satisfied by deep learning downscaling models while also improving their performance according to traditional metrics. We compare different constraining approaches and demonstrate their applicability across different neural architectures as well as a variety of climate and weather data sets, including ERA5 and WRF data sets.
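As an illustration of the kind of hard constraint described above, the following sketch shows one simple way a conservation constraint can be enforced as a final, parameter-free layer: each s × s block of the super-resolved field is additively corrected so that its mean exactly matches the corresponding low-resolution cell. This is a minimal NumPy example written for this summary; the function name and additive formulation are illustrative, and the work compares several constraining variants.

```python
import numpy as np

def enforce_mean_constraint(y_hr, x_lr, s):
    """Additively correct a super-resolved field y_hr (shape (H*s, W*s))
    so that the mean of each s x s block equals the corresponding cell of
    the low-resolution input x_lr (shape (H, W)). This guarantees the
    conservation constraint exactly, regardless of the network's output."""
    H, W = x_lr.shape
    # Mean of each s x s block of the network output.
    block_mean = y_hr.reshape(H, s, W, s).mean(axis=(1, 3))
    # Broadcast the per-block correction back to high resolution.
    correction = np.kron(x_lr - block_mean, np.ones((s, s)))
    return y_hr + correction
```

Because the correction only shifts each block by a constant, the high-resolution spatial pattern produced by the network is preserved while the coarse-scale quantity is conserved exactly.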

How to cite: Harder, P., Ramesh, V., Hernandez-Garcia, A., Yang, Q., Sattigeri, P., Szwarcman, D., Watson, C., and Rolnick, D.: Physics-Constrained Deep Learning for Downscaling, EGU General Assembly 2023, Vienna, Austria, 24–28 Apr 2023, EGU23-4350, 2023.

Supplementary materials

Supplementary material file