EGU23-7751
https://doi.org/10.5194/egusphere-egu23-7751
EGU General Assembly 2023
© Author(s) 2023. This work is distributed under
the Creative Commons Attribution 4.0 License.

A Coarse-to-Fine Deep Learning Framework for High-Resolution Future Precipitation Map Generation

Shan Zhao, Zhitong Xiong, and Xiao Xiang Zhu
  • Data Science in Earth Observation, Technical University of Munich, Ottobrunn, Germany (shan.zhao@tum.de)

Precipitation nowcasting, which aims to predict rainfall intensity in the near future (usually 0–2 h) [1], is crucial for urban planning, flood monitoring, agricultural management, and more. Numerical weather prediction (NWP) feeds a variety of data sources into complex computer models that use mathematical equations to simulate the behavior of the atmosphere; limited by the time needed for model spin-up, its performance at short lead times is not satisfactory. Deep learning (DL)-based methods fill this gap by treating nowcasting as a video prediction problem. The Convolutional LSTM [2] extracts spatial information while handling temporal sequences, and Generative Adversarial Network (GAN)-based methods [3] show potential in producing realistic precipitation fields. However, training such models is very time-consuming and data-demanding [3][4]. Unlike natural images, the precipitation field to be estimated usually has a larger spatial size. Moreover, convolutional layers tend to oversmooth the output and eliminate the small-scale patterns that are important for meteorologists' decision-making.

We therefore propose a two-stage framework. In the first stage, an RNN-based model is trained on the coarse field; in the second, the coarse field is downscaled and style-transferred to high-resolution precipitation maps using a GAN and a Graph Convolutional Network (GCN). The coarse prediction acts as a constraint on the finer-scale output and allows re-assignment of the spatial distribution of intensities; such probabilistic output prevents overestimation of intensity. The RNN is good at capturing long-range characteristics, while the GCN [5] extracts local and neighborhood information, so the two branches are naturally complementary and improve both local patterns and global accuracy scores. The GAN makes the final output resemble real precipitation maps such as radar scans.

To train the model, we downloaded the 2006–2016 ERA5 total precipitation at 0.25-degree spatial resolution and the DWD radar maps [6] at 1 km spatial resolution. We expect our model to capture the overall coverage of rainfall events while depicting spatial details. More importantly, this alleviates the data-shortage problem: high-resolution precipitation nowcasts can be obtained at places without ground-based radar stations.
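For illustration only, the sketch below shows how such a coarse-to-fine pipeline could be organized in PyTorch; it is not the authors' implementation. A minimal convolutional-GRU recurrent predictor stands in for the coarse-scale RNN, and a simple upsampling generator with a coarse-consistency term stands in for the refinement stage. All module names, layer widths, and the scale factor of 4 are assumptions for the example, and the adversarial critic and graph-convolution branch are omitted.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CoarsePredictor(nn.Module):
    """Stage 1 (sketch): a ConvGRU-style recurrent predictor on the coarse field."""
    def __init__(self, channels=1, hidden=32):
        super().__init__()
        self.update = nn.Conv2d(channels + hidden, hidden, 3, padding=1)
        self.reset = nn.Conv2d(channels + hidden, hidden, 3, padding=1)
        self.candidate = nn.Conv2d(channels + hidden, hidden, 3, padding=1)
        self.head = nn.Conv2d(hidden, channels, 1)
        self.hidden = hidden

    def forward(self, frames):                      # frames: (B, T, C, H, W)
        b, t, c, h, w = frames.shape
        state = frames.new_zeros(b, self.hidden, h, w)
        for i in range(t):                          # roll the recurrence over time
            x = frames[:, i]
            z = torch.sigmoid(self.update(torch.cat([x, state], 1)))
            r = torch.sigmoid(self.reset(torch.cat([x, state], 1)))
            n = torch.tanh(self.candidate(torch.cat([x, r * state], 1)))
            state = (1 - z) * state + z * n
        return torch.relu(self.head(state))         # next coarse frame, non-negative

class Refiner(nn.Module):
    """Stage 2 (sketch): upsample the coarse prediction towards the radar grid.
    The full framework would add a GCN branch over pixel neighborhoods and an
    adversarial critic trained against real radar scans."""
    def __init__(self, channels=1, scale=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Upsample(scale_factor=scale, mode="bilinear", align_corners=False),
            nn.Conv2d(channels, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, channels, 3, padding=1), nn.ReLU(),
        )
        self.scale = scale

    def forward(self, coarse):
        fine = self.net(coarse)
        # Coarse-consistency constraint: pooling the fine field back to the coarse
        # grid should reproduce the stage-1 prediction, so intensities are only
        # re-assigned spatially rather than inflated.
        pooled = F.avg_pool2d(fine, self.scale)
        return fine, F.mse_loss(pooled, coarse)

# Toy usage with hypothetical shapes: 4 past coarse frames of 32x32 pixels,
# refined by a factor of 4 towards the finer grid.
past = torch.rand(2, 4, 1, 32, 32)
coarse_next = CoarsePredictor()(past)
fine_next, consistency_loss = Refiner()(coarse_next)
print(coarse_next.shape, fine_next.shape, consistency_loss.item())

In the full framework described above, the consistency term is what lets the coarse RNN prediction act as a constraint on the high-resolution output, while the GAN and GCN components shape the fine-scale spatial patterns.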


[1] Shi, Xingjian, et al. "Deep learning for precipitation nowcasting: A benchmark and a new model." Advances in neural information processing systems 30 (2017).

[2] Shi, Xingjian, et al. "Convolutional LSTM network: A machine learning approach for precipitation nowcasting." Advances in neural information processing systems 28 (2015).

[3] Ravuri, Suman, et al. "Skilful precipitation nowcasting using deep generative models of radar." Nature 597.7878 (2021): 672-677.

[4] Sønderby, Casper Kaae, et al. "MetNet: A neural weather model for precipitation forecasting." arXiv preprint arXiv:2003.12140 (2020).

[5] Shi, Yilei, Qingyu Li, and Xiao Xiang Zhu. "Building segmentation through a gated graph convolutional neural network with deep structured feature embedding." ISPRS Journal of Photogrammetry and Remote Sensing 159 (2020): 184-197.

[6] Ayzel, Georgy, Tobias Scheffer, and Maik Heistermann. "RainNet v1.0: a convolutional neural network for radar-based precipitation nowcasting." Geoscientific Model Development 13.6 (2020): 2631-2644.

How to cite: Zhao, S., Xiong, Z., and Zhu, X. X.: A Coarse-to-Fine Deep Learning Framework for High-Resolution Future Precipitation Map Generation, EGU General Assembly 2023, Vienna, Austria, 24–28 Apr 2023, EGU23-7751, https://doi.org/10.5194/egusphere-egu23-7751, 2023.