EGU24-20636, updated on 11 Mar 2024
https://doi.org/10.5194/egusphere-egu24-20636
EGU General Assembly 2024
© Author(s) 2024. This work is distributed under
the Creative Commons Attribution 4.0 License.

Can Attention Models Surpass LSTM in Hydrology?

Jiangtao Liu, Chaopeng Shen, and Tadd Bindas
  • The Pennsylvania State University, United States of America (jql6620@psu.edu)

Accurate modeling of various hydrological variables is important for water resource management, flood forecasting, and pest control. Deep learning models, especially Long Short-Term Memory (LSTM) models built on recurrent neural network (RNN) structures, have shown significant success in simulating streamflow and soil moisture and in estimating model parameters. With the development of large language models (LLMs) based on attention mechanisms, such as ChatGPT and Bard, we have observed significant advances in fields like natural language processing (NLP), computer vision (CV), and time series prediction. Despite these advances across various domains, the application of attention-based models in hydrology remains relatively limited, with LSTM models maintaining a dominant position in the field. This study evaluates the performance of 18 state-of-the-art attention-based models and their variants in hydrology. We focus on their performance on streamflow, soil moisture, snowmelt, and dissolved oxygen (DO) datasets, comparing them to LSTM models in both long-term and short-term regression and forecasting. We also examine these models' performance under spatial cross-validation. Our findings indicate that while LSTM models remain strongly competitive across various hydrological datasets, attention models offer potential advantages for specific metrics and time lengths, providing valuable insights into applying attention-based models in hydrology. Finally, we discuss the potential applications of foundation models and how these methods can contribute to the sustainable use of water resources and to addressing the challenges of climate change.
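For readers unfamiliar with the mechanism being compared against LSTM, the core operation shared by the attention-based models discussed above is scaled dot-product attention. The following is a minimal NumPy sketch of that operation for illustration only; it is not code from the study, and the toy shapes and seed are arbitrary assumptions:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V, the core attention operation."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # (T_q, T_k) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)      # stabilize the softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ V, weights

# Toy self-attention over a sequence of 4 time steps with 8 features each
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
out, w = scaled_dot_product_attention(x, x, x)
print(out.shape, w.shape)  # (4, 8) (4, 4)
```

Unlike an LSTM, which propagates information step by step through a recurrent state, this operation lets every time step attend directly to every other step, which is one reason attention models may behave differently on long-term versus short-term hydrological prediction.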

How to cite: Liu, J., Shen, C., and Bindas, T.: Can Attention Models Surpass LSTM in Hydrology?, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-20636, https://doi.org/10.5194/egusphere-egu24-20636, 2024.