Potentials and challenges of using Explainable AI for understanding atmospheric circulation
- 1 Know-Center, Research Center for Data-Driven Business and Artificial Intelligence, Graz, Austria
- 2 Department of Geography and Regional Sciences, University of Graz, Graz, Austria
- 3 Institute of Interactive Systems and Data Science, Graz University of Technology, Graz, Austria
Machine Learning (ML) and AI techniques, especially methods based on Deep Learning, have long been considered black boxes that may be good at making predictions but not at explaining them. This has changed recently, with more techniques becoming available that explain predictions made by ML models, known as Explainable AI (XAI). These techniques have also been adopted in climate science, because they have the potential to help us understand the physics behind geoscientific phenomena. It is, however, unclear how large that potential really is and how these methods can be incorporated into the scientific process. In our study, we address the exemplary research question of which aspects of the large-scale atmospheric circulation affect specific local conditions. We compare the answers to this question obtained with a range of methods, from the traditional approach of targeted data analysis grounded in physical knowledge (such as dimensionality reduction based on physical reasoning) to purely data-driven, physics-unaware methods using Deep Learning combined with XAI techniques. Based on these insights, we discuss the usefulness and potential pitfalls of XAI for understanding and explaining phenomena in the geosciences.
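To make the data-driven side of this comparison concrete, the sketch below is a minimal, hypothetical illustration (not the study's actual setup): it trains a small neural network to predict a local scalar from synthetic gridded circulation fields and then computes a gradient-based saliency map, one of the simplest XAI attribution methods. All shapes, variable names, and data are assumptions made for illustration.

```python
import torch
import torch.nn as nn

# Illustrative only: synthetic "circulation" fields and a toy local target.
# Shapes, names, and the model are assumptions, not the study's actual setup.
n_samples, n_lat, n_lon = 256, 32, 64
X = torch.randn(n_samples, 1, n_lat, n_lon)        # e.g. pressure anomalies
y = X[:, 0, 16, 32:40].mean(dim=1, keepdim=True)   # toy "local condition"

model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(8), nn.Flatten(),
    nn.Linear(8 * 8 * 8, 1),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):                               # short toy training loop
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(X), y)
    loss.backward()
    optimizer.step()

# Gradient ("saliency") attribution: which input grid points most influence
# the prediction for one sample? This is a basic XAI technique.
x = X[:1].clone().requires_grad_(True)
model(x).sum().backward()
saliency = x.grad[0, 0].abs()                      # (n_lat, n_lon) relevance map
print(saliency.shape, float(saliency.max()))
```

By contrast, the traditional, physics-aware route would, for example, project the same fields onto a few leading EOFs (e.g. via principal component analysis) and regress the local variable on those components; comparing a saliency map of this kind with EOF loadings is one concrete way the two approaches can be contrasted.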
How to cite: Scher, S., Trügler, A., and Abermann, J.: Potentials and challenges of using Explainable AI for understanding atmospheric circulation, EGU General Assembly 2023, Vienna, Austria, 24–28 Apr 2023, EGU23-5967, https://doi.org/10.5194/egusphere-egu23-5967, 2023.