Since Claude Shannon introduced the concept of information entropy in 1948, Information Theory has become a central language and framework of the information age. Across disciplines, it can be used for i) characterizing systems, ii) quantifying the information content in data and theory, iii) evaluating how well models can learn from data, and iv) measuring how well models perform in prediction. Because of their generality, the concepts and measures of Information Theory can be applied to knowledge-based and data-based modelling approaches, and to combinations thereof, which makes them particularly useful in the context of Machine Learning and hybrid modelling.
In this short course, we will introduce the key concepts and measures of Information Theory (Information, Entropy, Conditional Entropy, Mutual Information, Cross-Entropy, and Kullback-Leibler divergence), illustrate them with practical examples of how they have been applied in the Earth sciences, and give a brief introduction to available open-source software.
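As a minimal illustration of how these measures relate to one another, the following Python sketch computes them for a small discrete joint distribution. It relies only on numpy and scipy.stats.entropy; the probability table and variable names are hypothetical, chosen purely for demonstration, and are not taken from the course material.

```python
import numpy as np
from scipy.stats import entropy

# Hypothetical joint probability table p(x, y) for two binary variables,
# e.g. binned occurrence of rain (X) and runoff (Y); values are illustrative.
p_xy = np.array([[0.30, 0.10],
                 [0.05, 0.55]])

p_x = p_xy.sum(axis=1)                # marginal distribution p(x)
p_y = p_xy.sum(axis=0)                # marginal distribution p(y)

h_x = entropy(p_x, base=2)            # Shannon entropy H(X), in bits
h_y = entropy(p_y, base=2)            # H(Y)
h_xy = entropy(p_xy.ravel(), base=2)  # joint entropy H(X, Y)

cond_h = h_xy - h_x                   # conditional entropy H(Y | X)
mi = h_x + h_y - h_xy                 # mutual information I(X; Y)

# Kullback-Leibler divergence of p(x) from a hypothetical reference model
# q(x); cross entropy then follows as H(p, q) = H(p) + D_KL(p || q).
q_x = np.array([0.5, 0.5])
kl = entropy(p_x, q_x, base=2)
cross_h = h_x + kl

print(f"H(X)={h_x:.3f} bits, H(Y|X)={cond_h:.3f}, I(X;Y)={mi:.3f}")
print(f"D_KL(p||q)={kl:.3f} bits, H(p,q)={cross_h:.3f}")
```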
This course assumes no previous knowledge of or experience with Information Theory, and welcomes anyone intrigued to learn more about this powerful theory.