SC4.6 | Introduction to Information Theory


Since Claude Shannon coined the term 'Information Entropy' in 1948, Information Theory has become a central language and framework for the information age. Across disciplines, it can be used for i) characterizing systems, ii) quantifying the information content in data and theory, iii) evaluating how well models can learn from data, and iv) measuring how well models do in prediction. Due to their generality, concepts and measures from Information Theory can be applied to both knowledge- and data-based modelling approaches, and combinations thereof, which makes them very useful in the context of Machine Learning and hybrid modelling.
In this short course, we will introduce the key concepts and measures of Information Theory (Information, Entropy, Conditional Entropy, Mutual Information, Cross Entropy and Kullback-Leibler divergence), with practical examples of how they have been applied in Earth Science, and give a brief introduction to available open-source software.
This course assumes no previous knowledge or experience with Information Theory and welcomes all who are intrigued to learn more about this powerful theory.
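To give a flavour of the measures named above, the following is a minimal sketch (not part of the course materials) of how the core quantities can be computed for discrete probability distributions. The function names and the use of NumPy are illustrative assumptions, not the software introduced in the course.

```python
# Illustrative sketch of the core Information Theory measures for
# discrete distributions, using only NumPy. Logarithms are base 2,
# so all quantities are in bits.
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # convention: 0 * log(0) = 0
    return -np.sum(p * np.log2(p))

def kl_divergence(p, q):
    """Kullback-Leibler divergence D_KL(p || q) in bits.

    Requires q > 0 wherever p > 0.
    """
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

def cross_entropy(p, q):
    """Cross entropy H(p, q) = H(p) + D_KL(p || q)."""
    return entropy(p) + kl_divergence(p, q)

def mutual_information(pxy):
    """Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y),
    computed from a joint distribution given as a 2-D array."""
    pxy = np.asarray(pxy, dtype=float)
    px = pxy.sum(axis=1)              # marginal distribution of X
    py = pxy.sum(axis=0)              # marginal distribution of Y
    return entropy(px) + entropy(py) - entropy(pxy.ravel())

# A fair coin carries exactly one bit of entropy:
print(entropy([0.5, 0.5]))           # -> 1.0
```

For example, the mutual information of two independent fair coins (joint distribution with all four outcomes at probability 0.25) is zero, reflecting that knowing one coin tells us nothing about the other.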

Co-organized by AS6/HS11/NP9
Convener: Uwe Ehret | Co-convener: Stephanie Thiesen
Fri, 28 Apr, 08:30–10:15 (CEST)
Room -2.85/86

Session assets

The oral presentations are given in a hybrid format supported by a Zoom meeting featuring on-site and virtual presentations. The button to access the Zoom meeting appears just before the time block starts.