Information Theory course


News:
- Website updated for the new year 2024 (subscribe to the mailing list!)




General information:
Schedule: 7 sessions, each consisting of a 2-hour lesson followed by a 1-hour practical session, on Friday mornings (9:00-12:00), starting March 8th.
Grading: based on the weekly exercises (to be handed in by the next session).

Teachers:
General goal of the course: to provide information theory tools for machine learning.

Short summary:
We will introduce the concept of entropy, which leads to a measure of discrepancy between distributions (the Kullback-Leibler divergence). We will then study the equivalence between compression, prediction, and generation. In the third part, we will formalize the preference for simpler models through Kolmogorov complexity. Finally, we will get a glimpse of information geometry (the Fisher metric).
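As a concrete illustration of the first notions (not part of the course materials), here is a minimal Python sketch computing Shannon entropy and Kullback-Leibler divergence for discrete distributions; the function names and the coin example are purely illustrative. It also hints at the compression connection: H(p) + KL(p||q) is the expected code length, in bits, when data drawn from p are encoded with a code optimal for q.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) = -sum_x p(x) log2 p(x), in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # convention: 0 * log 0 = 0
    return float(-np.sum(p * np.log2(p)))

def kl_divergence(p, q):
    """KL(p || q) = sum_x p(x) log2(p(x) / q(x)), in bits (asymmetric, >= 0)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0                      # terms with p(x) = 0 contribute nothing
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

# Illustrative example: a biased coin p, modelled by a fair coin q.
p = [0.9, 0.1]
q = [0.5, 0.5]
print(entropy(p))                        # ~0.47 bits: intrinsic uncertainty of p
print(kl_divergence(p, q))               # ~0.53 bits: extra cost of using the wrong model q
print(entropy(p) + kl_divergence(p, q))  # 1.0 bit: cross-entropy, expected code length under q
```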
The rest of this page gives an idea of the topics the course will cover, though changes may still happen.
Materials from previous years that have not yet been updated are marked with the year, e.g. "[2021]".


Most chapters will span two lessons.

Each chapter includes a summary, exercises, references, etc.

Chapter 1 : Entropy

Chapter 2 : Compression/Prediction/Generation equivalence

Chapter 3 : Fisher information

Chapter 4 : Kolmogorov complexity




