Schedule: 6 sessions (a 2.5-hour lesson followed by a 1-hour practical session), on Friday mornings 9:00-12:30, starting on March 7th. Grading: based on the weekly exercises (to be handed in by the next session)
Teachers:
Guillaume Charpiat (lectures)
Stella Douka (practical sessions)
General goal of the course:
to provide information-theoretic tools for machine learning.
Short summary: We will introduce the concept of entropy, which leads to a measure of distance between distributions (the Kullback-Leibler divergence). We will then study the equivalence between compression, prediction and generation. In a third part, we will formalize the preference for simpler models through Kolmogorov complexity. Finally, we will get a glimpse of information geometry (the Fisher metric).
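As an illustration (not taken from the course materials), here is a minimal Python sketch of two of the quantities mentioned above, Shannon entropy and the Kullback-Leibler divergence, for discrete distributions. The use of natural logarithms (nats) and the example distributions p and q are assumptions made purely for this example.

    # Minimal sketch: entropy and KL divergence of discrete distributions (in nats).
    import numpy as np

    def entropy(p):
        """Shannon entropy H(p) = -sum_i p_i log p_i, with the convention 0 log 0 = 0."""
        p = np.asarray(p, dtype=float)
        nz = p > 0
        return -np.sum(p[nz] * np.log(p[nz]))

    def kl_divergence(p, q):
        """KL(p || q) = sum_i p_i log(p_i / q_i); requires q_i > 0 wherever p_i > 0."""
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        nz = p > 0
        return np.sum(p[nz] * np.log(p[nz] / q[nz]))

    p = [0.5, 0.25, 0.25]          # example distribution (assumed for illustration)
    q = [1/3, 1/3, 1/3]            # uniform reference distribution
    print(entropy(p))              # ~1.0397 nats
    print(kl_divergence(p, q))     # >= 0, and 0 iff p == q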
The rest of this page gives an idea of the topics the course will cover, but changes may happen.
Materials from previous years that have not yet been updated are marked with a prefix such as "[2021]".
Most chapters will span two lessons
(click on 'details' to show/hide summary, exercises, references, etc.)
Chapter 1: Entropy