formerly known as Deep Learning in Practice (but there is as much theory as practice)
MVA and MscAI master's programs, and CentraleSupelec curriculum, January-March 2026
News: (reload for fresher news...)
- TP2 is online! -- sorry for the delay
- presentation of TP2 added
- session 2 cancelled (see below)
- session 1's notes and recordings added
- project guidelines added
- TP1 added
- good practice guidelines added
- First session: Thursday January 22nd, 9h45, théâtre Rousseau (room e.070), Bouygues building
- Location updated
- 2026 schedule updated
- Registration for the 2026 course: click here! The previous course (January-March 2025) can be found there; the 2026 course will follow broadly the same lines, though not exactly.
To attend the 2026 course (starting in January 2026), please register here first.
NB: Auditors (auditeurs libres) are welcome; just register as well (no need to ask me by email).
To just access the recordings of previous courses (without attending the course), register here instead.
Requirements: a previous course on neural networks (this is an advanced deep learning course).
Typical mathematical notions used: differential calculus, Bayesian statistics, analysis, information theory.
Teaching team:
Most lectures: Guillaume Charpiat
Practical sessions / project supervision: Félix Houdouin, Theofanis Ifaistos, Théo Rudkiewicz, Jules Soria and Luca Teodorescu (including materials by many great previous teachers: Victor Berger, Alessandro Bucci, Styliani Douka, Loris Felardos, Rémy Hosseinkhan, Wenzhuo Liu, Matthieu Nastorg, Francesco Pezzicoli, Cyriaque Rousselot and Antoine Szatkownik)
Course validation: we offer the choice between 2 options:
either by practicals + project:
5 practicals (notebooks), presented in class, to be done at home in groups of 1-3 people, and handed in within 2 weeks
a project involving several concepts from the course, done in groups of 1-3 people, with a short report to write + a defense
or by exam:
1 final exam (on paper, in class, individually)
Schedule: note that the schedule is irregular and that locations vary.
Sessions take place on Thursdays, 9h45 - 13h (a 3-hour course + a 15-minute break), at various locations at CentraleSupelec and ENS Paris-Saclay:
Session 1: Thursday January 22nd (théâtre Rousseau, Bouygues building -- known as room e.070)
Session 2: Thursday January 29th (amphi Hodgkin, ENS Paris-Saclay -- known as 0i10) session cancelled: watch the recordings online, then come ask questions and discover the TP at 11h (the video-call link will be sent by email + posted on the Discord forum)
Session 3: Thursday February 5th (amphi Hodgkin, ENS)
Session 4: Thursday February 12th (amphi Hodgkin, ENS)
3-week break
Session 5: Thursday March 5th (amphi Hodgkin, ENS)
Session 6: Thursday March 12th (amphi Hodgkin, ENS)
Session 7: Thursday March 19th (amphi Hodgkin, ENS)
Session 8: Thursday March 26th (amphi Hodgkin, ENS) - Final exam
Location info: bus stop "Moulon" + a 4-minute walk to the Bouygues building or to the ENS Paris-Saclay building (amphi Hodgkin, room 0i10, ground floor, west building, accessible from the garden).
Public transport route planning: Île de France mobilités.
Program: (subject to a few potential modifications)
Session 1: Deep learning vs. classical machine learning and optimization → Practical session: hyperparameters and training basics (PyTorch refresher): instructions and the 2 problems of TP1: Pb 1, and Pb 2 with its dataset. NB: TP1 will not be graded. → Good practice: github repo with general MLOps advice (code repo management with git, logging seeds and arguments of experiments, job schedulers to use computing clusters...; a minimal seed-logging sketch follows the program list) → Practical tips and tricks to train neural networks → Environment used in practical sessions → [2026] Course notes (pdf) (handwritten with drawings) and lesson summary (html) → [2026] Video recording: part 1 [140MB] and part 2 [75MB]
Session 3: Architectures → [2023] Course notes (pdf) part 1 + part 2 (handwritten with drawings) → [2025] lesson summary → [2023] Video recording: part 1 [500MB] (theory: prior, initialization, ...) and part 2 [500MB] (architecture zoo, attention, graph-NN)
Session 4: Issues with datasets (biases, privacy...) → [2023] Course notes (pdf) (handwritten with drawings) and lesson summary (html) → [2023] Video recording: part 3 [700MB], part 4 [400MB]
Session 5: Small data and frugal AI: weak supervision, transfer, compression and incorporation of priors → [2023] Course notes (pdf) (handwritten with drawings) and lesson summary → [2023] Video recording: part 1 [500MB] and part 2 [500MB]
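As an illustration of the "logging seeds and arguments of experiments" advice from Session 1, here is a minimal sketch in Python/PyTorch. It is an assumed example only, not taken from the course's MLOps repo; the script flags and file names are made up.

```python
# Minimal reproducibility sketch (illustrative; flags and file names are
# hypothetical): log the seed and hyperparameters of a run, then seed all RNGs.
import argparse
import json
import random

import numpy as np
import torch

def parse_args():
    parser = argparse.ArgumentParser(description="Example training script")
    parser.add_argument("--seed", type=int, default=0)
    parser.add_argument("--lr", type=float, default=1e-3)
    parser.add_argument("--batch-size", type=int, default=64)
    return parser.parse_args()

def set_seed(seed: int) -> None:
    # Seed Python, NumPy and PyTorch (CPU and all GPUs).
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)

if __name__ == "__main__":
    args = parse_args()
    set_seed(args.seed)
    # Save the full configuration next to the run's outputs,
    # so that every experiment can be rerun with identical settings.
    with open("run_config.json", "w") as f:
        json.dump(vars(args), f, indent=2)
```

Storing this json alongside checkpoints and logs makes it easy to trace any result back to the exact hyperparameters and seed that produced it.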
Example project topics:
Guaranteeing the absence of privacy leakage of generators trained on sensitive data, using an Extreme Value Theory framework on distances to closest neighbors (a minimal distance-computation sketch follows this list), with Cyril Furtlehner (see arXiv 2510.24233)
Links between explainability, frugal AI, robustness and formal proofs of neural networks: looking for statistically meaningful concepts and enhancing them. In collaboration with PEPR IA SAIF.
and many other interesting topics, on request (deep learning to speed up fluid mechanics simulations, for dynamical systems, for physics; learning causality, ...)
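For the first project topic above (privacy of generators), the raw statistic is the distance from each generated sample to its closest training example. Below is a minimal sketch of how one might compute it in PyTorch; it is an assumed illustration only (the Extreme Value Theory analysis of arXiv 2510.24233 is not reproduced), and all names are hypothetical.

```python
# Illustrative sketch: distances from generated samples to their nearest
# training examples -- the raw statistic behind memorization/privacy checks.
# (The Extreme Value Theory framework of the referenced paper is NOT shown.)
import torch

def nearest_neighbor_distances(generated: torch.Tensor,
                               train: torch.Tensor) -> torch.Tensor:
    # generated: (n_gen, d); train: (n_train, d); samples flattened to vectors.
    dists = torch.cdist(generated, train)  # all pairwise Euclidean distances
    return dists.min(dim=1).values         # distance to closest training point

if __name__ == "__main__":
    torch.manual_seed(0)
    train = torch.randn(1000, 32)      # stand-in for sensitive training data
    generated = torch.randn(200, 32)   # stand-in for generator outputs
    d = nearest_neighbor_distances(generated, train)
    # Abnormally small minima would hint at memorized training points.
    print(f"min={d.min().item():.3f}, median={d.median().item():.3f}")
```

The EVT framework mentioned in the topic would then analyze the distribution of such closest-neighbor distances; see the arXiv reference for the actual method.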