MVA, MScAI and DSBA masters, and CentraleSupelec cursus, January-March 2024
News: (reload for fresher news...)
- Comments about this class are welcome here!
- Last session this Monday 25th morning, with guest: Romain Égelé
The previous course (January-March 2023) can be found there; this course follows more or less the same lines, though not exactly.
To attend the course, please register here first.
NB: Auditors (auditeurs libres) are welcome; just subscribe as well (no need to ask me by email).
Requirements: having already followed a course on neural networks (this is an advanced deep learning course).
Typical mathematical notions used: differential calculus, Bayesian statistics, analysis, information theory.
Teaching team:
Most lectures: Guillaume Charpiat
Practical sessions: Rémy Hosseinkhan, Cyriaque Rousselot and Antoine Szatkownik (incl. materials by Victor Berger, Alessandro Bucci, Loris Felardos, Wenzhuo Liu, Matthieu Nastorg and Francesco Pezzicoli)
Schedule: This year the course is given twice, because of scheduling conflicts between the different masters; each week, you may attend whichever of the two sessions you prefer. Note that the schedule is irregular and that locations vary.
Sessions last 3 hours plus a 15-minute break and are held at various places at CentraleSupelec.
MVA master and CentraleSupelec cursus: at 8h15 on:
Session 1 : Thursday 25th of January (h.207 - h.208 "Salle TP Numérique", Bouygues building)
Session 2 : Thursday 1st of February (Amphi VI.005, Eiffel building)
Session 3 : Monday 5th of February (Amphi sd.014, Bouygues building)
Session 4 : Monday 12th of February (Amphi VI.005, Eiffel building)
2-week break
Session 5 : Monday 4th of March (Amphi e.093, Bouygues building)
Session 6 : Monday 11th of March (Amphi VI.005, Eiffel building)
Session 7 : Monday 18th of March (Amphi sc.046 (Peugeot), Bouygues building)
Session 8 : Monday 25th of March (Amphi /Centre de Langues/, Eiffel building)
Session 1 : Deep learning vs. classical machine learning and optimization
→ Practical session: hyperparameters and training basics: instructions and the two problems of TP1: Pb 1, and Pb 2 with its dataset. NB: TP1 will not be graded.
→ TP1 repos, with general MLOps advice: gitlab, github
→ Link to hand in TP1 (same password as usual)
→ Environment used in practical sessions
→ Practical tips and tricks to train neural networks
→ [2022] Course notes (pdf) (handwritten with drawings) and lesson summary (html)
→ [2022] Video recording: lecture [270MB]
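As a taste of what the hyperparameter practical covers, here is a minimal, hypothetical grid-search sketch. This is not the TP's actual code; the scoring function is a stub standing in for a real train/validate cycle, and the grid values are illustrative only.

```python
import itertools
import random

def validation_score(lr, hidden):
    # Stub: in a real run this would train a network with these
    # hyperparameters and return its validation accuracy.
    random.seed(hash((lr, hidden)) % 2**32)
    return -abs(lr - 1e-2) * 10 - abs(hidden - 64) / 256 + random.random() * 0.01

# Hypothetical search space: learning rate and hidden-layer width.
grid = {"lr": [1e-3, 1e-2, 1e-1], "hidden": [32, 64, 128]}

# Exhaustive grid search: evaluate every configuration, keep the best.
best = max(itertools.product(grid["lr"], grid["hidden"]),
           key=lambda cfg: validation_score(*cfg))
print(best)
```

Random search over the same space (sampling configurations instead of enumerating them) is often preferred when the grid grows large, since it explores each individual hyperparameter more finely for the same budget.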
Session 3 : Interpretability (part 2: biases and issues with datasets)
→ [2023] Course notes (pdf) (handwritten with drawings) and lesson summary (html)
→ [2023] Video recording: part 3 [700MB], part 4 [400MB]
Session 4 : Architectures
→ Practical session on graph-NN: notebook
→ Link to hand in TP3 (same password as usual)
→ [2023] Course notes (pdf) part 1 + part 2 (handwritten with drawings) and lesson summary
→ [2023] Video recording: part 1 [500MB] (theory: prior, initialization, ...) and part 2 [500MB] (architecture zoo, attention, graph-NN)
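To fix ideas on the graph-NN topic before the practical, here is a minimal message-passing layer sketched in NumPy. This is a hypothetical illustration, not the notebook's code: each node averages its neighbours' features, then applies a shared linear map followed by a ReLU.

```python
import numpy as np

def gnn_layer(A, X, W):
    # A: (n, n) adjacency matrix, X: (n, d) node features, W: (d, k) weights.
    deg = A.sum(axis=1, keepdims=True).clip(min=1)  # avoid division by zero
    messages = (A @ X) / deg                        # mean over neighbours
    return np.maximum(0.0, messages @ W)            # shared update + ReLU

# Toy 3-node graph: node 0 is connected to nodes 1 and 2.
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)
X = np.eye(3)           # one-hot node features
W = np.ones((3, 2))     # toy weights
H = gnn_layer(A, X, W)  # new node embeddings, shape (3, 2)
print(H.shape)
```

Stacking such layers lets information propagate one hop further per layer; real architectures differ in the aggregation (sum, max, attention) and in how self-features are combined with the neighbour messages.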
Session 6 : Modeling: deep learning and physics (exploiting known invariances, priors or physical properties)
→ Practical session on learning dynamical systems: notebook + image (included by the notebook)
→ Link to hand in TP5
→ [2023] Course notes (pdf) part 1 + part 2 (handwritten with drawings) and lesson summary
→ [2023] Video recording: part 1 [850MB] and part 2 [500MB]
Designing lighter architectures for generative models based on diffusion flows (same application as previous topic, but focusing on the generative architecture in general)