curriculum

Differences

This shows you the differences between two versions of the page.

curriculum [2025/07/24 09:20] respai-vic → curriculum [2025/07/31 16:35] (current) – [Period 1] respai-vic
Line 4:
  
 Mandatory:
-  * APM_51438_EP: Refresher in Statistics (Marine Le Morvan)
+  * Refresher in Statistics (APM_51438_EP, Marine Le Morvan, Inria)
  
 1 course among:
-  * CSC_51438_EP: Refresher in Computer Graphics (Marie-Paule Cani), presenting an introduction to 3D Computer Graphics
+  * Refresher in Computer Graphics (CSC_51438_EP, Marie-Paule Cani, EP), presenting an introduction to 3D Computer Graphics
-  * CSC_51440_EP: Refresher in Computer Science (Amal Dev Parakkat)
+  * Refresher in Computer Science (CSC_51440_EP, Amal Dev Parakkat, Telecom Paris)
  
  
Line 22:
  
 1 course among:
-  * Deep Learning (CSC_51054_EP, Michalis Vazirgiannis, EP)
+  * Deep Learning (Recommended option, CSC_51054_EP, Michalis Vazirgiannis, EP)
+  * Digital Representation and Analysis of Shapes (Recommended option, CSC_51074_EP, Mathieu Desbrun and Pooran Memari, EP & Inria)
+  * Signal Processing (Recommended option, APM_51055_EP, Rémi Flamary, EP)
   * Topological Data Analysis (CSC_51056_EP, Steve Oudot, EP & Inria)
-  * Digital Representation and Analysis of Shapes (CSC_51074_EP, Mathieu Desbru and Pooran Memari, EP & Inria) 
-  * Signal Processing (APM_51055_EP, Rémi Flamary, EP) 
  
 + Mandatory non-scientific courses:
Line 41:
  
 2 courses among:
-  * Deep Learning (APM_52183_EP, Kevin Scaman)
+  * Deep Learning (Recommended option, APM_52183_EP, Kevin Scaman)
 +  * Image Synthesis: Theory and Practice (Recommended option, CSC_52084_EP, Tamy Boubekeur, Telecom ParisTech) 
 +  * Statistics in Action (Recommended option, APM_52066_EP, Zacharie Naulet, EP & INRAE)
   * Advanced Deep Learning (CSC_52087_EP, Michalis Vazirgiannis, Vicky Kalogeiton, Johannes Lutzeyer, EP)
-  * Image Synthesis: Theory and Practice (CSC_52084_EP, Tamy Boubekeur, Telecom ParisTech) 
   * Graph Representation Learning (CSC_52072_EP, Johannes Lutzeyer and Michalis Vazirgiannis, EP)
-  * Statistics in Action (APM_52066_EP, Zacharie Naulet, EP & INRAE) 
   * Emerging Topics in Machine Learning (APM_52188_EP, Rémi Flamary, EP)
  
Line 70:
  
 **APM_53440_EP - Advanced unsupervised learning (24h, 2 ECTS), Pierre Latouche (UCA)** (contact: pierre.latouche@uca.fr)
-> ABSTRACT TO BE ADDED
 +This course aims at presenting advanced models and methods from computational statistics and machine learning for unsupervised learning, through the context of directed graphical models. Both the frequentist and Bayesian frameworks will be covered. In particular, we will study clustering methods and we will focus on mixture models and on the expectation maximisation algorithm for inference. The standard model selection criteria will be derived and we will illustrate their use to estimate the model complexity. After having given the theory of directed graphical models, we will show on a series of examples how the classical models from machine learning can be characterized in such a context. We will show how all the standard loss functions in machine learning can be linked to specific directed graphical models, giving rise to new ways to think about problems and to solve them. We will particularly insist on the interest of relying on directed graphical models to evaluate the complexity of the inference task for various models. The variational framework will be presented along with the variational EM and variational Bayes EM algorithms. The inference task of the stochastic block model, with a two-to-one dependency, will be described. The second part of the course will be about the use of deep neural networks in directed graphical models to obtain the so-called deep graphical models. Variational autoencoders and GANs will be presented in such a context along with the corresponding inference strategies. We will finally show on a given example how the main model for social network analysis can be rewritten and extended with deep graphical models. We will point out the modifications in terms of directed graphical models, inference, and applications. In particular, we will describe the use of graph neural networks. Lectures will be given through slides and proofs on the blackboard. In labs, we will implement and test all the approaches seen in the lectures.
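For readers who have not met it before, the E-step/M-step loop for mixture models mentioned in the abstract above can be summarised in a few lines. The following is a minimal sketch under illustrative assumptions (a two-component, one-dimensional Gaussian mixture and a fixed number of iterations); it is not part of the course material.

<code python>
# Minimal EM sketch for a 1-D Gaussian mixture (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 0.5, 200)])

K = 2
w = np.full(K, 1.0 / K)          # mixing proportions
mu = rng.choice(x, size=K)       # initial means drawn from the data
var = np.full(K, x.var())        # initial variances

for _ in range(50):
    # E-step: responsibility of each component for each data point
    dens = np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    resp = w * dens
    resp /= resp.sum(axis=1, keepdims=True)

    # M-step: re-estimate parameters from responsibility-weighted data
    nk = resp.sum(axis=0)
    w = nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk

print("weights:", w, "means:", mu, "variances:", var)
</code>

A practical implementation would work with log-densities (log-sum-exp) for numerical stability and stop on convergence of the log-likelihood rather than after a fixed number of iterations.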
 + 
 + 
  
          
-**APM_53441_EP - Learning with tabular data (24h, 2 ECTS), Marine Le Morvan** (contact: marine.le-morvan@polytechnique.edu)
+**APM_53441_EP - From Boosting to Foundation Models: learning with Tabular Data (24h, 2 ECTS), Marine Le Morvan** (contact: marine.le-morvan@polytechnique.edu)
-ABSTRACT TO BE ADDED
+This course offers a modern overview of statistical learning with tabular data, from classical tree-based models to emerging deep and foundation models. We will review the foundations of gradient boosting methods and their scalable implementations, cover recent deep learning models tailored for tabular data, and introduce tabular foundation models. In particular, we will discuss the limitations of LLMs on structured data, introduce the concept of in-context learning, and provide an in-depth understanding of novel tabular foundation models, including their architecture and pretraining strategies. The course will also address key practical challenges in real-world datasets and applications, such as encoding heterogeneous feature types (categorical, numerical, temporal, textual), strategies for handling missing data, and methods for evaluating and calibrating predictive uncertainty.
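As a concrete counterpart to the gradient-boosting baseline and the practical issues (missing values, categorical features) mentioned in the abstract above, here is a minimal sketch. The synthetic dataframe, its column names, and the choice of scikit-learn's HistGradientBoostingClassifier (version 1.4 or later for categorical_features="from_dtype") are assumptions of this example, not material prescribed by the course.

<code python>
# Minimal gradient-boosting sketch on a synthetic tabular task (illustrative only).
import numpy as np
import pandas as pd
from sklearn.ensemble import HistGradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "age": rng.integers(18, 80, n).astype(float),
    "income": rng.lognormal(10, 0.5, n),
    "city": rng.choice(["paris", "lyon", "nice"], n),
})
df.loc[rng.random(n) < 0.1, "income"] = np.nan   # simulate missing data
y = (df["age"] * 0.02 + np.log(df["income"]).fillna(10) * 0.1
     + (df["city"] == "paris") * 0.5 + rng.normal(0, 0.3, n)) > 2.3

df["city"] = df["city"].astype("category")
X_tr, X_te, y_tr, y_te = train_test_split(df, y, random_state=0)

# Trees split on binned features; missing values get their own branch and
# pandas "category" columns are handled natively (no one-hot encoding needed).
clf = HistGradientBoostingClassifier(categorical_features="from_dtype")
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
</code>

Histogram-based gradient boosting bins numerical features, which is what makes these implementations scale to large tabular datasets.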
  
  
Line 129 → Line 133:
 ==== Period 3 ====
  
-MAP/INF690 - Internship either in the R&D department of a company or in a research lab (5 to 6 months, 24 ECTS).
+INT_54490_EP - Internship either in the R&D department of a company or in a research lab (5 to 6 months, 24 ECTS).
  
  