CS 598: Principles of Generative AI

Spring 2024

Course Description

Recent advances in generative AI have given machine learning algorithms the ability to learn from observed data and generate new, similar data instances. This course provides an in-depth exploration of the key algorithmic developments in generative models, together with their underlying mathematical principles. Topics include normalizing flows, variational autoencoders, Langevin algorithms, generative adversarial networks, diffusion models, and sequence generation models.

Basic Information

Lectures

Lecture Topic
1 Introduction
2 Basic Neural Network Models and Optimization
3 Energy-Based Model
4 Variational Inference
5 Encoder-Decoder and Auto-Encoder
6 Variational Auto-Encoder
7 Normalizing Flow Basics
8 Variational Normalizing Flow and Sampling Basics
9 Markov Chain Monte Carlo
10 Langevin Algorithms and SDE
11 Hamiltonian Monte Carlo and Underdamped Langevin Algorithm
12 Distance, Generation and Convergence
13 Score Matching
14 Diffusion Model (basics)
15 Diffusion Model (reverse process)
16 Diffusion Model (flow-based generation)
17 Diffusion Model (practice)
18 GAN (basics)
19 GAN (practice)
20 Neural ODE
21 Disentanglement and Representation Learning
22 Basic Sequence Models
23 Transformer
24 Sequence-Based Image Generation