Introduction to Flow Matching and Diffusion Models 2026

MIT Course 6.S184: Generative AI with Stochastic Differential Equations

Diffusion and flow models are the cutting-edge generative AI methods for images, videos, and many other data types. This course offers a comprehensive introduction for students and researchers seeking a deeper understanding of these models. Lectures teach the core mathematical concepts needed to understand diffusion models, including stochastic differential equations and the Fokker-Planck equation, and provide a step-by-step explanation of each model's components. Labs accompany each lecture, allowing students to gain guided, hands-on experience with the concepts learned. By the end of the class, students will have built a latent diffusion model from scratch and, along the way, will have gained hands-on experience with the mathematical toolbox of stochastic analysis, which is useful in many other fields. This course is ideal for those who want to explore the frontiers of generative AI through a mix of theory and practice. We recommend some prior experience with probability theory and deep learning.

Course Notes

The course notes serve as the backbone of the course and provide a self-contained explanation of all material in the class. We strongly recommend using them. You can view the notes by clicking the button below:

View the course notes here!

To cite these lecture notes, please use:

@misc{flowsanddiffusions2026,
  author       = {Peter Holderrieth and Ezra Erives},
  title        = {Introduction to Flow Matching and Diffusion Models},
  year         = {2026},
  url          = {https://diffusion.csail.mit.edu/},
  eprint       = {2506.02070},
  archivePrefix = {arXiv}
}

Lectures

Lecture Topic Slides Recording
1 Flow and Diffusion Models
  • Introduction to generative models
  • Ordinary and stochastic differential equations
  • Sampling from flow and diffusion models
[slides 1]
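As a taste of the samplers covered in Lecture 1, here is a minimal sketch of the Euler method for a flow ODE and the Euler-Maruyama method for a diffusion SDE. The toy contracting vector field and all function names are illustrative, not taken from the course materials:

```python
import numpy as np

def euler_ode(x0, vector_field, n_steps=100, t1=1.0):
    """Euler method for the ODE dx/dt = u_t(x): the deterministic sampler of a flow model."""
    x, dt = np.array(x0, dtype=float), t1 / n_steps
    for i in range(n_steps):
        t = i * dt
        x = x + dt * vector_field(t, x)  # one Euler step
    return x

def euler_maruyama_sde(x0, drift, sigma, n_steps=100, t1=1.0, rng=None):
    """Euler-Maruyama method for the SDE dX_t = u_t(X_t) dt + sigma dW_t (a diffusion sampler)."""
    if rng is None:
        rng = np.random.default_rng(0)
    x, dt = np.array(x0, dtype=float), t1 / n_steps
    for i in range(n_steps):
        t = i * dt
        # drift step plus a Gaussian increment scaled by sqrt(dt)
        x = x + dt * drift(t, x) + sigma * np.sqrt(dt) * rng.normal(size=x.shape)
    return x

# Toy example: the contracting field u_t(x) = -x drives samples toward 0,
# so euler_ode approximates x(1) = exp(-1) * x(0).
x_ode = euler_ode(np.ones(3), lambda t, x: -x)
x_sde = euler_maruyama_sde(np.ones(3), lambda t, x: -x, sigma=0.1)
```

Both samplers follow the same template: discretize time and repeatedly apply a local update; the SDE sampler simply adds a scaled Gaussian increment at each step.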
2 Flow Matching
  • Conditional and marginal probability path
  • Conditional and marginal vector field
  • Flow matching training objective
[slides 2]
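To preview the training objective from Lecture 2, here is a minimal sketch of one conditional flow matching step using the common linear-interpolation Gaussian path. The interface `model(x, t)`, the toy MLP, and the function name are illustrative assumptions, not the course's reference implementation:

```python
import torch

def conditional_flow_matching_loss(model, x1):
    """One conditional flow matching step for the Gaussian path
    p_t(x | x1) = N(t * x1, (1 - t)^2 I), i.e. x_t = (1 - t) * x0 + t * x1
    with x0 ~ N(0, I). The conditional vector field along that path is
    d/dt x_t = x1 - x0, which serves as the regression target."""
    batch = x1.shape[0]
    t = torch.rand(batch, 1)                       # t ~ Uniform[0, 1]
    x0 = torch.randn_like(x1)                      # sample from p_init = N(0, I)
    x_t = (1 - t) * x0 + t * x1                    # sample from the conditional path
    target = x1 - x0                               # conditional vector field u_t(x | x1)
    return ((model(x_t, t) - target) ** 2).mean()  # mean-squared regression loss

# Usage with a toy MLP (hypothetical architecture, for illustration only):
net = torch.nn.Sequential(torch.nn.Linear(3, 32), torch.nn.SiLU(), torch.nn.Linear(32, 2))
model = lambda x, t: net(torch.cat([x, t], dim=-1))
loss = conditional_flow_matching_loss(model, torch.randn(16, 2))
```

The key idea the lecture develops is that regressing onto the tractable conditional vector field trains the same network, in expectation, as regressing onto the intractable marginal one.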
3-A Score Functions and Score Matching
  • Score functions
  • Denoising score matching
  • SDE sampling
[slides 3]
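For a preview of Lecture 3-A, here is a minimal denoising score matching sketch with a fixed Gaussian corruption kernel. The `score_model(x)` interface and the toy linear network are illustrative assumptions, not the course's exact setup:

```python
import torch

def denoising_score_matching_loss(score_model, x1, sigma=0.5):
    """Denoising score matching: corrupt data with N(0, sigma^2 I) noise and
    regress the network onto the tractable score of the corruption kernel,
    grad_x log N(x; x1, sigma^2 I) = -(x - x1) / sigma^2 = -noise / sigma."""
    noise = torch.randn_like(x1)
    x = x1 + sigma * noise                          # corrupted sample
    target = -noise / sigma                         # score of the Gaussian kernel at x
    return ((score_model(x) - target) ** 2).mean()  # mean-squared regression loss

# Toy score network (hypothetical, for illustration only):
net = torch.nn.Linear(2, 2)
loss = denoising_score_matching_loss(net, torch.randn(16, 2))
```

The point is that the score of the *corruption kernel* is available in closed form even though the score of the data distribution is not, which makes the objective trainable.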
3-B Classifier-free Guidance
  • Guided generation
  • Classifier guidance
  • Classifier-free guidance
[slides 3]
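The core of Lecture 3-B is a one-line combination rule. As a sketch (function and variable names are illustrative): classifier-free guidance blends an unconditional and a conditional prediction, with the guidance scale w controlling how strongly the conditioning is amplified:

```python
import torch

def cfg_guided_field(u_cond, u_uncond, w):
    """Classifier-free guidance combination:
    u_guided = (1 - w) * u_t(x) + w * u_t(x | y).
    w = 1 recovers the plain conditional model; w > 1 amplifies the
    conditioning signal by extrapolating away from the unconditional field."""
    return (1 - w) * u_uncond + w * u_cond

# At w = 1, guidance is a no-op and we get back the conditional prediction:
u_c, u_u = torch.tensor([1.0, 2.0]), torch.tensor([0.0, 0.0])
same = cfg_guided_field(u_c, u_u, w=1.0)
```

In practice both predictions come from a single network trained with the conditioning label randomly dropped, so the same model provides u_t(x) and u_t(x | y).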
4 Latent Spaces and Neural Network Architectures
  • Variational autoencoders and latent spaces
  • Diffusion Transformer and U-Nets
  • Case studies: Large-scale models
[slides 4]
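A small sketch of the latent-space machinery from Lecture 4: the reparameterization trick that lets gradients flow through a Gaussian VAE encoder, together with the closed-form KL term to the standard normal prior. The `encoder(x) -> (mu, log_var)` interface is an assumption for illustration, not a specific library API:

```python
import torch

def vae_latents_and_kl(encoder, x):
    """Reparameterization trick: sample z = mu + sigma * eps with eps ~ N(0, I),
    so the sampling step is differentiable in (mu, log_var). Also returns the
    closed-form KL divergence KL(N(mu, sigma^2) || N(0, I))."""
    mu, log_var = encoder(x)
    eps = torch.randn_like(mu)
    z = mu + torch.exp(0.5 * log_var) * eps  # differentiable latent sample
    kl = 0.5 * (mu ** 2 + log_var.exp() - 1.0 - log_var).sum(dim=-1).mean()
    return z, kl

# Toy encoder that outputs N(0, I) statistics, so the KL term is exactly zero:
enc = lambda x: (torch.zeros(4, 2), torch.zeros(4, 2))
z, kl = vae_latents_and_kl(enc, torch.randn(4, 8))
```

A latent diffusion model trains the flow or diffusion model inside this latent space rather than on raw pixels, which is what the lab builds toward.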
5 Discrete Diffusion Models
  • Continuous-time Markov chains (CTMCs)
  • Sampling from CTMC models
  • Training CTMC models
[slides 5]
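To preview Lecture 5, here is a minimal Euler-style sampling step for a continuous-time Markov chain on a finite state space. The rate matrix and function name are toy illustrations, not the course's notation:

```python
import numpy as np

def ctmc_euler_step(x, rate_matrix, dt, rng):
    """One Euler step of a CTMC with rate matrix R: the probability of
    jumping from state x to y != x in a small interval dt is approximately
    dt * R[x, y], and the remaining mass stays at x. Valid when
    dt * (max rate) is small."""
    probs = dt * rate_matrix[x]      # new array; diagonal entry is negative
    probs[x] = 0.0                   # discard the diagonal rate
    probs[x] = 1.0 - probs.sum()     # probability of staying in state x
    return rng.choice(len(probs), p=probs)

# Toy 2-state chain with symmetric jump rate 1.0 between the states:
R = np.array([[-1.0, 1.0],
              [1.0, -1.0]])
rng = np.random.default_rng(0)
x = 0
for _ in range(100):
    x = ctmc_euler_step(x, R, dt=0.01, rng=rng)
```

Discrete diffusion models replace the continuous SDE with such a CTMC, so sampling becomes a sequence of small jump decisions rather than Gaussian increments.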

Labs

Three labs accompany the class as exercises to give you hands-on practical experience. They will guide you step by step through building flow matching and diffusion models from scratch. To do the exercises, perform the following steps:
  1. Click on the lab link below to view the lab instructions.
  2. Download the .ipynb notebook from GitHub and open it in your favorite Jupyter notebook environment (one good choice is Google Colab).
  3. Follow the instructions in the lab to complete the exercises.
  4. Once all questions have been completed, export your notebook to a PDF and submit to Gradescope via Canvas. Please do not clear cell output, as this makes it harder to grade!

Lab 1: Working with ODEs and SDEs

Open in Colab

Lab 2: Flow Matching and Score Matching

Open in Colab

Lab 3: Diffusion Transformer and VAEs

Open in Colab

Stuck? Solutions can be found here.

Instructors

Lectures

Peter Holderrieth

PhD Student

Labs

Ron Shprints

MEng Student

Ezra Erives

D. E. Shaw Research

Prerequisites: Linear algebra, multivariate calculus, and basic probability theory. Students should be familiar with Python and have some experience with PyTorch.

Acknowledgements

We would like to thank:

Source code.

Licensed under CC BY-NC-SA.