
Deep Generative Model: A Statistical Perspective


Instructor Information

Instructor: Younggeun Kim
Email: kimyo145 at msu dot edu

Course Overview

This course, STT 997: Advanced Topics in Statistics (Spring 2025) at Michigan State University, provides a comprehensive understanding of generative models and the machine learning methods used to learn and synthesize complex, large-scale data. It aims to build the ability to implement these models across a variety of applications, including temporal, multi-modal, and medical data. Topics covered include latent variable models, statistical distances, and model classes that approximate data distributions.

Objectives

  • Understand and implement generative models.
  • Explore the statistical principles and transitions in the generative modeling literature.
  • Gain hands-on experience with popular algorithms in Python and PyTorch (see the warm-up sketch after this list).
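
As a small taste of the hands-on component, the sketch below is a hypothetical PyTorch warm-up (not taken from the course materials): it fits a single slope parameter to synthetic data using autograd and SGD.

```python
import torch

# Hypothetical warm-up: fit y = w * x by minimizing squared error with autograd.
x = torch.linspace(-1.0, 1.0, 100)
y = 3.0 * x                              # synthetic data with true slope 3
w = torch.zeros(1, requires_grad=True)   # the single learnable parameter
optimizer = torch.optim.SGD([w], lr=0.1)

for _ in range(200):
    optimizer.zero_grad()
    loss = torch.mean((w * x - y) ** 2)  # mean squared error
    loss.backward()                      # autograd computes d(loss)/dw
    optimizer.step()

print(w.item())  # converges toward 3.0
```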

Lecture Topics

| Topic | Course Material | Key Reference |
|---|---|---|
| 1. Introduction | [Lecture Note] | |
| 2. Preliminary Knowledge I: Statistics | [Lecture Note] | [1, 2] |
| 3. Preliminary Knowledge II: Statistical Learning | [Lecture Note] | [3] |
| 4. Preliminary Knowledge III: Python and PyTorch | [Lecture Note] [Code] | |
| 5. Linear Method and Auto-regressive Model | [Lecture Note] | [4, 5] |
| 6. Energy-based Model | [Lecture Note] | [6] |
| 7. Variational Autoencoders | [Lecture Note] | [7] |
| 8. Generative Adversarial Networks | [Lecture Note] | [8] |
| 9. PyTorch Implementation | [Lecture Note] [Code] | [9] |
| 10. Optimal Transport-based Method | [Lecture Note] | [10, 11] |
| 11. Score-based Method | [Lecture Note] | [12, 13] |
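
To give a concrete flavor of topic 7 (Variational Autoencoders), here is a minimal sketch of the negative ELBO for a Gaussian-latent, Bernoulli-observation VAE. It is an illustration only, not the course's reference implementation; the `TinyVAE` module and its dimensions are made up for this example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyVAE(nn.Module):
    """Minimal VAE sketch: Gaussian encoder q(z|x), Bernoulli decoder p(x|z)."""
    def __init__(self, x_dim: int = 784, z_dim: int = 8):
        super().__init__()
        self.enc = nn.Linear(x_dim, 2 * z_dim)  # outputs [mu, log-variance]
        self.dec = nn.Linear(z_dim, x_dim)      # outputs Bernoulli logits

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        mu, logvar = self.enc(x).chunk(2, dim=-1)
        # Reparameterization trick: keeps z differentiable in (mu, logvar).
        z = mu + (0.5 * logvar).exp() * torch.randn_like(mu)
        recon = F.binary_cross_entropy_with_logits(
            self.dec(z), x, reduction="none").sum(dim=-1)   # -log p(x|z)
        kl = 0.5 * (logvar.exp() + mu.pow(2) - 1.0 - logvar).sum(dim=-1)  # KL(q(z|x) || N(0, I))
        return (recon + kl).mean()  # negative ELBO, averaged over the batch

x = torch.rand(16, 784)   # stand-in batch, e.g. flattened 28x28 images in [0, 1]
loss = TinyVAE()(x)
loss.backward()           # the whole objective trains end to end
```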

References

[1] Bickel, P. J. and Doksum, K. A. (2015). Mathematical statistics: basic ideas and selected topics, volumes I-II package. Chapman and Hall/CRC.
[2] Durrett, R. (2019). Probability: theory and examples, volume 49. Cambridge University Press.
[3] Hastie, T., Tibshirani, R., and Friedman, J. (2009). The elements of statistical learning: data mining, inference, and prediction. Springer.
[4] Hyvärinen, A. and Oja, E. (2000). Independent component analysis: algorithms and applications. Neural Networks, 13(4-5):411–430.
[5] Uria, B., Côté, M.-A., Gregor, K., Murray, I., and Larochelle, H. (2016). Neural autoregressive distribution estimation. Journal of Machine Learning Research, 17(205):1–37.
[6] Hinton, G. E. (2002). Training products of experts by minimizing contrastive divergence. Neural Computation, 14(8):1771–1800.
[7] Kingma, D. P. and Welling, M. (2019). An introduction to variational autoencoders. Foundations and Trends® in Machine Learning, 12(4):307–392.
[8] Goodfellow, I. (2016). NIPS 2016 tutorial: Generative adversarial networks. arXiv preprint arXiv:1701.00160.
[9] PyTorch tutorials: Learn the Basics. https://pytorch.org/tutorials/beginner/basics/intro.html
[10] Santambrogio, F. (2015). Optimal transport for applied mathematicians. Birkhäuser.
[11] Arjovsky, M., Chintala, S., and Bottou, L. (2017). Wasserstein generative adversarial networks. In International Conference on Machine Learning, pages 214–223. PMLR.
[12] Hyvärinen, A. (2005). Estimation of non-normalized statistical models by score matching. Journal of Machine Learning Research, 6:695–709.
[13] Ho, J., Jain, A., and Abbeel, P. (2020). Denoising diffusion probabilistic models. Advances in Neural Information Processing Systems, 33:6840–6851.
