The aim of this course is to explore advanced techniques in probabilistic graphical models (PGMs) and, more broadly, statistical machine learning (ML). Students will develop the ability to apply these techniques to their own research and will learn to perform statistical inference and reasoning in complex probabilistic models. The course will survey state-of-the-art ML research, including variational inference, Bayesian deep learning, representation learning, and uncertainty quantification. Upon conclusion of this course, students will be capable of developing new methods and advancing the state of the art in ML and PGM research.
D2L: https://d2l.arizona.edu/d2l/home/1205997
Piazza: https://piazza.com/arizona/fall2022/csc696h1
Instructor: Jason Pacheco, GS 724, Email: pachecoj@cs.arizona.edu
Office Hours: Mondays 3:30-4:30pm; Fridays (Zoom) 3:30-4:30pm
Instructor Homepage: http://www.pachecoj.com
Date | Topic | Readings | Presenter / Slides |
---|---|---|---|
1/10 | Introduction + Course Overview | | (slides) |
1/15 | Martin Luther King Jr. Day: No Classes | | |
1/17 | Probability and Statistics: Probability Theory | PRML, Sec. 1.2.1-1.2.4 | (slides) |
1/22 | Probability and Statistics: Bayesian Statistics | Why Isn't Everyone a Bayesian?, Efron, B., 1986; Objections to Bayesian Statistics, Gelman, A., 2008 | (slides) |
1/24 | Probability and Statistics: Bayesian Statistics (Cont'd) | | |
1/29 | Inference: Monte Carlo Methods | Introduction to Monte Carlo Methods, MacKay, D. J. C., in Learning in Graphical Models, Springer, 1998 | (slides) |
1/31 | Inference: Monte Carlo Methods (Cont'd) | | |
2/5 | Inference: Variational Inference | Variational Inference: A Review for Statisticians, Blei, D. et al., J. Am. Stat. Assoc., 2017; Optional: PRML, Sec. 10.1-10.4 | (slides) |
2/7 | Inference: Approximate Bayesian Computation | Approximate Bayesian Computation (ABC), Sunnaker, M. et al., PLoS Computational Biology, 2013 | James (slides) |
2/12 | Inference: Bayesian Conditional Density Estimation | Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation, Papamakarios, G. and Murray, I., NeurIPS, 2016 | Varun (slides) |
2/14 | Bayesian Deep Learning: Introduction | Weight Uncertainty in Neural Networks, Blundell, C. et al., ICML, 2015 | Cameron (slides) |
2/19 | Bayesian Deep Learning: Monte Carlo Dropout | Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning, Gal, Y. and Ghahramani, Z., ICML, 2016 | Miki (slides) |
2/21 | Bayesian Deep Learning: Variational Dropout | Variational Dropout and the Local Reparameterization Trick, Kingma, D. P. et al., NeurIPS, 2015 | Brenda (slides) |
2/26 | Bayesian Deep Learning: Information Bottleneck | Deep Variational Information Bottleneck, Alemi, A. A. et al., ICLR, 2017 | Projects Info; Kayla (slides) |
2/28 | Bayesian Deep Learning: Representation Learning | InfoGAN: Interpretable Representation Learning by Information Maximizing Generative Adversarial Nets, Chen, X. et al., NeurIPS, 2016 | Daniel (slides) |
3/4 | Spring Recess: No Classes | | |
3/6 | Spring Recess: No Classes | | |
3/11 | Bayesian Deep Learning: Representation Learning | Information Dropout: Learning Optimal Representations Through Noisy Computation, Achille, A. and Soatto, S., PAMI, 2018 | Thang (slides) |
3/13 | Generative Models: Variational Autoencoder | Auto-Encoding Variational Bayes, Kingma, D. P. and Welling, M., ArXiv, 2014; Optional reference: Kingma, D. P. and Welling, M., ArXiv, 2019 | Natnael (slides) |
3/18 | Generative Models: Diffusion Probabilistic Models | Denoising Diffusion Probabilistic Models, Ho et al., NeurIPS, 2020 | Varun (slides) |
3/20 | Generative Models: Diffusion Implicit Models | Denoising Diffusion Implicit Models, Song et al., ICLR, 2021 | Kayla (slides) |
3/25 | Generative Models: Score-Based Generative Modeling | Score-Based Generative Modeling Through Stochastic Differential Equations, Song et al., ICLR, 2021 | Natnael (slides) |
3/27 | Generative Models: Energy-Based Models | Implicit Generation and Modeling with Energy-Based Models, Du and Mordatch, NeurIPS, 2019 | Brenda (slides) |
4/1 | Generative Models: Energy-Based Models | How to Train Your Energy-Based Models, Song and Kingma, ArXiv, 2021 | Daniel (slides) |
4/3 | Uncertainty Quantification: Variational BOED | Variational Bayesian Optimal Experimental Design, Foster et al., NeurIPS, 2019 | Miki (slides) |
4/8 | Uncertainty Quantification: Variational MI Bounds | On Variational Bounds of Mutual Information, Poole et al., ICML, 2019 | Alonso (slides) |
4/10 | Uncertainty Quantification: MINE | Mutual Information Neural Estimation, Belghazi et al., ICML, 2018 | Cameron (slides) |
4/15 | Uncertainty Quantification: DAD | Deep Adaptive Design: Amortizing Sequential BOED, Foster et al., ICML, 2021 | James (slides) |
4/17 | Uncertainty Quantification: Contrastive Predictive Coding | Representation Learning with Contrastive Predictive Coding, van den Oord et al., ArXiv, 2018 | Thang (slides) |
4/22 | Uncertainty Quantification: Bayesian Experimental Design | Modern Bayesian Experimental Design, Rainforth et al., Statistical Science, 2024 | Alonso (slides) |
4/24 | Project Presentations | | Alonso, Daniel, Thang |
4/29 | Project Presentations | | James, Cameron, Nate |
5/1 | Project Presentations | | Kayla, Brenda, Varun |