Probabilistic graphical modeling and inference is a powerful modern approach to representing the joint statistics of data and models, reasoning about the world in the face of uncertainty, and learning from data. It cleanly separates the notions of representation, reasoning, and learning, and it provides a principled framework for combining multiple sources of information, such as prior knowledge about the world, with evidence about a particular case in observed data. This course provides a solid introduction to the methodology and its associated techniques, and shows how they are applied in diverse domains ranging from computer vision to computational biology to computational neuroscience.
The following textbook will be used for reading assignments. An electronic copy is available via the UA library webpage (NetID login required):
D2L: https://d2l.arizona.edu/d2l/home/1135240
Piazza: https://piazza.com/arizona/spring2022/csc535
Instructor: Jason Pacheco, GS 724
Email: pachecoj@cs.arizona.edu
Office Hours: Fridays, 3-5pm (Zoom via D2L Calendar)
Instructor Homepage: http://www.pachecoj.com
Date | Topic | Readings | Assignment |
---|---|---|---|
1/12 | Introduction + Course Overview (slides) | W3Schools: Numpy Tutorial; YouTube: Numpy Tutorial (Mr. P Solver) | |
1/17 | No Class: MLK Day | ||
1/19 | Probability Primer (Fundamentals and Discrete Probability) (slides) | CH 2.1 - 2.4 | HW1 (Due: 1/26) |
1/24 | Probability Primer (Continued) (slides) | CH 2.1 - 2.4 | |
1/26 | Probability Primer (Continuous Probability) (slides) | CH 2.4 - 2.7 | HW2 (Due: 2/2) |
1/31 | Probability Primer (Bayesian Probability, Inference) (slides) | CH 3; YouTube: 3Blue1Brown: Bayes Rule | |
2/2 | Directed Probabilistic Graphical Models (slides) | CH 10.1 - 10.5 | |
2/7 | Undirected Probabilistic Graphical Models (slides) | CH 19.1 - 19.4 | HW3 (Due: Wed, 2/16) |
2/9 | Undirected Probabilistic Graphical Models (Cont'd) (slides) | ||
2/14 | Message Passing Inference (Sum-Product Belief Propagation) (slides) | CH 20; Kschischang et al., "Factor Graphs and the Sum-Product Algorithm" | |
2/16 | Message Passing (Loopy BP, Max-Product BP) (slides) | CH 20; Kschischang et al., "Factor Graphs and the Sum-Product Algorithm" | HW4 (Due: Tuesday, 3/1); Example factor-to-variable message (Jupyter Notebook) |
2/21 | Message Passing (Variable Elimination) (slides) | CH 20 | |
2/23 | Message Passing (Junction Tree) (slides) | CH 20 | |
2/28 | Parameter Learning / Expectation Maximization (slides) | CH 11 | |
3/2 | Midterm Review (slides) | | Midterm (Due: 3/4) |
3/7 | No Class: Spring Recess | ||
3/9 | No Class: Spring Recess | ||
3/14 | Expectation Maximization (Continued) (slides) | CH 11 | HW5 (Due: 3/23) |
3/16 | Dynamical Systems (HMM, Baum-Welch Learning) (slides) | CH 17 | |
3/21 | Dynamical Systems (Linear Dynamical Systems, Kalman Filter) (slides) | CH 18 | |
3/23 | Dynamical Systems (Nonlinear and Switching State-Space) (slides) | CH 18 | HW6 (Due: 4/6) |
3/28 | Monte Carlo Methods (Rejection Sampling, Importance Sampling) (slides) | Sec. 23.1-23.4 | |
3/30 | Monte Carlo Methods (Rejection Sampling, Importance Sampling) (slides) | Sec. 23.1-23.4 | |
4/4 | Monte Carlo Methods (Sequential Monte Carlo) (slides) | Sec. 23.5 | |
4/6 | Monte Carlo Methods (Sequential Monte Carlo) (slides) | Sec. 23.5 | HW7 (Due: 4/20) |
4/11 | Markov Chain Monte Carlo (Metropolis-Hastings) (slides) | Sec. 24.1-24.4; Neal, "Probabilistic Inference Using MCMC," 1993; Andrieu et al., "An Intro. to MCMC for ML," 2003 | |
4/13 | Markov Chain Monte Carlo (Metropolis-Hastings) (slides) | Sec. 24.1-24.4 | |
4/18 | Markov Chain Monte Carlo (Gibbs Sampling) (slides) | ||
4/20 | Exponential Family (slides) | Sec. 9.1-9.2 | HW8 (Due: 5/4) |
4/25 | Exponential Family (slides) | Sec. 9.1-9.2 | |
4/27 | Variational Inference (Mean Field) (slides) | Sec. 21.1-21.7 | |
5/2 | Variational Inference (Mean Field) (slides) | Sec. 22.1-22.5 | |
5/4 | Course Wrap-up (slides) | | |
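The probability primer (1/19 - 1/31) builds up to Bayes' rule. As an illustrative warm-up, here is a minimal NumPy sketch of Bayes' rule on a hypothetical binary disease/test example; the numbers are made up for illustration and are not course material.

```python
import numpy as np

# Bayes' rule on a hypothetical binary disease/test example.
# Prior: P(D=1) = 0.01. Likelihoods: P(T=1|D=1) = 0.95, P(T=1|D=0) = 0.05.
prior = np.array([0.99, 0.01])      # [P(D=0), P(D=1)]
like_pos = np.array([0.05, 0.95])   # [P(T=1|D=0), P(T=1|D=1)]

joint = prior * like_pos            # P(D, T=1), elementwise product
evidence = joint.sum()              # P(T=1), marginalizing out D
posterior = joint / evidence        # P(D | T=1), by Bayes' rule
print(posterior)                    # posterior over D given a positive test
```

Even with a 95%-sensitive test, the posterior probability of disease stays modest because the prior is small; this base-rate effect is a standard first exercise with Bayes' rule.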
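The 2/16 assignment references an example factor-to-variable message. For orientation, here is a minimal NumPy sketch of a sum-product factor-to-variable message for a pairwise factor f(x, y); the potential and incoming message values are illustrative assumptions and may differ from the course's notebook example.

```python
import numpy as np

# Sum-product factor-to-variable message for a pairwise factor f(x, y):
#   m_{f->x}(x) = sum_y f(x, y) * m_{y->f}(y)
f = np.array([[0.9, 0.1],
              [0.2, 0.8]])          # f[x, y]: 2x2 pairwise potential (illustrative)
m_y_to_f = np.array([0.6, 0.4])     # incoming variable-to-factor message (illustrative)

m_f_to_x = f @ m_y_to_f             # weighted sum over y for each value of x
m_f_to_x /= m_f_to_x.sum()          # normalize (optional; improves numerical stability)
print(m_f_to_x)
```

The matrix-vector product implements the sum over y exactly; normalization does not change the beliefs computed from message products, so it is a common stability convention in belief propagation implementations.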