Students will learn why machine learning is a fundamentally different way of writing computer programs, and why it is often a uniquely attractive way of solving practical problems. Machine learning is concerned with automatic methods for finding patterns in data; students will study both the advantages and the distinctive risks of this approach. They will learn the fundamental computational methods, algorithms, and perspectives that underlie current machine learning techniques, and how to derive and implement many of them. Students will also learn the fundamentals of unsupervised and supervised learning, the computational and quality trade-offs among different methods, and how to adapt existing methods to their own research needs.
We will use the following textbook, which is freely available online:
Daumé, Hal. "A Course in Machine Learning." 2017.
The full textbook can also be downloaded as a PDF if you prefer that format. All reading assignments will be posted in the schedule below.
Instructor: Jason Pacheco, GS 724, Email: pachecoj@cs.arizona.edu
Jason's Office Hours: Fridays @ 3:00-5:00pm (15:00-17:00) - via Zoom
Yinan's Office Hours: Fridays @ 10:30am-12:30pm (10:30-12:30) - via Zoom
D2L 580: https://d2l.arizona.edu/d2l/home/1355965
D2L 480: https://d2l.arizona.edu/d2l/home/1355962
Piazza: https://piazza.com/arizona/fall2023/csc480580/home
Instructor Homepage: http://www.pachecoj.com
Date | Topic | Readings | Assignment |
---|---|---|---|
8/22 | Introduction + Course Overview (slides) | W3Schools: Numpy Tutorial; YouTube: Numpy Tutorial (Mr. P Solver) | HW0: Calibration (Due: 8/29 @ noon) |
8/24 | Basics - Decision Trees, Learning Algorithms (slides) | CH 1 - Decision Trees | |
8/29 | Limits - Optimal Bayes Rate Classifier, Overfitting / Underfitting (slides) | CH 2 - Limits of Learning | |
8/31 | Geometry - Nearest Neighbor Classifiers, K-Means Clustering (slides) | CH 3 - Geometry and Nearest Neighbors | |
9/5 | The Perceptron Algorithm (slides) | CH 4 - The Perceptron (Prof. Surdeanu slides) | HW1 (Due: 9/15) |
9/7 | Practical Issues - Performance measures, overfitting / underfitting, Cross-Validation (slides) | CH 5 - Practical Issues | |
9/12 | Practical Issues (continued) - Prediction Confidence, Statistical Tests, Bootstrap (slides) | CH 5 - Practical Issues | |
9/14 | Bias-Variance Decomposition and Friends (slides) | ||
9/19 | Linear Models: Linear Regression (slides) | CH 7 - Linear Models | |
9/21 | Linear Models: Logistic Regression (slides) | ||
9/26 | Nonlinear Models: Basis Functions, Kernels, SVM (slides) | ||
9/28 | Probability, Naive Bayes, Graphical Models (slides) | CH 9 - Probabilistic Modeling | HW2 (Due: 10/8) |
10/3 | Probability, Naive Bayes, Graphical Models (continued) (slides) | ||
10/5 | Probability, Naive Bayes, Graphical Models (continued) (slides) | ||
10/10 | Midterm Review (slides) | ||
10/12 | Midterm Exam | | |
10/17 | Bias and Fairness (slides) | CH 8 - Bias and Fairness | |
10/19 | Unsupervised Learning (slides) | CH 15 - Unsupervised Learning | |
10/24 | Unsupervised Learning (Cont'd) (slides) | ||
10/26 | Gaussian Mixture Models and Expectation Maximization (slides) | ||
10/31 | Neural Networks and Backpropagation (slides) | CH 10 - Neural Networks | |
11/2 | Neural Networks and Backpropagation (continued) (slides) | ||
11/7 | Ensemble Methods (slides) | CH 13 - Ensemble Methods (Note: we skipped CH 12!) | |
11/9 | Reinforcement Learning (slides) | ||
11/14 | Reinforcement Learning (Cont'd) (slides) | ||
11/16 | Reinforcement Learning (Cont'd) (slides) | ||
11/21 | Data Visualization and Summarization (slides) | ||
11/23 | Thanksgiving Break - No Class | ||
11/28 | Data Visualization and Summarization (slides) | ||
11/30 | Final Exam Review (slides) | ||
12/5 | Course Wrap-up (slides) |