| Date | Topics | Lecture Slides/Notes | Reading |
|------|--------|----------------------|---------|
| 9/8 | Introduction | Lecture 1: slides | (Mitchell) Chapter 1; (Murphy) Chapter 1 (optional) |
| 9/12 | Decision Trees | Lecture 2: slides | (Mitchell) Chapter 3 |
| 9/15 | Random Forests | Lecture 3: slides | Random Forests chapter of ESL (optional) - reading guide |
| 9/19 | Evaluation and Model Selection | Lecture 4: slides | (Mitchell) Chapter 5 |
| 9/22 | Boosting | Lecture 5: slides | Boosting book - Chapter 1 (optional) |
| 9/26 | Neural Networks I: Intro, Linear separators | Lectures 6–8: slides | (Mitchell) Chapter 4; (Murphy) Chapter 13 (optional) - reading guide |
| 9/29 | Neural Networks II: Perceptron, Gradient descent | | |
| 10/3 | Neural Networks III: Gradient descent, SGD, Sigmoid units, Multi-layer networks, Backprop, Reducing overfitting | | |
| 10/6 | SVMs: Large margin separation, Soft-margin SVM, Learning with kernels | Lectures 9–10: slides | SVM tutorial - reading guide; Andrew Ng's SVM lecture notes (optional) |
| 10/10 | Midterm | | |
| 10/13 | Soft-margin SVM, Learning with kernels | | |
| 10/17 | Probability Review, Maximum Likelihood Estimation | | Estimating Probabilities: MLE and MAP; (Murphy) Chapter 4 (optional) - reading guide |
| 10/20 | MAP Estimation (including MDL) | Lecture 12: slides | Generative and Discriminative Classifiers: Naive Bayes and Logistic Regression; (Murphy) Chapters 9 and 10 (optional) - reading guide; (Mitchell) Section 6.6: MDL Principle |
| 10/24 | Naive Bayes | Lecture 13: slides | |
| 10/27 | Naive Bayes continued, Logistic Regression; Learning Theory | Lectures 14–15: slides | (Mitchell) Chapter 7 (up to and including Section 7.4.3) |
| 10/31 | Learning Theory: PAC Learning, Agnostic Learning, VC dimension | | |
| 11/3 | Clustering I: K-means problem and Hierarchical clustering | Lecture 16: slides | |
| 11/7 | Clustering II: K-means problem and Hierarchical clustering; Instance-based Learning I: k-NN and recommender systems | Lecture 17: slides | (Mitchell) Chapter 8 (Sections 8.1 and 8.2) |
| 11/10 | Instance-based Learning II: k-NN and recommender systems | | |
| | Reading Break | | |
| 11/17 | Gaussian mixture models and EM | Lecture 19: slides/notes (from Spring 2022); Jupyter notebook for EM | (Murphy, 2012) Chapter 11 - reading guide |
| 11/21 | Dimension Reduction/Feature Transformation: PCA I | Lecture 20: slides; Jupyter notebook for Eigenfaces | Jonathon Shlens's PCA tutorial (Sections I through V) |
| 11/24 | Dimension Reduction/Feature Transformation: PCA II; Fairness and Machine Learning | Lecture 21: slides | |
| 11/28 | Project Presentations (in class) | | |
| 12/1 | Project Presentations (in class) | | |