
Data Mining

CSC 503/SENG 474, Fall 2023

Lectures: Tuesdays and Fridays 2:30pm - 3:50pm, BWC A104
Instructor: Nishant Mehta
TAs: Yifeng Bie (<first name><last name> [at] uvic.ca),
        Yibo Liu (<last name><first name>97 [at] outlook.com),
        Andrea Nguyen (trang<first letter of last name> [at] uvic.ca)

Labs: Mondays in ECS 242

Nishant's office hours: Tuesdays and Fridays, 4pm - 5pm

Textbooks:
(Mitchell) Tom Mitchell, Machine Learning, McGraw-Hill, 1997
(Murphy) Kevin P. Murphy, Probabilistic Machine Learning: An Introduction, MIT Press, 2022
(Murphy, 2012) Kevin P. Murphy, Machine Learning: A Probabilistic Perspective, MIT Press, 2012

           Information about the Project


What this course is about
This course is an introduction to Data Mining/Machine Learning, a subfield of artificial intelligence that studies how algorithms can use experience to improve their performance on tasks. It will introduce you to many foundational machine learning methods and give you both a theoretical grounding and ample practical experience in implementing and using these methods on real data.
The objective of this course is to give students a foundation in machine learning, covering core problems such as classification, regression, clustering, and dimension reduction. The emphasis will be on understanding the design of various machine learning methods, learning how to use them in practice, and learning principled ways to evaluate their performance. The (optional) labs will complement the lecture topics by offering practical experience in experimenting with machine learning methods. The assignments will revolve around implementing machine learning algorithms and analyzing their results on data, with most of the emphasis on the analysis. Assignments might also involve a theoretical component (especially for graduate students).
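
For a concrete flavor of that workflow, here is a minimal sketch of training and evaluating a classifier in a principled way. It is purely illustrative: it assumes scikit-learn and one of its built-in benchmark datasets, neither of which this page prescribes for the assignments.

    # Minimal sketch: train a decision tree and evaluate it in a principled way.
    # Assumes scikit-learn is installed; any comparable library would do.
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import train_test_split, cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    # Hold out a test set up front; it is used only for the final evaluation.
    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0)

    # Fit a depth-limited decision tree on the training data.
    clf = DecisionTreeClassifier(max_depth=3, random_state=0)
    clf.fit(X_train, y_train)

    # Model selection uses the training data only, via 5-fold cross-validation.
    print("CV accuracy:   %.3f" % cross_val_score(clf, X_train, y_train, cv=5).mean())

    # The untouched test set then estimates generalization performance.
    print("Test accuracy: %.3f" % clf.score(X_test, y_test))

The same pattern of keeping model selection separate from the final held-out evaluation carries over to the other methods in the schedule below.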

In the schedule below, any information about future lectures is just a rough guide and might change.

Readings are required unless marked as optional; optional readings are often more advanced. The lectures supplement the readings, and to do well in this course (and to learn machine learning) you should do the readings and attend the lectures. You are always welcome to ask the instructor questions about the reading material, either on the discussion forum (Ed Discussion; see Brightspace for the signup link), at office hours, or by email if needed.

Lectures
Date | Topics | Lecture Slides/Notes | Reading
9/8 | Introduction | Lecture 1: slides | (Mitchell) Chapter 1; (Murphy) Chapter 1 (optional)
9/12 | Decision Trees | Lecture 2: slides | (Mitchell) Chapter 3
9/15 | Random Forests | Lecture 3: slides | Random Forests chapter of ESL (optional) - reading guide
9/19 | Evaluation and Model Selection | Lecture 4: slides | (Mitchell) Chapter 5
9/22 | Boosting | Lecture 5: slides | Boosting book - Chapter 1 (optional)
9/26 | Neural Networks I: Intro, Linear separators | Lectures 6–8: slides | (Mitchell) Chapter 4; (Murphy) Chapter 13 (optional) - reading guide
9/29 | Neural Networks II: Perceptron, Gradient descent | |
10/3 | Neural Networks III: Gradient descent, SGD, Sigmoid units, Multi-layer networks, Backprop, Reducing overfitting | |
10/6 | SVMs: Large margin separation, Soft-margin SVM, Learning with kernels | Lectures 9–10: slides | SVM tutorial - reading guide; Andrew Ng's SVM lecture notes (optional)
10/10 | Midterm | |
10/13 | Soft-margin SVM, Learning with kernels | |
10/17 | Probability Review, Maximum Likelihood Estimation | | Estimating Probabilities: MLE and MAP; (Murphy) Chapter 4 (optional) - reading guide
10/20 | MAP Estimation (including MDL) | Lecture 12: slides | Generative and Discriminative Classifiers: Naive Bayes and Logistic Regression; (Murphy) Chapters 9 and 10 (optional) - reading guide; (Mitchell) Section 6.6: MDL Principle
10/24 | Naive Bayes | Lecture 13: slides |
10/27 | Naive Bayes continued, Logistic Regression; Learning Theory | Lectures 14–15: slides | (Mitchell) Chapter 7 (up to and including Section 7.4.3)
10/31 | Learning Theory: PAC Learning, Agnostic Learning, VC dimension | |
11/3 | Clustering I: K-means problem and Hierarchical clustering | Lecture 16: slides |
11/7 | Clustering II: K-means problem and Hierarchical clustering; Instance-based Learning I: k-NN and recommender systems | Lecture 17: slides | (Mitchell) Chapter 8 (Sections 8.1 and 8.2)
11/10 | Instance-based Learning II: k-NN and recommender systems | |
Reading Break
11/17 | Gaussian mixture models and EM | Lecture 19: slides/notes (from Spring 2022); Jupyter notebook for EM | (Murphy, 2012) Chapter 11 - reading guide
11/21 | Dimension Reduction/Feature Transformation: PCA I | Lecture 20: slides; Jupyter notebook for Eigenfaces | Jonathon Shlens's PCA tutorial (Sections I through V)
11/24 | Dimension Reduction/Feature Transformation: PCA II; Fairness and Machine Learning | Lecture 21: slides |
11/28 | Project Presentations (in class) | |
12/1 | Project Presentations (in class) | |