Time & Location: Lecture: MW 4:30–5:45 pm, in person at 1229 EB2 with real-time online delivery via Zoom. Discussion section (optional): Fri 1:00–1:50 pm, delivered via Zoom.
Instructor: Dr. Chau-Wai Wong, chauwai.wong [ ] ncsu [ ] edu
Office hour: Tu 4:30–5:30 pm on Zoom
Teaching Assistant and Co-Instructor: Ms. Jisoo Choi, jchoi23 [ ] ncsu [ ] edu
Office hour: Fri 1:50–2:50 pm on Zoom
Course Description: Learning from experience is one of the hallmarks of intelligence. Machine learning is the study of computer algorithms that improve automatically through experience. A subfield of artificial intelligence (AI), machine learning has achieved remarkable progress over the past decade, especially in deep learning. This course introduces fundamental concepts and algorithms that are vital for understanding both the state of the art and cutting-edge developments toward the next wave of AI. The course also exposes students to real-world applications via well-guided homework programming problems, as well as group projects. Topics include, but are not limited to, regression, classification, support vector machines, boosting, cross-validation, and deep neural networks.
Prerequisites: ST 300-level or above, and ECE 301/CSC 316. Talk to the instructor if not in ECE/CSC.
Course Structure: The course consists of two 75-minute lectures and one optional 50-minute discussion section per week. A teaching assistant will lead the discussion section, covering practice problems and answering students' questions. There will be weekly homework assignments (30%) containing both written problems and programming problems, two midterm exams (20% each), and one term project (30%). Programming will be in Python, R, or Matlab. Students are expected to be able to write computer programs and to have mathematical maturity in probability theory (e.g., have taken ST 371/370) before taking the course. A linear algebra course such as MA 305/405 is recommended while taking the course.
Course Forum: ECE492-45 on Piazza. (The Zoom link can be found on Piazza; there is only one Zoom link for all activities of this course.)
Homework Submission: Gradescope
Textbooks: see the abbreviated references in the Readings column of the schedule below.
Topics: Linear statistical models, Bayesian classifiers, neural networks (NN), support vector machines (SVM), classification/decision trees, clustering, principal component analysis (PCA), naive Bayes, topic models, hidden Markov models (HMM).
Class # | Date | Topic | Lecture Notes | Readings | HW Assignment
1 | 8/16 | Introduction | Video: Can We Build a Brain? | ISLR Ch1–2; ML Supp |
2 | 8/18 | Machine learning overview | Slide deck 1 | ISLR Ch1–2 | HW1 (due 8/25 on Gradescope)
3 | 8/23 | Linear regression; Matrix-vector form | | Scheffe Ch1 |
4 | 8/25 | Least squares; Linear algebra | | Scheffe App 1 | HW2 (due 9/8)
Deep Learning
5 | 8/30 | Geometric interpretation; Modern ML applications (CNN) | Slide deck 2 | |
6 | 9/1 | Modern ML applications (CNN) | | DL Ch6, Ch9 | HW3 (due 9/15)
| 9/6 | Labor Day (no class) | | |
7 | 9/8 | Modern ML applications (LSTM, BERT) | | DL Ch10; Attention by Futrzynski |
8 | 9/13 | Neural network training: Backpropagation | | DL Ch8, Ch11 |
Linear Statistical Models: Regression
9 | 9/15 | Regression function, Conditional expectation | HT Ch2 | ISLR Ch2; Devore 3.3, 4.2, 2.4; Leon 3.2, 5.7 | HW4 (due 9/27)
10 | 9/20 | Conditional expectation (cont'd), Probability theory review | HT Ch2 | ISLR Ch2; Devore Ch6 | HW5 (due 10/4)
11 | 9/22 | Curse of dimensionality, Model accuracy, Bias–variance trade-off | HT Ch2 | ISLR Ch2 |
12 | 9/27 | Confidence interval, Hypothesis test | HT Ch3 | ISLR 3.1; Devore 7.1, 8.1, 8.3 | HW6 (due 10/11)
13 | 9/29 | Hypothesis test (cont'd), Multiple regression | HT Ch3 | ISLR 3.2; ESL 3.2 |
| 10/4 | Fall Break (no class) | | |
14 | 10/6 | F-statistic, Qualitative predictors, Interaction | HT Ch3 | ISLR 3.3.1–2 | No HW due on 10/20
Classification
15 | 10/11 | Logistic regression | HT Ch4 | ISLR 4.1–3; ESL 4.4 |
16 | 10/13 | MLE, Invariance principle | | Devore 6.2 | HW7 (due 10/27); Project proposal due 10/25
17 | 10/18 | Link function for GLM; Linear discriminant analysis | HT Ch4 | McCulloch 5.1–4; ISLR 4.4; Leon 6.3.1, 6.4 |
18 | 10/20 | Exam 1 | | | Work on project; interim report due 11/3
19 | 10/25 | LDA (cont'd), Error types, ROC, AUC, EER | HT Ch4 | ISLR 4.4.3; Murphy 5.7.2.1 |
20 | 10/27 | Naive Bayes; Logistic regression vs. LDA | | ESL 6.6.3; Murphy 3.5; ISLR 4.5; Devore 6.2 | HW8 (due 11/10)
Other Topics
21 | 11/1 | Cross-validation | HT Ch5 | ISLR 5.1 |
22 | 11/3 | Cross-validation (cont'd); Bootstrap | HT Ch5 | ISLR 5.1, 5.2 | HW9 (due 11/17)
23 | 11/8 | Bootstrap (cont'd); Regularization | HT Ch5, Ch6 | ISLR 5.2, 6.2 |
24 | 11/10 | Regularization (cont'd) | HT Ch9 | ISLR 6.2 | Work on project; prepare for in-class presentation
25 | 11/15 | Support vector machine (SVM) | HT Ch9 | ISLR Ch9 |
26 | 11/17 | Unsupervised learning: PCA, K-means, HMM, Topic model | HT Ch10 | ISLR 10.2, 10.3; Topic model 1, 2 |
27 | 11/22 | Exam 2 | | |
| 11/24 | Thanksgiving Holiday (no class) | | |
28 | 11/29 | In-class project presentation | | | Report submission guide (due 12/8); IEEE template