Weekly schedule, lecture notes, R code and extra readings.
Week 1: Introduction and Preliminaries Jan 17 – Jan 21
- Our first week of lecture will be online. Please use this [Zoom link, pw: 542].
- Please read the syllabus, exam schedule, and homework requirements.
- Complete [HW1], and review the preliminaries if you find it challenging.
- Lecture Note
- R Lab
- Zoom Recording
Week 2: KNN and the Bias-Variance Trade-off Jan 24 – Jan 28
- We meet in person starting in Week 2.
- Lecture Note
- R Lab
- Additional Reading
- KNN chapter from SMLR, with additional R examples
- Ch 2.3, 2.9 in HTF
- Friedman, J. H. (1997). On bias, variance, 0/1-loss, and the curse-of-dimensionality. Data Mining and Knowledge Discovery, 1(1), 55-77.
- Devroye, L., Gyorfi, L., Krzyzak, A., & Lugosi, G. (1994). On
the strong universal consistency of nearest neighbor regression function
estimates. The Annals of Statistics, 1371-1385.
- Belkin, M., Hsu, D., Ma, S., & Mandal, S. (2019). Reconciling
modern machine-learning practice and the classical bias–variance
trade-off. Proceedings of the National Academy of Sciences, 116(32),
15849-15854.
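As a companion to the KNN readings, here is a minimal k-NN regression sketch in base R (illustrative only; the function name `knn_predict` and the simulated data are my own, and the course's R Lab may instead use a package such as `class` or `FNN`). The tuning parameter `k` is exactly the knob behind the bias-variance trade-off discussed this week: small `k` gives low bias and high variance, large `k` the reverse.

```r
# k-NN regression for a single 1-D query point (base R sketch).
knn_predict <- function(x_train, y_train, x_new, k = 5) {
  d <- abs(x_train - x_new)          # distances to every training point
  mean(y_train[order(d)[1:k]])       # average response of the k nearest
}

set.seed(1)
x <- runif(100)
y <- sin(2 * pi * x) + rnorm(100, sd = 0.2)

# Predict at x = 0.5 with a moderate neighborhood size
yhat <- knn_predict(x, y, x_new = 0.5, k = 10)
```

Re-running this with `k = 1` versus `k = 50` makes the trade-off visible: the 1-NN fit chases the noise, while the 50-NN fit flattens the sine curve.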
Week 3: Linear Models and Model Selection Jan 31 – Feb 4
- Lecture Note
- R Lab
- Zoom Recording
- Additional Reading
- SMLR
- Ch 3.2, 3.3, 7.3, 7.7 in HTF
Week 4: Numerical Optimization Feb 7 – Feb 11
- Lecture Note
- R Lab
- Additional Reading
Week 5: Penalized Linear Models Feb 14 – Feb 18
- Lecture Note
- R Lab
- Video Recording
- Additional Reading
- Leave-one-out CV of Linear regression
- Ch 3.4, 3.6 in HTF
- Friedman, J., Hastie, T., & Tibshirani, R. (2010). Regularization paths for generalized linear models via coordinate descent. Journal of Statistical Software, 33(1), 1.
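The leave-one-out CV reading above relies on a well-known identity for linear regression: the leave-one-out residual is the ordinary residual divided by (1 - h_ii), where h_ii is the i-th diagonal of the hat matrix, so LOOCV needs only one fit. A short check in base R (the simulated data are my own):

```r
set.seed(42)
n <- 50
x <- rnorm(n)
y <- 1 + 2 * x + rnorm(n)
fit <- lm(y ~ x)

# Shortcut: e_i / (1 - h_ii) are the leave-one-out residuals
loocv_short <- mean((residuals(fit) / (1 - hatvalues(fit)))^2)

# Brute force: refit n times, each time holding out one observation
loocv_brute <- mean(sapply(seq_len(n), function(i) {
  f <- lm(y ~ x, subset = -i)
  (y[i] - predict(f, newdata = data.frame(x = x[i])))^2
}))

all.equal(loocv_short, loocv_brute)  # identical up to floating point
```

The shortcut replaces n refits with a single call to `hatvalues()`, which is why LOOCV is essentially free for linear smoothers.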
Week 6: Splines Feb 21 – Feb 25
- Lecture Note
- R Lab
- Additional Reading
Week 7: Kernel Density Estimation and Local Smoothing Feb 28 – Mar 4
- Lecture Note
- R Lab
- Additional Reading
Week 8: Classification Basics Mar 7 – Mar 11
Week 9: Spring Break
Week 10: Support Vector Machine Mar 21 – Mar 25
- Lecture Note
- R Lab
- Additional Reading
Week 11: Kernel SVM, RKHS and Kernel Ridge Regression Mar 28 – Apr 1
- Lecture Note
- Additional Reading
- Zhang, Y., Duchi, J., & Wainwright, M. (2013). Divide and conquer kernel ridge regression. In Conference on Learning Theory (pp. 592-617). PMLR.
Week 12: Trees and Random Forests Apr 4 – Apr 8
- Lecture Note
- R Lab
- Additional Reading
- Geurts, P., Ernst, D., & Wehenkel, L. (2006). Extremely randomized trees. Machine Learning, 63(1), 3-42.
Week 13: Boosting Apr 11 – Apr 15
- Lecture Note
- R Lab
- Additional Reading
Week 14: Clustering Algorithms Apr 18 – Apr 22
- Lecture Note
- R Lab
- Additional Reading
- Ch 14.3.4, 14.3.6, 14.3.12 in HTF
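HTF Ch. 14.3 centers on k-means, which ships with base R. A quick sketch on two simulated Gaussian clusters (the data and seed are my own; `nstart` restarts the algorithm from several random initializations to avoid poor local optima):

```r
set.seed(3)
# Two well-separated 2-D Gaussian clusters, 50 points each
X <- rbind(matrix(rnorm(100, mean = 0), ncol = 2),
           matrix(rnorm(100, mean = 4), ncol = 2))

# k-means with 2 centers; nstart = 10 guards against bad initializations
km <- kmeans(X, centers = 2, nstart = 10)

table(km$cluster)   # cluster sizes
km$centers          # estimated cluster means
```

With clusters this far apart, the estimated centers land near (0, 0) and (4, 4), though the cluster labels themselves are arbitrary.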
Week 15: EM Algorithm and Hidden Markov Model Apr 25 – Apr 29
- Lecture Note
- R Lab
- Additional Reading
- Ch 8.5 in HTF
- Eddy, S. R. (2004). What is a hidden Markov model? Nature Biotechnology, 22, 1315-1316. [Link]