STA 7934: Advanced Regression
Lecture. 12:50pm - 1:40pm, Monday, Wednesday, and Friday, on Zoom
Instructor. Aaron Molstad (amolstad@ufl.edu), 202 Griffin-Floyd
Office hours. Monday and Wednesday, 1:45pm - 2:45pm, on Zoom
Syllabus. [pdf]
Lecture and office hours. [Zoom link]
Note that you must be logged into your UFL eLearning account to access course materials.
Lecture | Topics
0 (8/31) | Syllabus, course overview
1 (9/2) | OLS, bias-variance tradeoff, ridge regression [slides][lecture]
2 (9/4) | Ridge regression computation, lasso introduction [slides][lecture]
3 (9/9) | Lasso computation [slides][lecture]
4 (9/11) | Lasso computation continued [slides][lecture][errata]
5 (9/14) | Lasso variants [slides][lecture][errata]
6 (9/16) | Consistency of the lasso [slides][lecture]
7 (9/18) | Support recovery of the lasso [slides]
Homework 1 (due Monday, September 21st) [pdf][submission]
Linear regression notes [pdf]
8 (9/21) | Basis expansions and splines [slides][lecture]
9 (9/23) | Natural cubic splines [slides][lecture][errata]
10 (9/25) | Smoothing splines [slides][lecture]
11 (9/28) | Reproducing kernel Hilbert spaces, representer theorem [slides][lecture]
12 (9/30) | Kernels, kernel ridge regression [slides][lecture]
13 (10/2) | Kernel smoothers and nearest neighbors regression [slides][lecture]
Homework 2 (due Monday, October 12th) [pdf][hint]
Semiparametrics notes [pdf]
14 (10/5) | Linear methods for classification [slides][lecture]
15 (10/7) | Linear and quadratic discriminant analysis [slides][lecture]
16 (10/9) | Fisher’s linear discriminant analysis [slides][lecture]
17 (10/12) | Separating hyperplanes [slides][lecture]
18 (10/14) | Support vector machines [slides][lecture]
19 (10/16) | Beyond SVMs [slides][lecture]
Homework 3 (due Monday, October 26th) [pdf][submission]
Classification notes [pdf]
20 (10/19) | Generalized additive models [pdf][lecture]
21 (10/21) | Sparse additive models [pdf][lecture]
22 (10/23) | Regression trees [pdf][lecture]
23 (10/26) | Bagging and random forests [pdf][lecture][errata]
24 (10/28) | Variable importance [pdf][lecture]
25 (10/30) | Boosting [pdf][lecture]
26 (11/2) | Exponential loss and gradient boosting [pdf][lecture]
27 (11/9) | Gradient boosting trees [pdf][lecture]
28 (11/13) | Ensemble methods [pdf][lecture]
Project proposal (due Monday, November 2nd) [submission]
Generalized additive model notes [pdf]
Tree-based methods notes [pdf]
Ensemble methods notes [pdf]
Homework 4 [pdf][submission]
Homework 5 [pdf][submission]
29 (11/16) | Neural networks [pdf][lecture]
30 (11/18) | Fitting neural networks with SGD [pdf][lecture]
31 (11/20) | Deep neural networks and double descent [pdf][lecture]
Neural networks notes [pdf]
32 (11/23) | Gaussian graphical models [pdf][lecture]
33 (11/30) | Glasso algorithm and covariance matrix estimation [pdf][lecture]
34 (12/2) | Reduced rank regression [pdf][lecture]
35 (12/4) | Regularized multivariate regression [pdf]
Homework 6 [pdf][submission]
Project rubric [pdf][submission]