Theory of Machine Learning, Spring 2021
CSCI 7000, APPM 5590, APPM 4490
University of Colorado, Boulder
Instructor: Bo Waggoner
Time: Tue/Thu 8:00am - 9:15am
Meets synchronously, remotely (BigBlueButton link is on Canvas page)
Office hours (on BigBlueButton, link in Canvas): Tuesday 9:15am - 10:30am; Thursday 4pm - 5pm
Course information and syllabus
Presents the underlying theory behind machine learning in a proof-based format. Answers fundamental questions about what learning means and what can be learned via formal models of statistical learning theory. Analyzes some important classes of machine learning methods. Specific topics may include the PAC framework, VC-dimension, and Rademacher complexity. Recommended prerequisites: APPM 4440 and CSCI 5622.
The course will be similar to Prof. Stephen Becker's version.
Schedule
- Thu, Jan 14: Intro, what is (theory of) ML? Estimating coin bias.
Reading: [SSS] Ch1 and Ch2 for next time.
HW: Sign up for Zulip.
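A quick illustration of the coin-bias problem from this lecture (toy code of my own; helper names and numbers are not from the course): by Hoeffding's inequality, P(|p_hat - p| >= eps) <= 2 exp(-2 n eps^2), so n = ceil(ln(2/delta) / (2 eps^2)) flips give an (eps, delta)-accurate estimate.

```python
import math, random

# Estimating a coin's bias p from n i.i.d. flips.
# Hoeffding: P(|p_hat - p| >= eps) <= 2*exp(-2*n*eps^2), so the number of
# flips below suffices for accuracy eps with failure probability delta.
def flips_needed(eps, delta):
    return math.ceil(math.log(2 / delta) / (2 * eps ** 2))

def estimate_bias(p, eps, delta, rng):
    n = flips_needed(eps, delta)
    return sum(rng.random() < p for _ in range(n)) / n

rng = random.Random(0)
p_hat = estimate_bias(0.7, eps=0.05, delta=0.01, rng=rng)  # uses 1060 flips
```

Note the 1/eps^2 dependence: halving eps quadruples the number of flips needed.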
- Tue, Jan 19: Statistical learning setting, ERM, PAC setting. Papaya example from [SSS], axis-aligned rectangles.
- Thu, Jan 21: PAC setting and variants; PAC-learnability of finite hypothesis classes.
Reading: [SSS] Ch3 and Ch4 for next time.
HW: Homework 1 posted on Canvas, due Mon. Feb. 1 at 11:59pm on Gradescope.
- Tue, Jan 26: Uniform convergence, agnostic PAC learnability of finite classes.
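As a concrete illustration of the finite-class bounds (the numbers and helper names here are mine, using natural-log versions of the bounds): realizable PAC learning of a finite class needs m >= ln(|H|/delta)/eps samples, while the agnostic bound via uniform convergence needs m >= ln(2|H|/delta)/(2 eps^2).

```python
import math

# Sample-complexity calculators for a finite hypothesis class H:
#   realizable PAC:  m >= ln(|H| / delta) / eps
#   agnostic (uniform convergence + Hoeffding):  m >= ln(2|H| / delta) / (2 eps^2)
def m_realizable(h_size, eps, delta):
    return math.ceil(math.log(h_size / delta) / eps)

def m_agnostic(h_size, eps, delta):
    return math.ceil(math.log(2 * h_size / delta) / (2 * eps ** 2))

# e.g. with |H| = 10^6, eps = 0.1, delta = 0.05, the agnostic bound pays an
# extra factor of roughly 1/(2 eps): quadratic rather than linear in 1/eps.
```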
- Thu, Jan 28: Finish uniform convergence; begin bias-complexity tradeoff.
Reading: [SSS] Ch5 for next time.
- Tue, Feb 2: Proof of the no-free-lunch theorem and the bias-complexity tradeoff.
- Thu, Feb 4: Rademacher complexity.
Reading: [Mohri] Ch3.1 for next time.
HW: Homework 2 posted on Canvas, due Mon. Feb. 15 at 11:59pm on Gradescope.
- Tue, Feb 9: More Rademacher complexity; generalization theorem.
- Thu, Feb 11: Bounding RC using growth function; start VC-dimension.
Reading: [Mohri] Ch3.2, 3.3; [SSS] Ch6.
- Tue, Feb 16: Shattering, VC-dimension; bounding the growth function, Sauer's Lemma.
- Thu, Feb 18: Sauer-Shelah and agnostic PAC-learnability of finite-VC classes; fundamental theorem of PAC learning; mention fat-shattering and pseudo-dimension; structural risk minimization.
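A brute-force sanity check of the shattering definition (my own toy code, not from the course): intervals on the real line shatter some 2-point set but no 3-point set, so their VC dimension is 2.

```python
# The class of intervals h_[a,b](x) = 1 iff a <= x <= b. To enumerate all
# achievable labelings of a finite sample, it suffices to try intervals whose
# endpoints are sample points, plus the empty interval.
def interval_hyps(points):
    cands = sorted(points)
    hyps = [lambda x: 0]                       # the always-0 "empty interval"
    for a in cands:
        for b in cands:
            if a <= b:
                hyps.append(lambda x, a=a, b=b: 1 if a <= x <= b else 0)
    return hyps

def shattered(points):
    labelings = {tuple(h(x) for x in points) for h in interval_hyps(points)}
    return len(labelings) == 2 ** len(points)
```

Any 3 points fail because the labeling (1, 0, 1) would require two disjoint intervals.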
- Tue, Feb 23: Structural risk minimization; intro to online learning.
- Thu, Feb 25: Online learning model; example settings; mistake-bounded learning; regret.
Reading: [OCO] Ch1, Ch2.
- Tue, Mar 2: Convexification; properties of convex functions; FTL.
- Thu, Mar 4: FTL, failure of FTL. FTRL.
- Tue, Mar 9: Project overview; FTRL; FTRL regret bound.
- Thu, Mar 11: FTRL regret bound concluded; examples, multiplicative weights and online gradient descent.
HW: 1-paragraph project proposal/idea, due Wed. Mar. 17 at 11:59pm.
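To make the multiplicative-weights regret bound concrete, here is a small self-contained simulation (the losses, horizon, and step size are my choices): exponential weights over K experts with losses in [0, 1], where one standard tuning eta = sqrt(2 ln(K) / T) guarantees regret at most sqrt(2 T ln K).

```python
import math, random

# Exponential (multiplicative) weights over K experts.
def exp_weights(loss_rows, eta):
    K = len(loss_rows[0])
    w = [1.0] * K
    cum = [0.0] * K        # cumulative loss of each expert
    alg_loss = 0.0
    for losses in loss_rows:
        s = sum(w)
        p = [wi / s for wi in w]                    # distribution played this round
        alg_loss += sum(pi * li for pi, li in zip(p, losses))
        for i, li in enumerate(losses):
            cum[i] += li
            w[i] *= math.exp(-eta * li)             # multiplicative update
    return alg_loss, min(cum)

rng = random.Random(1)
T, K = 2000, 5
rows = [[rng.random() for _ in range(K)] for _ in range(T)]
eta = math.sqrt(2 * math.log(K) / T)
alg_loss, best_loss = exp_weights(rows, eta)
regret = alg_loss - best_loss   # should be well under sqrt(2*T*ln K)
```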
- Tue, Mar 16: More on exponential weights; intro to zero-sum games and their special properties.
- Thu, Mar 18: von Neumann min-max theorem and proof using FTRL; mentioning generalizations.
Reading: lecture notes on zero-sum games.
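The constructive min-max argument lends itself to simulation (toy code; the step size, horizon, and starting weights are mine): when both players of rock-paper-scissors run exponential weights against each other, the time-averaged strategies approach the equilibrium and the average payoff approaches the game's value, 0.

```python
import math

A = [[0, -1, 1], [1, 0, -1], [-1, 1, 0]]   # rock-paper-scissors, row player's payoff

def mw_selfplay(A, T, eta):
    n, m = len(A), len(A[0])
    wr, wc = [1.0, 1.5, 2.0], [2.0, 1.0, 1.5]   # asymmetric start (uniform is a fixed point)
    avg_r, avg_c = [0.0] * n, [0.0] * m
    for _ in range(T):
        p = [w / sum(wr) for w in wr]
        q = [w / sum(wc) for w in wc]
        avg_r = [a + pi / T for a, pi in zip(avg_r, p)]
        avg_c = [a + qi / T for a, qi in zip(avg_c, q)]
        for i in range(n):   # row player minimizes loss = -payoff
            wr[i] *= math.exp(-eta * -sum(A[i][j] * q[j] for j in range(m)))
        for j in range(m):   # column player minimizes loss = +payoff
            wc[j] *= math.exp(-eta * sum(p[i] * A[i][j] for i in range(n)))
    return avg_r, avg_c

p_bar, q_bar = mw_selfplay(A, 5000, 0.05)
value = sum(p_bar[i] * A[i][j] * q_bar[j]
            for i in range(3) for j in range(3))   # approaches the game value 0
```

The deviation of `value` from 0 is bounded by the sum of the two players' average regrets, which is the heart of the FTRL proof of the min-max theorem.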
- Tue, Mar 23: Blackwell approachability.
Reading (or at least, resource): Abernethy lecture notes.
- Tue, Mar 30: Boosting: as a zero-sum game; via multiplicative weights.
- Thu, Apr 1: Generalization of boosting; online-to-batch conversion for online learning. Start SVMs, definition of margin.
Reading: [SSS] Ch 15.
- Tue, Apr 6: Support Vector Machines. Margin, Hard-SVM, quadratic program. Generalization guarantee depending only on margin, not dimension (without proof).
- Thu, Apr 8: Soft-SVM (i.e., the non-separable case). Connection to hinge loss. Margin result.
HW: HW5 and HW6 assigned, due Wed. Apr. 21.
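An illustrative sketch of the hinge-loss view (the data, step sizes, and constants are mine): minimize (lam/2)||w||^2 plus the average hinge loss max(0, 1 - y(w.x + b)) by stochastic subgradient descent, with the Pegasos-style step size 1/(lam*t).

```python
import random

# Soft-SVM via stochastic subgradient descent on the regularized hinge loss.
def svm_sgd(data, lam=0.01, epochs=200, seed=0):
    rng = random.Random(seed)
    w, b, t = [0.0, 0.0], 0.0, 0
    for _ in range(epochs):
        rng.shuffle(data)
        for x, y in data:
            t += 1
            eta = 1.0 / (lam * t)
            margin = y * (w[0] * x[0] + w[1] * x[1] + b)
            g0, g1, gb = lam * w[0], lam * w[1], 0.0
            if margin < 1:                  # hinge active: subgradient includes -y*x
                g0 -= y * x[0]
                g1 -= y * x[1]
                gb = -y
            w = [w[0] - eta * g0, w[1] - eta * g1]
            b -= eta * gb
    return w, b

# tiny linearly separable toy set
pts = [([2.0, 2.0], 1), ([1.5, 2.5], 1), ([2.5, 1.8], 1),
       ([-2.0, -1.5], -1), ([-1.8, -2.2], -1), ([-2.5, -2.0], -1)]
w, b = svm_sgd(list(pts))
```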
- Tue, Apr 13: Kernels: definition, motivating example. Representer theorem. Gaussian kernel.
Reading: [SSS] Ch 16.
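A toy sketch of the kernel trick (sigma and the data are my choices): the kernel perceptron with a Gaussian kernel K(x, z) = exp(-||x - z||^2 / (2 sigma^2)) fits XOR-style data that no linear classifier over the raw inputs can.

```python
import math

def rbf(x, z, sigma=0.5):
    d2 = sum((a - c) ** 2 for a, c in zip(x, z))
    return math.exp(-d2 / (2 * sigma ** 2))

def kernel_perceptron(data, kernel, epochs=20):
    # alpha[i] counts mistakes on example i; by the representer-style argument,
    # the learned predictor lives in the span of the functions K(x_i, .).
    alpha = [0] * len(data)
    for _ in range(epochs):
        for i, (x, y) in enumerate(data):
            s = sum(a * yj * kernel(xj, x) for a, (xj, yj) in zip(alpha, data))
            if y * s <= 0:    # mistake (or tie): add this example to the expansion
                alpha[i] += 1
    return alpha

xor = [([0.0, 0.0], 1), ([1.0, 1.0], 1), ([0.0, 1.0], -1), ([1.0, 0.0], -1)]
alpha = kernel_perceptron(xor, rbf)

def predict(x):
    s = sum(a * y * rbf(xj, x) for a, (xj, y) in zip(alpha, xor))
    return 1 if s > 0 else -1
```

Since the Gaussian Gram matrix on distinct points is strictly positive definite, any labeling is separable in the RKHS, so the perceptron converges.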
- Thu, Apr 15: Finish kernels; define RKHS. Start bandits: setting, motivation, comparison to online learning setting. Explore-exploit based algorithms and general tradeoffs.
Reading: [Slivkins] Ch 1.
- Tue, Apr 20: Analyzing bandit algorithms; UCB.
Reading: [Slivkins] Ch 3.
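A sketch of UCB1 in the common "+ sqrt(2 ln t / n_i)" index form (the arm means and horizon are my choices): pull each arm once, then always pull the arm with the highest upper confidence index.

```python
import math, random

# UCB1 on Bernoulli arms: index = empirical mean + sqrt(2 ln t / n_i).
def ucb1(means, T, seed=0):
    rng = random.Random(seed)
    K = len(means)
    counts, sums, reward = [0] * K, [0.0] * K, 0.0
    for t in range(1, T + 1):
        if t <= K:
            arm = t - 1   # initialization: one pull per arm
        else:
            arm = max(range(K), key=lambda i: sums[i] / counts[i]
                      + math.sqrt(2 * math.log(t) / counts[i]))
        r = 1.0 if rng.random() < means[arm] else 0.0   # Bernoulli reward
        counts[arm] += 1
        sums[arm] += r
        reward += r
    return reward, counts

means = [0.2, 0.5, 0.8]
reward, counts = ucb1(means, 5000)
regret = max(means) * 5000 - reward   # regret vs. always playing the best arm
```

Suboptimal arms get only O(log T / gap^2) pulls, so regret grows logarithmically in T rather than linearly.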
- Thu, Apr 22: (Hopefully) Bayesian bandits.
- Tue, Apr 27: Project presentations.
- Thu, Apr 29: Project presentations.