NYU CS-UY 4563
Introduction to Machine Learning

A broad introduction to the exciting field of machine learning through a mixture of hands-on experience and theoretical foundations.


Course Team:

Christopher Musco
Professor
Prathamesh Dharangutte
Teaching Assistant
Zige Wang
Teaching Assistant
Raphael Meyer
Instructor Assistant


Administrative Information

Lectures: Mon./Wed. 9:00-10:20am. Zoom link on NYU Classes.

Syllabus: Available here. Please review carefully!

Zige's Office hours: Thurs., 11am-1pm, Permanent Zoom link.
Prathamesh's Office hours: Thurs., 11am-1pm, Permanent Zoom link.
Raphael's Office hours: Tues., 3-5pm, Permanent Zoom link.
Professor Office hours: Wed., 1-3pm, Permanent Zoom link.

Git repository: All assignments and labs should be downloaded from our GitHub repository. To stay organized, we suggest students access material from this repository using a git client. Instructions can be found here.

Piazza: Course announcements will be made over Piazza, so please create an account and join our site. Unless asked in person, all questions should be posted to Piazza (not sent via email). We prefer that questions about lectures or homework be asked publicly, since the answers will often help your classmates, but Piazza supports private questions for things relevant only to you.

Homework: Homework (both written problems and coding labs) must be turned in to NYU Classes by the specified due date. Labs should be turned in as evaluated Jupyter notebooks; do not clear the output before submitting. While not required, I encourage students to prepare problem sets in LaTeX or Markdown (with math support). You can use this template for LaTeX. While there is a learning curve for LaTeX (less so for Markdown), it typically saves students time by the end of the semester! If you write problems by hand, please scan and upload them as a PDF (using a scanner app on your phone is fine).
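If you have never used LaTeX, a problem set writeup can be as simple as the following (a minimal sketch of my own, not the official class template; replace the placeholder names and text):

    \documentclass{article}
    \usepackage{amsmath, amssymb}  % standard math packages

    \title{CS-UY 4563 -- Problem Set 1}
    \author{Your Name (collaborators: ...)}
    \date{}

    \begin{document}
    \maketitle

    \section*{Problem 1}
    We minimize the squared loss
    \[
      L(\beta) = \sum_{i=1}^{n} (y_i - \beta^\top x_i)^2
    \]
    by setting its gradient to zero ...

    \end{document}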

For homework problems, collaboration is allowed, but solutions and any code must be written independently. Students must list collaborators on their problem sets (for each problem separately). See the syllabus for full details.


Course Summary

Coursework: Weekly programming labs and written problem sets (30% of grade). Two in-class midterm exams on March 9th and April 20th (20% of grade each). Final open-ended project with a partner (20% of grade). Class participation is the remaining 10% of the grade. Please consult the formal syllabus for more information. Different students contribute in different ways, so you don't need to be vocal in class to earn a good participation grade; we also count participation in office hours, on Piazza, etc.

Project: Please refer to the information sheet for requirements and deadlines.

Python and Jupyter: Demos and labs in this class use Python, run through Jupyter notebooks. Jupyter lets you create and edit documents with live Python code alongside rich text and images. You can install Python and Jupyter on any personal computer, or run them in the cloud via Google Colaboratory. Instructions can be found here.
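Once installed, a quick way to confirm that everything works is to run a cell like the following in a new notebook (a minimal sketch; it assumes only the numpy and matplotlib packages, which we use throughout the course):

    # Verify that Python and the core scientific packages import correctly.
    import sys
    import numpy as np
    import matplotlib.pyplot as plt

    print(sys.version)       # Python version
    print(np.__version__)    # numpy version

    # A tiny "live code" example: plot a noisy line.
    x = np.linspace(0, 1, 50)
    y = 2 * x + 0.1 * np.random.randn(50)
    plt.scatter(x, y)
    plt.show()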

Prerequisites: Modern machine learning uses a lot of math! Probably more than any other subject in computer science that you will study as an undergraduate. You can get pretty far with an understanding of just probability and linear algebra, but that understanding needs to be solid for you to succeed in this course. Formally, we require a prior course in Algorithms and Data Structures, a course in Probability or Statistics, and a course in Linear Algebra.

Resources: There is no textbook to purchase. I may post readings, some of which will come from books that are available as free PDFs.

I have also found that the lectures and notes from the companion site to the book Learning From Data (ISBN 978-1-60049-006-4) can be very helpful. It is a compact, inexpensive book if you want to purchase a copy.

Schedule: lecture number, date, and topic; readings and homework appear as bullets under each lecture.
Regression and Function Fitting
1. 1/27 Introduction to Machine Learning
2. 1/29 Simple Linear Regression, Loss Functions
3. 2/3 Multiple Linear Regression, Data Transformations
4. 2/5 Finish Multiple Linear Regression, Model Selection, Cross Validation
5. 2/10 Model Selection, Regularization
  • Lecture 5 slides.
  • Caltech Lecture 12 on regularization. It might be helpful to watch the video for Lecture 8 first to understand what he means by "bias-variance" tradeoff (which we did not cover formally).
  • Work through the polynomial regression demo: demo_polyfit.ipynb.
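For a taste of what the demo covers, here is a minimal sketch (my own, not the demo itself) of fitting a high-degree polynomial with different amounts of ridge regularization using scikit-learn:

    # Polynomial regression with ridge regularization (ideas from Lectures 3-5).
    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures

    rng = np.random.default_rng(0)

    # Synthetic data: a cubic trend plus noise.
    x = np.sort(rng.uniform(-1, 1, 40)).reshape(-1, 1)
    y = 2 * x.ravel() ** 3 - x.ravel() + 0.2 * rng.standard_normal(40)

    # Fit a degree-10 polynomial with increasing regularization strength.
    for alpha in [1e-8, 1e-2, 1.0]:
        model = make_pipeline(PolynomialFeatures(degree=10), Ridge(alpha=alpha))
        model.fit(x, y)
        mse = np.mean((model.predict(x) - y) ** 2)
        print(f"alpha={alpha}: train MSE = {mse:.4f}")

Larger alpha shrinks the polynomial's coefficients, trading a worse training fit for a model that typically generalizes better: the regularization story from Lecture 5.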
Classification
6. 2/12 Naive Bayes, the Bayesian Perspective
2/17 Presidents Day, No Class
7. 2/19 The Bayesian Perspective cont.
8. 2/24 Bayesian Regression, Linear Classifiers
  • Lecture 8 slides (annotated).
  • Section 3 of these notes gives a nice overview of least squares regression from a statistical/probabilistic modeling perspective; also see the section on logistic regression.
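As a small numerical illustration of the probabilistic perspective (my own sketch, not from the linked notes): under the model y = w·x + Gaussian noise, maximizing the likelihood of the observed data is equivalent to minimizing squared error, so the maximum likelihood estimate matches the least squares solution.

    # MLE under Gaussian noise coincides with least squares.
    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.uniform(0, 1, 100)
    y = 3.0 * x + 0.5 * rng.standard_normal(100)

    # Closed-form least squares estimate (single feature, no intercept).
    w_ls = np.dot(x, y) / np.dot(x, x)

    # The Gaussian negative log-likelihood is, up to constants, proportional
    # to the squared error, so minimizing it over a grid recovers the same w.
    ws = np.linspace(0, 6, 10001)
    nll = [np.sum((y - w * x) ** 2) for w in ws]
    w_mle = ws[np.argmin(nll)]

    print(w_ls, w_mle)  # nearly identical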
9. 2/26 Logistic Regression
10. 3/2 Optimization, Gradient Descent
11. 3/4 Finish Gradient Descent, Midterm Review
3/9 Midterm 1. Covers lectures 1-10. Two-sided cheat sheet allowed.
12. 3/11 k-Nearest Neighbors, Kernel Methods
Beyond Linear Methods: Neural Networks
3/16 Spring break, no class.
3/18 Spring break, no class.
13. 3/23 Kernel Methods
  • Complete Lab 5, lab_grad_descent_partial.ipynb. Due 11:59pm, Thurs. 4/2.
  • The first half of this lab is a demo, which you should go through slowly. The parts you actually have to fill in don't start until the "L2 Regularization" section.
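Separately from the lab, here is a minimal sketch of the technique it builds on: gradient descent for least squares with L2 regularization, checked against the closed-form ridge solution.

    # Minimize ||Xw - y||^2 + lam * ||w||^2 by gradient descent.
    import numpy as np

    rng = np.random.default_rng(2)
    X = rng.standard_normal((200, 5))
    y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.1 * rng.standard_normal(200)

    lam = 0.1       # regularization strength
    step = 0.001    # step size (learning rate)
    w = np.zeros(5)

    for t in range(500):
        grad = 2 * X.T @ (X @ w - y) + 2 * lam * w  # gradient of the objective
        w -= step * grad

    # The closed-form ridge solution should match closely.
    w_exact = np.linalg.solve(X.T @ X + lam * np.eye(5), X.T @ y)
    print(np.round(w, 3))
    print(np.round(w_exact, 3))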
3/25 CANCELED
14. 3/30 Support Vector Machines
15. 4/1 Neural Networks 1: Introduction
  • Email cmusco@nyu.edu with group members, proposed project topic, and preliminary ideas on data sources. Due 11:59pm, 4/1.
  • Set up appointment with Prof. Musco using this spreadsheet.
  • Complete Lab 6, lab_mnist_partial.ipynb. Due 11:59pm, Thurs. 4/9.
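To give a flavor of the lab, the following minimal sketch (not the lab solution) trains a small fully connected network with scikit-learn. The lab itself uses MNIST; here I use the smaller built-in digits dataset so the example runs in seconds.

    # A small neural network classifier on 8x8 digit images.
    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    X, y = load_digits(return_X_y=True)   # 8x8 grayscale digit images
    X = X / 16.0                          # scale pixel values to [0, 1]
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # One hidden layer of 64 ReLU units, trained with gradient-based
    # optimization (the topics of Lectures 15-17).
    net = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
    net.fit(X_train, y_train)
    print("test accuracy:", net.score(X_test, y_test))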
16. 4/6 Neural Networks 2: History + Backprop
17. 4/8 Neural Networks 3: Finish Backprop, Stochastic Gradient Descent
18. 4/13 Convolution, Feature Extraction, Edge Detection, Feature Transfer
19. 4/15 Convolutional Networks, Deep Learning
  • Work through demo on convolutional networks for image classification: demo_classification.ipynb
  • You will want to run this demo on a machine with GPU access. Probably the simplest option is to use Google Colab!
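For reference, a bare-bones convolutional network looks like the following (a minimal sketch assuming a TensorFlow/Keras install; the demo's actual architecture differs):

    # A tiny convolutional network for MNIST digit classification.
    import tensorflow as tf

    (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
    x_train = x_train[..., None] / 255.0  # add channel dim, scale to [0, 1]
    x_test = x_test[..., None] / 255.0

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(28, 28, 1)),
        tf.keras.layers.Conv2D(16, 3, activation="relu"),  # convolution layer
        tf.keras.layers.MaxPooling2D(),                    # downsample
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(10, activation="softmax"),   # class probabilities
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=1, batch_size=128)
    print(model.evaluate(x_test, y_test))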
Unsupervised Learning
4/20 Auto-encoders, Dimensionality Reduction
20. 4/22 Auto-encoder applications, Principal Component Analysis
  • Lecture 21 slides (annotated).
  • Check out this research project to get a sense of some of the amazing things people are doing in generative ML.
  • If you have not seen lossless compression in a class before, I strongly recommend watching a video on Huffman coding to get a better sense of how it works. Data compression is a super interesting and practically important topic in CS! (A tiny Huffman coding sketch in Python follows this list.)
  • Sign up for a project presentation slot here.
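If you want to play with the idea, here is the promised Huffman coding sketch (my own illustration, not course code): frequent symbols get short bit strings, which is the heart of lossless compression.

    # Build a Huffman prefix code from symbol frequencies.
    import heapq
    from collections import Counter

    def huffman_codes(text):
        # Heap entries are (frequency, tiebreaker, tree), where a tree is
        # either a symbol (leaf) or a (left, right) pair of subtrees.
        heap = [(f, i, s) for i, (s, f) in enumerate(Counter(text).items())]
        heapq.heapify(heap)
        count = len(heap)
        while len(heap) > 1:
            # Repeatedly merge the two lowest-frequency trees.
            f1, _, t1 = heapq.heappop(heap)
            f2, _, t2 = heapq.heappop(heap)
            heapq.heappush(heap, (f1 + f2, count, (t1, t2)))
            count += 1
        codes = {}
        def walk(tree, prefix):
            if isinstance(tree, tuple):
                walk(tree[0], prefix + "0")
                walk(tree[1], prefix + "1")
            else:
                codes[tree] = prefix or "0"  # single-symbol edge case
        walk(heap[0][2], "")
        return codes

    print(huffman_codes("abracadabra"))  # 'a' gets the shortest code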
21. 4/27 Finish PCA
22. 4/29 Semantic Embeddings, Clustering
Selected Topics
23. 5/4 Introduction to Reinforcement Learning
5/6 Project Presentations
5/11 Project Presentations