NYU CS-GY 6923 Machine Learning
A broad introduction to the exciting field of machine learning through a mixture of hands-on experience and theoretical foundations.
Special Semester on Generative Machine Learning
Course Team:
Lectures: Friday 11:00am-1:30pm, 2 Metrotech, Room 911. Recordings available through Brightspace.
Prof. office hours (general): Mondays 11:00am-12:30pm, Zoom link.
Aarshvi office hours: Mondays 4:00pm-5:00pm, Zoom link.
Atsushi office hours: Tuesdays 11:00am-12:00pm, 8th floor common area, 370 Jay.
Grading breakdown: Written Problem Sets 20%, Programming Labs 20%, Midterm 25%, Final Exam 25%, Participation 10%
Ed Discussion: All course communication will be via EdStem, so please join our site from Brightspace. All questions should be posted to Ed (not sent via email). We prefer that questions about lectures or homework be asked publicly, since the answers will often help your classmates. Ed also supports private questions for matters relevant only to you.
Python and Jupyter: Demos and labs in this class use Python, run through Jupyter notebooks. Jupyter lets you create and edit documents that combine live Python code with formatted text, equations, and images. We suggest that students run their Jupyter notebooks via Google Colaboratory, and we will share notebooks via Colab.
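To give you a taste of what lab code looks like, here is a small notebook-style cell that fits a simple linear regression by least squares (a topic from our first lecture). This is just a minimal sketch with made-up synthetic data; the variable names are illustrative, not from any actual lab.

```python
# A typical lab-style cell: fit a simple linear regression
# y ~ a*x + b by least squares, using only numpy.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)            # synthetic inputs
y = 2.0 * x + 1.0 + rng.normal(0, 1, 50)   # noisy linear responses

# Solve for slope a and intercept b minimizing squared error.
A = np.column_stack([x, np.ones_like(x)])
(a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
print(f"fitted slope {a:.2f}, intercept {b:.2f}")
```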
Prerequisites: Modern machine learning uses a lot of math, probably more than any other subject in computer science outside of theoretical computer science. You can get pretty far with an understanding of just calculus, probability, and linear algebra, but that understanding needs to be solid for you to succeed in this course. Formally, we require a prior course in probability or statistics. If you need to brush up on linear algebra, this quick reference from Stanford is helpful.
Homework: Homework (both written problems and coding labs) must be turned in to Gradescope by the specified deadline. Use the code V5XZJ5 to join the class on Gradescope. We do not accept late work without prior permission.
Labs should be turned in as evaluated Jupyter notebooks, with all cell output intact. Do not clear the output before submitting.
While not required, I encourage students to prepare written problem sets in LaTeX or Markdown (with math support).
You can use this template for LaTeX. While there is a learning curve, these tools typically save students time in the end! If you do write problems by hand, scan your work and upload it as a PDF.
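If you want a sense of what a LaTeX write-up involves before opening the template, a bare-bones problem set file looks something like this (a minimal sketch, not the official template):

```latex
\documentclass{article}
\usepackage{amsmath, amssymb}  % standard math packages

\begin{document}

\section*{Problem 1}
We minimize the squared loss
\[
  L(\beta) = \sum_{i=1}^{n} (y_i - \beta^\top x_i)^2 .
\]
Setting the gradient to zero gives the normal equations
$X^\top X \beta = X^\top y$.

\end{document}
```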
Discussion of the homework is allowed, but solutions and code must be written independently. See the syllabus for details. We have a zero-tolerance policy for copied code or solutions: any students who submit duplicate or very similar material will receive a zero on the offending assignment. My advice is to never share code or solutions with other students.
Resources: There is no textbook to purchase. I may post readings, some of which will come from the following book, which is available free online via the NYU library:
- An Introduction to Statistical Learning by James, Witten, Hastie, and Tibshirani.
Lecture # | Topic | Reading | Homework |
---|---|---|---|
Function Fitting and Regression | | | |
1. 1/27 | Introduction to Machine Learning, Loss Functions, Simple Linear Regression, Multiple Linear Regression | | |
2. 2/3 | Finish Multiple Linear Regression, Data Transformations, Model Selection, Regularization | | |
Bayesian Methods and Probabilistic Models | | | |
3. 2/10 | Naive Bayes | | |
4. 2/17 | More on the Bayesian Perspective, Modeling Language | | |
Classification | | | |
5. 2/24 | K-Nearest Neighbors, Logistic Regression, Optimization | | |
6. 3/3 | Gradient Descent, Stochastic Gradient Descent | | |
7. 3/10 | Midterm Exam. The midterm is designed to take 1 hour and 15 minutes, but you will have the entire class period. | | |
3/17 | Spring break, no class. | | |
8. 3/24 | Finish Gradient Descent, Introduction to Learning Theory, the PAC Model | | |
Beyond Linear Methods | | | |
9. 3/31 | Boosting | | |
10. 4/7 | Introduction to Neural Nets, Backpropagation | | |
11. 4/14 | Convolution, Feature Extraction, Transfer Learning | | |
Unsupervised Learning | | | |
12. 4/21 | Auto-encoders, Principal Component Analysis, Semantic Embeddings | | |
13. 4/28 | Variational Auto-encoders, Generative Adversarial Networks | | |
14. 5/5 | Self-Supervised Learning, Diffusion Models | | |
5/12 | Final Exam | | |