NYU CS-GY 6763 (3943)
Algorithmic Machine Learning
and Data Science

Advanced theory course exploring contemporary computational methods that enable machine learning and data science at scale.


Course Team:

Lectures: Fridays 2:00pm-4:30pm, Jacobs (6 MetroTech), Room 674. Recordings available through Brightspace.
Reading group: More info TBA.
Prof. office hours (general): Thursdays 9:00am-10:30am, Zoom link.
Raphael + Apoorv problem-solving session: Wednesdays 4:30pm-5:30pm, location TBA.
Feyza office hours: Fridays 11:00am-12:00pm, 8th Floor common area, 370 Jay.

Syllabus: here.
Gradescope Site: here.
Gradescope Entry Code: JKK5G3
Grading breakdown: Problem Sets 50%, Midterm 20%, Final Project OR Final Exam 20%, Participation 10%
Final project guidelines: here.

Problem Sets: Problem sets must be turned in via Gradescope. While not required, I encourage students to prepare problem sets in LaTeX or Markdown (with math support). You can use this template for LaTeX. While there is a learning curve, these tools typically save students time in the end! If you write solutions by hand, scan and upload them as a PDF. Collaboration is allowed on homework, but solutions and code must be written independently. Writing should not be done in parallel, and students must list collaborators for each problem separately.
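If you are new to LaTeX, a minimal homework skeleton looks something like the sketch below. This is an illustrative example only, not the linked template; the document class and package choices are my assumptions, and the sample claim is just Markov's inequality from week 1.

  \documentclass[11pt]{article}
  % amsmath/amssymb for math notation, amsthm for the proof environment.
  \usepackage{amsmath, amssymb, amsthm}

  \begin{document}

  \section*{Problem 1}
  % Per the collaboration policy, list collaborators for each problem separately.
  \textbf{Collaborators:} none.

  \textbf{Claim:} For a non-negative random variable $X$ and any $t > 0$,
  \[
    \Pr[X \ge t] \le \frac{\mathbb{E}[X]}{t}.
  \]

  \begin{proof}
  Your write-up goes here.
  \end{proof}

  \end{document}

Compile with pdflatex and upload the resulting PDF to Gradescope.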

Unless otherwise stated, referencing "non-standard" theorems and proofs not given in class or in previous problems is not allowed. All solutions must be proven from scratch. If you are unsure whether you can use a fact, ask me or the TAs.

Prerequisites: This course is mathematically rigorous and is intended for graduate students and advanced undergraduates. Formally, we require previous courses in machine learning, algorithms, and linear algebra. Experience with probability and random variables is necessary. See the syllabus for more details, and email Prof. Musco if you have questions about your preparation for the course!

Resources: There is no textbook to purchase. Course material will consist of my written lecture notes, as well as assorted online resources, including papers, notes from other courses, and publicly available surveys. Please refer to the course webpage before and after lectures to keep up-to-date as new resources are posted.

Reading Group: It's an exciting time for research at the intersection of algorithm design and data science. Most of the topics covered in this course are still the subject of active research. Starting midway through the semester, we will hold a reading group for students working on final projects (and any others who wish to join) to discuss and workshop papers.

Problem Sets:
Problem Set 1 (due Tuesday, Sept. 26th by 11:59pm ET).
Problem Set 2 (due Tuesday, Oct. 17th by 11:59pm ET).
Midterm Information (exam on Friday, Oct. 20th).
Problem Set 3, UScities.txt (due Tuesday, Nov. 21st by 11:59pm ET).
Problem Set 4 (due Thursday, Dec. 14th by 11:59pm ET).
Final Exam Information (exam on Friday, Dec. 22nd).

Week # Topic Reading Homework
The Power of Randomness
1. 9/8 Random variables and concentration, Markov's inequality, applications
2. 9/15 Chebyshev inequality and applications
3. 9/22 Exponential tail bounds (Chernoff + Bernstein), Fingerprinting
4. 9/29 High-dimensional geometry, Johnson-Lindenstrauss lemma and dimensionality reduction
5. 10/6 Dimensionality reduction, locality sensitive hashing
Optimization
6. 10/13 Gradient descent and projected gradient descent
  • Lecture 6 notes (annotated).
  • Teal's lecture notes.

  • If you need to freshen up on linear algebra, now is a good time! This quick reference from Stanford mostly covers what we need.
  • Good book on optimization, freely available online through NYU Libraries.
  • Good lecture notes from Aleksander Mądry for more reading on analyzing gradient descent.
  • Moritz Hardt's lecture notes with proofs of gradient descent convergence in all the regimes discussed in class.
  • Sébastien Bubeck's convex optimization book mentioned in class. Fairly technical, but great reference.
7. 10/20 Midterm Exam (first half of class)

8. 10/27 Second order conditions
9. 11/3 Online and stochastic gradient descent, center of gravity method
Spectral Methods and Linear Algebra
10. 11/10 Ellipsoid method, LP relaxation, begin singular value decomposition
11. 11/17 Krylov subspace methods for SVD, start on spectral graph theory
11/24 NO CLASS. THANKSGIVING BREAK
12. 12/1 Spectral clustering, stochastic block model, subspace embeddings and ε-nets
Fourier Methods
13. 12/8 Fast Johnson-Lindenstrauss lemma, introduction to sparse recovery and compressed sensing
14. 12/15 Finish up compressed sensing, introduction to importance sampling and leverage scores
15. 12/22 Final Exam (during regular class time)