NYU CS-GY 6763 (3943)
Algorithmic Machine Learning
and Data Science

Advanced theory course exploring contemporary computational methods that enable machine learning and data science at scale.


Course Team:

Christopher Musco, Professor
Aarshvi Gajjar, Course Assistant
Indu Ramesh, Course Assistant
Teal Witter, Course Assistant

Lectures: Rogers Hall, Room 707. Recordings available through Brightspace.
Professor office hours: Weekly on Mondays 9am-11am. Zoom link.
TA office hours (general): 1:30-3pm on Wednesdays. 8th floor common area, 370 Jay St.
TA office hours (undergrad only): 1-3pm on Thursdays. 8th floor common area, 370 Jay St.

Syllabus: here.
Grading breakdown: Quizzes 10%, Problem Sets 45%, Midterm 15%, Final project OR final exam 20%, Participation 10%
Final project guidelines: here.

Quizzes: Weekly check-in quizzes will be administered via Google Forms. A link will be posted on this site. They must be completed by 2:00pm ET on the Tuesday after they are posted.

Problem Sets: Problem sets must be turned in via Gradescope on NYU Brightspace. While not required, I encourage students to prepare problem sets in LaTeX or Markdown (with math support). You can use this template for LaTeX. While there is a learning curve, these tools typically save students time in the end! If you do write problems by hand, scan your work and upload it as a PDF. Collaboration is allowed on homework, but solutions and code must be written independently. Writing should not be done in parallel, and students must list their collaborators for each problem separately. See the syllabus for details.

Prerequisites: This course is mathematically rigorous and is intended for graduate students and advanced undergraduates. Formally, we require previous courses in machine learning, algorithms, and linear algebra. Experience with probability and random variables is also necessary. See the syllabus for more details, and email Prof. Musco if you have questions about your preparation for the course!

Resources: There is no textbook to purchase. Course material will consist of my written lecture notes, as well as assorted online resources, including papers, notes from other courses, and publicly available surveys. Please refer to the course webpage before and after lectures to keep up-to-date as new resources are posted.

Reading Group: It's an exciting time for research at the intersection of algorithm design and the data sciences. Most of the topics covered in this course are still the subject of active research. Starting midway through the semester we will be holding a reading group for students working on final projects (and any others who wish) to discuss and workshop papers.

Problem Sets:
Problem Set 1 (due Monday, Sept. 27th by 11:59pm ET).

Week # Topic Reading Homework
The Power of Randomness
1. 9/7 Random variables and concentration, Markov's inequality, applications
  • Lecture 1 notes (annotated).

  • Probability review! None of this should be new, but you might want to brush up if you haven't taken prob/stat in a while. Indu recommends this resource for review.
  • Typed lecture notes covering another application of Markov's inequality to analyzing hashing, and proving universality of random linear hash functions.
  • Interesting paper on applications of mark-and-recapture to network size estimation, and some cool improved methods. A toy sketch of the basic estimator appears below.
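A minimal sketch (my own illustration, not official course material) of the basic mark-and-recapture estimator discussed in the paper above, assuming the idealized setup of two independent uniform samples from the population; the function name and parameters are hypothetical:

```python
import random

def mark_and_recapture(population, m=1000, n=1000, seed=0):
    """Lincoln-Petersen estimate of an unknown population size N.

    "Mark" m random individuals, then draw n fresh samples; if k of
    them are marked, linearity of expectation gives E[k] = m*n/N,
    which suggests the estimate N_hat = m*n/k.
    """
    rng = random.Random(seed)
    marked = set(rng.sample(population, m))        # capture and mark m items
    recaptured = rng.choices(population, k=n)      # recapture n items (with replacement)
    k = sum(1 for x in recaptured if x in marked)  # count marked recaptures
    if k == 0:
        raise ValueError("no marked items recaptured; increase m or n")
    return m * n / k

# Hidden population of 50,000 items. The estimate is rough at these
# sample sizes (relative error around 1/sqrt(E[k]), about 20% here).
print(mark_and_recapture(list(range(50_000))))
```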
2. 9/14 Chebyshev inequality, exponential tail bounds (Chernoff + Bernstein), and applications
  • Lecture 2 notes (annotated).

  • Original paper giving a loglog(n) space algorithm for the distinct elements problem. Follow-up work on the state-of-the-art HyperLogLog algorithm. A toy sketch of the core idea appears after this week's readings.
  • Additional reading on concentration bounds can be found in Terry Tao's notes.
  • For a proof of the "power of two choices" result, see Section 2 in this survey.
  • My favorite proof of the Union Bound can be found here.
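A toy sketch (my own illustration, not from the lecture notes) of the Flajolet-Martin idea behind LogLog/HyperLogLog: hash each stream item and track the maximum number of leading zeros, which concentrates around log2 of the number of distinct items. The real algorithms average many such estimates over hash buckets to control variance; all names here are illustrative:

```python
import hashlib

def distinct_elements_estimate(stream, bits=32):
    """Single-estimator Flajolet-Martin sketch for counting distinct items.

    Each distinct item hashes to >= j leading zeros with probability
    2^-j, so with d distinct items the max leading-zero count z is
    typically near log2(d), making 2^z a (high-variance) estimate.
    The only state kept is the counter z, i.e. O(loglog(n)) bits.
    """
    z = 0
    for item in stream:
        # Hash to a `bits`-bit integer, then count leading zeros.
        h = int.from_bytes(hashlib.sha1(str(item).encode()).digest()[:4], "big")
        bit_string = format(h, f"0{bits}b")
        leading_zeros = len(bit_string) - len(bit_string.lstrip("0"))
        z = max(z, leading_zeros)
    return 2 ** z

# Stream with 10,000 distinct items, each repeated 5 times: repeats
# don't change the estimate since hashing is deterministic.
stream = [i % 10_000 for i in range(50_000)]
print(distinct_elements_estimate(stream))  # a power of two, roughly near 10,000
```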
3. 9/21 High-dimensional geometry and the Johnson-Lindenstrauss lemma
4. 9/28 Locality sensitive hash functions, applications to near neighbor search
Optimization
5. 10/5 The role of convexity, gradient descent and projected gradient descent
10/12 NO CLASS -- MONDAY SCHEDULE
6. 10/19 Online and stochastic gradient descent, coordinate descent, preconditioning
7. 10/26 Midterm Exam (first half of class)

Constrained optimization, center of gravity method, linear programming
8. 11/2 Discrete optimization, LP relaxation, submodularity and greedy methods
Spectral Methods and Linear Algebra
9. 11/9 Singular value decomposition, Krylov methods, and a taste of approximation theory
10. 11/16 Spectral graph theory, spectral clustering, generative models for networks
11. 11/23 Randomized numerical linear algebra, sketching for linear regression, ε-nets
Fourier Methods
12. 11/30 The Fourier transform and applications, random Fourier features
13. 12/7 Sparse recovery and compressed sensing, restricted isometry property
14. 12/14 Vote on a topic related to my recent research.