NYU CS-GY 9223I
Algorithmic Machine Learning
and Data Science
Advanced topics course exploring contemporary algorithmic techniques and recent research on computational methods that enable machine learning and data science at scale.
Course Instructor: Professor Christopher Musco
Prerequisites: This course is mathematically challenging and is intended for graduate students or advanced undergraduates. I require a previous course in machine learning (for example, CS-UY 4563, CS-GY 6923, or ECE-GY 6143) and a previous course in algorithm design and analysis (for example, CS-UY 2413, CS-GY 6033, or CS-GY 6043). Experience with linear algebra and probability is also necessary. Email the course instructor if you have questions about your preparation for the course!
Coursework: One lecture per week. 4-5 problem sets involving mathematical analysis of algorithms and programming exercises to explore the lecture ideas (40% of grade). Midterm exam (20% of grade). Final exam (40% of grade). A formal syllabus with more detailed information will be posted in the fall.
Resources: There is no textbook to purchase. Course material will consist of my written lecture notes, as well as assorted online resources, including papers, notes from other courses, and publicly available surveys. Please refer to the course webpage before and after lectures to keep up-to-date as new resources are posted.
Lectures: Wednesdays 3:20-5:50pm in 2 Metrotech Center, Room 9.011.
Office hours: time and place TBA
Reading Group: time and place TBA
Homework: While not required, I encourage students to prepare problem sets in LaTeX or Markdown (with math support). You can use this template for LaTeX. While there is a learning curve for LaTeX (less so for Markdown), it typically saves students lots of time by the end of the semester! For regular homework problems collaboration is allowed, but solutions and any code must be written independently. Students must list collaborators on their problem sets (for each problem separately).
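For students new to LaTeX, a minimal problem-set document might look like the sketch below. This is only an illustrative example (not the linked course template); the course number, problem heading, and collaborator line are placeholders you would adapt.

```latex
\documentclass[11pt]{article}
\usepackage{amsmath, amssymb, amsthm} % standard math packages
\usepackage[margin=1in]{geometry}

\title{CS-GY 9223I, Problem Set 1} % placeholder title
\author{Your Name}
\date{\today}

\begin{document}
\maketitle

\section*{Problem 1}
\emph{Collaborators: none.} % list collaborators for each problem separately

% Example of typeset math: Markov's inequality, for a
% non-negative random variable $X$ and any $t > 0$:
\[
  \Pr[X \ge t] \le \frac{\mathbb{E}[X]}{t}.
\]

\end{document}
```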
Optional Reading Group: It's an exciting time for research at the intersection of algorithm design and the data sciences. Many of the ideas covered in this course are still the subject of active research. For each lecture, I will choose a relevant, recent research paper. Interested students will have the option of reading and discussing the paper each week for 1-2 hours in a collaborative and relaxed environment.
| Lecture # | Topic | Required Reading | Optional Reading |
|-----------|-------|------------------|------------------|
| **The Power of Randomness** | | | |
| 1. 9/4 | Concentration of random variables + applications to hashing and data streams | | |
| 2. 9/11 | The Johnson-Lindenstrauss lemma + applications to high dimensional data | | |
| 3. 9/18 | Nearest neighbor search + locality sensitive hashing | | |
| 4. 9/25 | Convexity in machine learning + vanilla, stochastic, and online gradient descent | | |
| 5. 10/2 | Conditioning, acceleration, coordinate descent, quasi-Newton methods | | |
| 6. 10/9 | Beyond convexity: training neural networks and other non-convex models | | |
| 7. 10/16 | Learning from experts + multiplicative weights | | |
| 8. 10/23 | Constrained optimization, linear + semidefinite programming, interior point methods | | |
| **Spectral Methods and Linear Algebra** | | | |
| 9. 10/30 | Singular value decomposition, Krylov methods + application to spectral clustering | | |
| 10. 11/6 | Randomized linear algebra, sketching for linear regression, ε-net arguments | | |
| 11. 11/13 | Importance sampling, leverage scores, active learning | | |
| 12. 11/20 | The Fourier transform, DFT, FFT + applications | | |
| 11/27 | No class (Thanksgiving) | | |
| 13. 12/4 | Compressed sensing, the restricted isometry property, basis pursuit | | |
| 14. 12/11 | Kernel methods in machine learning, random Fourier features | | |