Christopher Musco

370 Jay St., Brooklyn, NY • Office 1105 • Zoom 'Office' • cmusco [at] nyu.edu • CV • GitHub • Google Scholar

I am an Assistant Professor of Computer Science and Engineering at NYU's Tandon School of Engineering. My students and I are part of the Algorithms and Foundations Group.

My research is on the computational foundations of machine learning and data science. I study efficient methods for processing and understanding large datasets. We often combine ideas from theoretical computer science with tools from computational and applied mathematics.

I was previously a Research Instructor at Princeton. I completed my PhD in computer science at MIT, where I was fortunate to be advised by Jonathan Kelner. Before MIT, I was an engineer at Redfin, and before that I studied Applied Math and Computer Science at Yale.

News, Events, Travel


Congratulations to Dr. Aline Bessa on being awarded a CRA/CCC Computing Innovation Fellowship! I look forward to collaborating with her as a postdoc over the next few years.

We hosted a minisymposium on "Randomized Functional Analysis" at SIAM's 2020 Mathematics of Data Science conference. Links to recorded talks can be found here. I also gave an introductory talk on the topic at DIMACS, which is available on YouTube.

Teaching


Fall 2020, Fall 2019: NYU CS-GY 9223D, Algorithmic Machine Learning and Data Science

Spring 2020: NYU CS-UY 4563, Introduction to Machine Learning

Fall 2018: Princeton COS 521, Advanced Algorithm Design

Spring 2016: MIT 6.854/18.415, Advanced Algorithms (teaching assistant)

Spring 2016: MIT 6.S977, Technical Communication for Graduate Students (workshop leader)

Other: I am a judge and problem writer for the SIAM/MathWorks M3 Challenge, a math modeling competition for high school students that I participated in myself. I am happy to chat with students, coaches, or other academics looking to get involved and inspire the next generation of applied mathematicians.

Students


Apoorv Singh, Ph.D., 2020 - present.
Raphael A. Meyer, Ph.D., 2019 - present.
Prathamesh Dharangutte, M.S., 2019 - present.

Research


Graph Learning for Inverse Landscape Genetics
Prathamesh Dharangutte, Christopher Musco
AAAI Conference on Artificial Intelligence (AAAI 2021).
Short version in NeurIPS 2020 AI for Earth Sciences workshop.

Simple Heuristics Yield Provable Algorithms for Masked Low Rank Approximation
Cameron Musco, Christopher Musco, David P. Woodruff
Innovations in Theoretical Computer Science (ITCS 2021).

Hutch++: Optimal Stochastic Trace Estimation
Raphael A. Meyer, Cameron Musco, Christopher Musco, David P. Woodruff
SIAM Symposium on Simplicity in Algorithms (SOSA 2021).
Slides: 50 Minutes.
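For readers curious about the algorithm, here is a minimal NumPy sketch of the Hutch++ idea (my own illustrative rewrite, not the paper's reference implementation): deflate a randomized low-rank sketch of A, whose trace can be computed exactly, then apply Hutchinson's classical estimator to the remainder.

```python
import numpy as np

def hutchpp(matvec, n, m, rng=None):
    """Estimate trace(A) for an implicit n x n matrix A using roughly m
    matrix-vector products. matvec(X) must return A @ X for an n x k block X."""
    rng = np.random.default_rng(rng)
    k = m // 3
    # Orthonormal basis Q for the range of a random sketch A @ S.
    S = rng.choice([-1.0, 1.0], size=(n, k))
    Q, _ = np.linalg.qr(matvec(S))
    # Exact trace of the projected part, trace(Q^T A Q).
    trace_low_rank = np.trace(Q.T @ matvec(Q))
    # Hutchinson's estimator on the deflated remainder (I - QQ^T) A (I - QQ^T).
    G = rng.choice([-1.0, 1.0], size=(n, k))
    G = G - Q @ (Q.T @ G)
    trace_residual = np.trace(G.T @ matvec(G)) / k
    return trace_low_rank + trace_residual
```

Because the projection splits trace(A) exactly into the projected and deflated parts, the estimator is unbiased; the deflation step is what reduces the variance of the Hutchinson term.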

The Statistical Cost of Robust Kernel Hyperparameter Tuning
Raphael A. Meyer, Christopher Musco
Conference on Neural Information Processing Systems (NeurIPS 2020).

Fourier Sparse Leverage Scores and Approximate Kernel Learning
Tamás Erdélyi, Cameron Musco, Christopher Musco
Conference on Neural Information Processing Systems (NeurIPS 2020).
Invited for spotlight presentation.

Near Optimal Linear Algebra in the Online and Sliding Window Models
Vladimir Braverman, Petros Drineas, Cameron Musco, Christopher Musco, Jalaj Upadhyay, David P. Woodruff, Samson Zhou
IEEE Symposium on Foundations of Computer Science (FOCS 2020).

Finding the Mode of a Kernel Density Estimate
Jasper C.H. Lee, Jerry Li, Christopher Musco, Jeff M. Phillips, Wai Ming Tai
Preprint.

Low-Rank Toeplitz Matrix Estimation via Random Ultra-Sparse Rulers
Hannah Lawrence, Jerry Li, Cameron Musco, Christopher Musco
International Conference on Acoustics, Speech, and Signal Processing (ICASSP 2020).

Sample Efficient Toeplitz Covariance Estimation
Yonina C. Eldar, Jerry Li, Cameron Musco, Christopher Musco
ACM-SIAM Symposium on Discrete Algorithms (SODA 2020).
Slides: 50 Minutes.

Fast and Space Efficient Spectral Sparsification in Dynamic Streams
Michael Kapralov, Aida Mousavifar, Cameron Musco, Christopher Musco, Navid Nouri, Aaron Sidford, Jakab Tardos
ACM-SIAM Symposium on Discrete Algorithms (SODA 2020).

Analyzing the Impact of Filter Bubbles on Social Network Polarization
Uthsav Chitra, Christopher Musco
ACM International Web Search and Data Mining Conference (WSDM 2020).
Preliminary version in KDD 2019 workshop.

A Universal Sampling Method for Reconstructing Signals with Simple Fourier Transforms
Haim Avron, Michael Kapralov, Cameron Musco, Christopher Musco, Ameya Velingker, Amir Zandieh
ACM Symposium on Theory of Computing (STOC 2019).
Slides: 60 Minutes. Video: 30 Minutes.

Learning Networks from Random Walk-Based Node Similarities
Jeremy G. Hoskins, Cameron Musco, Christopher Musco, Charalampos E. Tsourakakis
Conference on Neural Information Processing Systems (NeurIPS 2018).
MATLAB Code.

Eigenvector Computation and Community Detection in Asynchronous Gossip Models
Frederik Mallmann-Trenn, Cameron Musco, Christopher Musco
International Colloquium on Automata, Languages, and Programming (ICALP 2018).

Minimizing Polarization and Disagreement in Social Networks
Cameron Musco, Christopher Musco, Charalampos E. Tsourakakis
The Web Conference (WWW 2018).
MATLAB Code, requires CVX library for convex optimization.

Stability of the Lanczos Method for Matrix Function Approximation
Cameron Musco, Christopher Musco, Aaron Sidford
ACM-SIAM Symposium on Discrete Algorithms (SODA 2018).
Slides: 60 Minutes.
Sample code for the version of Lanczos analyzed: lanczos.m

Recursive Sampling for the Nyström Method
Cameron Musco, Christopher Musco
Conference on Neural Information Processing Systems (NeurIPS 2017).
Poster.
MATLAB Code, including accelerated version of the algorithm.

Random Fourier Features for Kernel Ridge Regression: Approximation Bounds and Statistical Guarantees
Haim Avron, Michael Kapralov, Cameron Musco, Christopher Musco, Ameya Velingker, Amir Zandieh
International Conference on Machine Learning (ICML 2017).
Slides: 60 Minutes. Video: 80 Minutes.

Input Sparsity Time Low-Rank Approximation via Ridge Leverage Score Sampling
Michael B. Cohen, Cameron Musco, Christopher Musco
ACM-SIAM Symposium on Discrete Algorithms (SODA 2017).
Slides: 60 Minutes.

Determining Tournament Payout Structures for Daily Fantasy Sports
Christopher Musco, Maxim Sviridenko, Justin Thaler
SIAM Algorithm Engineering and Experiments (ALENEX 2017).
Invited to special issue of ACM Journal of Experimental Algorithmics for ALENEX.
Slides: 20 Minutes.

Principal Component Projection Without Principal Component Analysis
Roy Frostig, Cameron Musco, Christopher Musco, Aaron Sidford
International Conference on Machine Learning (ICML 2016).
Slides: 15 Minutes.
MATLAB Code for the standard algorithm and the faster Krylov method analyzed in our recent paper.

Randomized Block Krylov Methods for Stronger and Faster Approximate Singular Value Decomposition
Cameron Musco, Christopher Musco
Conference on Neural Information Processing Systems (NeurIPS 2015).
Invited for full oral presentation.
Slides: 20 Minutes.
MATLAB Code.
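As an illustration of the approach, here is a simplified NumPy sketch of a randomized block Krylov SVD (illustrative only, not the MATLAB code above, and it omits the reorthogonalization safeguards a careful implementation needs): build a Krylov subspace from a random starting block, then solve a small SVD in that subspace.

```python
import numpy as np

def block_krylov_svd(A, k, q, rng=None):
    """Approximate rank-k SVD of A via a randomized block Krylov subspace
    spanned by [A S, (A A^T) A S, ..., (A A^T)^q A S] for a random block S."""
    rng = np.random.default_rng(rng)
    n, d = A.shape
    X = A @ rng.standard_normal((d, k))
    blocks = [X]
    for _ in range(q):
        X = A @ (A.T @ X)  # one application of (A A^T) to the current block
        blocks.append(X)
    # Orthonormal basis for the Krylov subspace.
    Q, _ = np.linalg.qr(np.hstack(blocks))
    # Exact SVD of the small projected matrix Q^T A, lifted back up by Q.
    Uh, s, Vt = np.linalg.svd(Q.T @ A, full_matrices=False)
    return Q @ Uh[:, :k], s[:k], Vt[:k]
```

Compared to plain randomized power iteration, keeping every intermediate block in the subspace is what yields the stronger convergence guarantees studied in the paper.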

Dimensionality Reduction for k-Means Clustering and Low-Rank Approximation
Michael B. Cohen, Samuel Elder, Cameron Musco, Christopher Musco, Mădălina Persu
ACM Symposium on Theory of Computing (STOC 2015).
Slides: 60 Minutes.

We posted a note titled Projection-Cost-Preserving Sketches: Proof Strategies and Constructions, which presents simplified proofs of the results in this paper and our 2017 work.

Principled Sampling for Anomaly Detection
Brendan Juba, Fan Long, Christopher Musco, Stelios Sidiroglou-Douskos, Martin Rinard
Network and Distributed System Security Symposium (NDSS 2015).
Slides: 20 Minutes.

Uniform Sampling for Matrix Approximation
Michael B. Cohen, Yin Tat Lee, Cameron Musco, Christopher Musco, Richard Peng, Aaron Sidford
Innovations in Theoretical Computer Science (ITCS 2015).

Single Pass Spectral Sparsification in Dynamic Streams
Michael Kapralov, Yin Tat Lee, Cameron Musco, Christopher Musco, Aaron Sidford
IEEE Symposium on Foundations of Computer Science (FOCS 2014).
Invited to special issue of SIAM Journal on Computing for FOCS, appeared 2017.
Slides: 60 Minutes.

Working with Me


If you are interested in working with me on research, I would encourage you to apply to NYU Tandon's PhD program in Computer Science. I will be taking new students in 2021 (December 2020 application). You should have a strong mathematical background, particularly in probability theory and linear algebra.

If you are interested in algorithms, especially for data applications, New York City is a unique environment for pursuing a PhD. Beyond NYU's broad investment in data science, the city offers unmatched access to top academic institutions, industry research labs, and startups.

I am also happy to advise master's or undergraduate research at Princeton or NYU. Reach out to learn about existing projects, or let me know what topics you are interested in exploring.

Software


Prototype code for some of the algorithms I work on is available through my GitHub page. I am always happy to answer questions about these implementations or to put you in touch with others who have implemented the methods in different languages and environments.