I am a fifth-year PhD student in computer science at MIT, graduating in spring 2018.
I am fortunate to be advised by Jonathan Kelner. My research is in algorithmic data science and machine learning. I study computational methods for processing large graphs and matrices.
I draw on tools from dimensionality reduction, randomized numerical linear algebra, and optimization, with a focus on methods that adapt to modern computational environments, including streaming and distributed frameworks.
Our paper on Recursive Sampling for the Nyström Method will appear at NIPS 2017, and code for the algorithm is available on GitHub. Also check out our upcoming SODA 2018 paper on the Lanczos method: it pins down the stability of Lanczos for applying matrix polynomials, proving the correctness of our method for Principal Component Regression, which combines Lanczos with any iterative or stochastic linear system solver.
Stability of the Lanczos Method for Matrix Function Approximation. Cameron Musco, Christopher Musco, Aaron Sidford. ACM-SIAM Symposium on Discrete Algorithms (SODA 2018). Sample code for the version of Lanczos analyzed: lanczos.m.
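For readers curious what the analyzed method computes, here is a rough NumPy sketch of the Lanczos iteration for approximating f(A)b with a symmetric matrix A. This is my own illustrative code, not the paper's lanczos.m: the function name is made up, and full reorthogonalization is used for simplicity (the paper's point is precisely that such safeguards are not needed for the variant it analyzes).

```python
import numpy as np

def lanczos_fA_b(A, b, f, k):
    """Approximate f(A) @ b for symmetric A using k Lanczos steps.

    Builds an orthonormal Krylov basis Q and tridiagonal matrix T, then
    returns ||b|| * Q @ f(T) @ e1, computing f(T) by eigendecomposition.
    Illustrative sketch only; uses full reorthogonalization.
    """
    n = b.shape[0]
    Q = np.zeros((n, k))
    alpha = np.zeros(k)      # diagonal of T
    beta = np.zeros(k - 1)   # off-diagonal of T
    Q[:, 0] = b / np.linalg.norm(b)
    for j in range(k):
        w = A @ Q[:, j]
        alpha[j] = Q[:, j] @ w
        # project out all previous basis vectors (full reorthogonalization)
        w = w - Q[:, :j + 1] @ (Q[:, :j + 1].T @ w)
        if j < k - 1:
            beta[j] = np.linalg.norm(w)
            Q[:, j + 1] = w / beta[j]
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    evals, evecs = np.linalg.eigh(T)
    fT_e1 = evecs @ (f(evals) * evecs[0, :])  # f(T) @ e1
    return np.linalg.norm(b) * (Q @ fT_e1)
```

With k equal to the dimension of A, the Krylov basis is complete and the approximation becomes exact; in practice one takes k much smaller and relies on polynomial approximation of f on A's spectrum.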
Recursive Sampling for the Nyström Method. Cameron Musco, Christopher Musco. Conference on Neural Information Processing Systems (NIPS 2017). Poster. MATLAB Code, including an accelerated version of the algorithm.
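As background for this paper, the basic Nyström reconstruction can be sketched in a few lines of NumPy. The paper's contribution is how the landmark set is chosen (by recursive ridge leverage score sampling); the sketch below takes an arbitrary landmark index set as given, and the function name is mine, not from the released MATLAB code.

```python
import numpy as np

def nystrom(K, idx):
    """Basic Nystrom approximation of a PSD kernel matrix K.

    Given landmark indices idx, forms C = K[:, idx] and W = K[idx, idx]
    and returns the rank-|idx| approximation C @ pinv(W) @ C.T.
    Landmark selection (the hard part) is left to the caller.
    """
    C = K[:, idx]
    W = K[np.ix_(idx, idx)]
    return C @ np.linalg.pinv(W) @ C.T
```

If every column is taken as a landmark the reconstruction is exact; the interesting regime is a small landmark set whose quality depends entirely on the sampling scheme.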
Random Fourier Features for Kernel Ridge Regression: Approximation Bounds and Statistical Guarantees. Haim Avron, Michael Kapralov, Cameron Musco, Christopher Musco, Ameya Velingker, Amir Zandieh. International Conference on Machine Learning (ICML 2017). Slides: 60 Minutes.
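For context, the classical random Fourier features map that this paper analyzes can be sketched as follows for the Gaussian kernel. This is a generic textbook construction (Rahimi–Recht style), not code from the paper, and the function name and parameterization are my own.

```python
import numpy as np

def rff_features(X, D, gamma, rng):
    """Map rows of X (n x d) to D random Fourier features so that
    Z @ Z.T approximates the Gaussian kernel matrix
    K[i, j] = exp(-gamma * ||X[i] - X[j]||^2).

    Sketch of the standard construction: for this kernel the frequency
    distribution is N(0, 2 * gamma * I).
    """
    n, d = X.shape
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, D))
    b = rng.uniform(0.0, 2 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)
```

Each kernel entry is an average of D independent bounded terms, so the entrywise error shrinks like 1/sqrt(D); the paper's question is how large D must be for downstream kernel ridge regression to retain its statistical guarantees.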
Input Sparsity Time Low-Rank Approximation via Ridge Leverage Score Sampling. Michael B. Cohen, Cameron Musco, Christopher Musco. ACM-SIAM Symposium on Discrete Algorithms (SODA 2017). Slides: 60 Minutes.
Determining Tournament Payout Structures for Daily Fantasy Sports. Christopher Musco, Maxim Sviridenko, Justin Thaler. SIAM Algorithm Engineering and Experiments (ALENEX 2017). Slides: 20 Minutes.
Principal Component Projection Without Principal Component Analysis. Roy Frostig, Cameron Musco, Christopher Musco, Aaron Sidford. International Conference on Machine Learning (ICML 2016). Slides: 15 Minutes. MATLAB Code for the standard algorithm and the faster Krylov method analyzed in our recent paper.
Randomized Block Krylov Methods for Stronger and Faster Approximate Singular Value Decomposition. Cameron Musco, Christopher Musco. Conference on Neural Information Processing Systems (NIPS 2015). Invited for full oral presentation (1 of 15 out of 403 accepted papers). Slides: 20 Minutes. MATLAB Code.
Dimensionality Reduction for k-Means Clustering and Low-Rank Approximation. Michael B. Cohen, Samuel Elder, Cameron Musco, Christopher Musco, Madalina Persu. ACM Symposium on Theory of Computing (STOC 2015). Slides: 20 Minutes, 60 Minutes.
Principled Sampling for Anomaly Detection. Brendan Juba, Fan Long, Christopher Musco, Stelios Sidiroglou-Douskos, Martin Rinard. Network and Distributed System Security Symposium (NDSS 2015). Slides: 20 Minutes.
Uniform Sampling for Matrix Approximation. Michael B. Cohen, Yin Tat Lee, Cameron Musco, Christopher Musco, Richard Peng, Aaron Sidford. Innovations in Theoretical Computer Science (ITCS 2015).
Single Pass Spectral Sparsification in Dynamic Streams. Michael Kapralov, Yin Tat Lee, Cameron Musco, Christopher Musco, Aaron Sidford. IEEE Symposium on Foundations of Computer Science (FOCS 2014). Invited to the SIAM Journal on Computing Special Issue for FOCS (appeared 2017). Slides: 20 Minutes, 60 Minutes.
Prototype code for some of the algorithms I work on is freely available through my GitHub page. I am always happy to answer questions about these implementations or to put you in touch with others who have implemented the methods in different languages and environments.
If you're an MIT affiliate running into algorithmic problems in your research, check out our "Algorithms Office Hours". You can submit a description of your project and we'll set you up to meet with algorithms researchers in CSAIL who may be able to offer some guidance.
I'm an advisor with the EECS Graduate Communication Lab. We offer peer review and coaching for research papers, fellowship and job applications, CVs, talks, and any other writing or presentation task. MIT affiliates can schedule an appointment with me or any other advisor here.
My Master's thesis was titled Dimensionality Reduction for Sparse and Structured Matrices.