Qing Qu is an assistant professor in the ECE Division of the Electrical Engineering and Computer Science department of the College of Engineering, University of Michigan – Ann Arbor. He is also affiliated with the Michigan Center for Applied and Interdisciplinary Mathematics (MCAIM).
He received his B.E. degree from Tsinghua University, Beijing, China, in 2011, and obtained his Ph.D. degree from Columbia University, advised by Prof. John Wright, in 2018. He was a Moore–Sloan fellow at the NYU Center for Data Science from 2018 to 2020. He is the recipient of the Best Student Paper Award at SPARS’15 (with Ju Sun and John Wright) and of the 2016 Microsoft Ph.D. Fellowship in machine learning.
His research interests lie at the intersection of signal processing, data science, machine learning, and numerical optimization. He is particularly interested in computational methods for learning low-complexity models from high-dimensional data, leveraging tools from machine learning, numerical optimization, and high-dimensional geometry, with applications in imaging sciences, scientific discovery, and healthcare. Recently, he has also become interested in understanding deep networks through the lens of low-dimensional modeling. You can learn more about his research here.
Zhihui Zhu (equal), Tianyu Ding (equal), Jinxin Zhou, Xiao Li, Chong You, Jeremias Sulam, Qing Qu (2021). A Geometric Analysis of Neural Collapse with Unconstrained Features. In Submission.
Preprint – PDF – BibTex
Qing Qu, Yuexiang Zhai, Xiao Li, Yuqian Zhang, Zhihui Zhu (2020). Analysis of the Optimization Landscapes for Overcomplete Representation Learning. ICLR’20, oral, top 1.9%.
Preprint – PDF – Slides – BibTex
Qing Qu, Xiao Li, Zhihui Zhu (2020). Exact Recovery of Multichannel Sparse Blind Deconvolution via Gradient Descent. SIAM Journal on Imaging Sciences, 13(3): 1630–1652, 2020. (NeurIPS’19, spotlight, top 3%).
Preprint – PDF – Code – Poster – Slides – BibTex
Yuqian Zhang, Qing Qu, John Wright (2020). From Symmetry to Geometry: Tractable Nonconvex Problems. In Submission to SIAM Review.
Preprint – PDF – Slides – BibTex
- New paper released on global geometric analysis for neural collapse (May 2021)
- New paper released on convolutional normalization method for ConvNets (Mar. 2021)
- Online talk at the “运筹千里” (“Operations Across a Thousand Miles”) forum organized by the Operations Research Society of China (Dec. 2nd, 2020) [Link]
- Co-organizing a workshop on “Seeking Low-dimensionality in Deep Neural Networks (SLowDNN)”, Nov. 23rd – Nov. 24th. (Nov. 2020)
- Elected as a member of the Computational Imaging (CI) TC for 2021-2023. (Nov. 2020)
- Our paper has been accepted at NeurIPS’20 as spotlight (top 4%), which characterizes implicit bias with discrepant learning rates and builds connections between over-parameterization, RPCA, and deep neural networks (Sept. 2020)
- Recognized as one of the best reviewers (top 33%) at ICML’20 (Sept. 2020)
- Our grant proposal (joint with PI Carlos Fernandez-Granda) ‘‘Mathematical Analysis of Super-resolution via Nonconvex Optimization and Machine Learning’’ has been awarded by NSF DMS (Aug. 2020)
- Two review papers on nonconvex optimization are released: Paper 1, Paper 2 (Jul. 2020)
- Gave a talk at TBSI-WOLT about my recent work on nonconvex optimization for learning low-complexity models: slides (Jun. 2020)
- Two papers (Paper 1, Paper 2) have been accepted at ICLR’20, with one oral presentation (top 1.85%) (Dec. 2019)
- One paper is released, which provides the first explicit convergence rate guarantees for a family of Riemannian subgradient methods for optimizing nonconvex nonsmooth functions over the Stiefel manifold. (Nov. 2019)
- One paper has been accepted at NeurIPS’19 as spotlight (top 3%) (Sept. 2019)
- Recognized as one of the best reviewers at NeurIPS’19, and invited as a mentor to the first New in ML workshop at NeurIPS’19. (Sept. 2019)