Qing Qu
1301 Beal Avenue, Ann Arbor, MI 48109-2122

Qing Qu is an assistant professor in the ECE Division of the Electrical Engineering and Computer Science department of the College of Engineering, University of Michigan – Ann Arbor. He is also affiliated with the Michigan Center for Applied and Interdisciplinary Mathematics (MCAIM). 

He received his B.E. degree from Tsinghua University, Beijing, China, in 2011, and obtained his Ph.D. degree from Columbia University, advised by Prof. John Wright, in 2018. He was a Moore-Sloan Fellow at the NYU Center for Data Science from 2018 to 2020. He is the recipient of the Best Student Paper Award at SPARS’15 (with Ju Sun and John Wright) and of the 2016 Microsoft Ph.D. Fellowship in machine learning.

His research interests lie at the intersection of signal processing, data science, machine learning, and numerical optimization. He is particularly interested in computational methods for learning low-complexity models from high-dimensional data, leveraging tools from machine learning, numerical optimization, and high-dimensional geometry, with applications in imaging sciences, scientific discovery, and healthcare. Recently, he has also become interested in understanding deep networks through the lens of low-dimensional modeling. You can learn more about his research here.

Here is his Google Scholar Profile. Find him on LinkedIn and follow him on Twitter.

Selected Publications

Zhihui Zhu (equal contribution), Tianyu Ding (equal contribution), Jinxin Zhou, Xiao Li, Chong You, Jeremias Sulam, Qing Qu (2021). A Geometric Analysis of Neural Collapse with Unconstrained Features. (NeurIPS’21, spotlight, top 3%)
Preprint – PDF – Slides – BibTex

Qing Qu, Yuexiang Zhai, Xiao Li, Yuqian Zhang, Zhihui Zhu (2020). Analysis of the Optimization Landscapes for Overcomplete Representation Learning. (ICLR’20, oral, top 1.9%)
Preprint – PDF – Slides – BibTex

Qing Qu, Xiao Li, Zhihui Zhu (2019). Exact Recovery of Multichannel Sparse Blind Deconvolution via Gradient Descent. SIAM Journal on Imaging Sciences, 13(3): 1630–1652, 2020. (NeurIPS’19, spotlight, top 3%)
Preprint – PDF – Code – Poster – Slides – BibTex

Yuqian Zhang, Qing Qu, John Wright (2020). From Symmetry to Geometry: Tractable Nonconvex Problems. In submission to SIAM Review.
Preprint – PDF – Slides – BibTex

Ju Sun, Qing Qu, John Wright (2018). A Geometric Analysis of Phase Retrieval. Foundations of Computational Mathematics, 18(5): 1131–1198, 2018. (ISIT’16)
Preprint – PDF – Code – Slides – BibTex

News

  • Three papers have been accepted at NeurIPS’21: paper1, paper2, paper3 (Sept. 2021)
  • New paper released on rank-overspecified robust matrix recovery via a subgradient method. (Sept. 2021)
  • Gave a talk at the SPS Webinar Series [slides] [video], titled “From Shallow to Deep Representation Learning in Imaging and Beyond: Global Nonconvex Theory and Algorithms” (Sept. 2021)
  • Organizer of Workshop on Seeking Low-dimensionality in Deep Neural Networks (SLowDNN), November 22-23, 2021 (Stay tuned)
  • Gave a talk at the Workshop on Efficient Tensor Representations for Learning and Computational Complexity at IPAM, UCLA, titled “Landscape Analysis for Overcomplete Tensor and Neural Collapse” [slides] [video] (May 2021)
  • New paper released on global geometric analysis for neural collapse (May 2021)
  • New paper released on convolutional normalization method for ConvNets (Mar. 2021)
  • Online talk at the “运筹千里” forum organized by the Operations Research Society of China (Dec. 2nd, 2020) [Link]
  • Co-organizing a workshop on “Seeking Low-dimensionality in Deep Neural Networks (SLowDNN)“, Nov. 23rd – Nov. 24th. (Nov. 2020)
  • Elected as a member of the Computational Imaging (CI) TC for 2021-2023. (Nov. 2020)
  • Our paper has been accepted at NeurIPS’20 as spotlight (top 4%), which characterizes implicit bias with discrepant learning rates and builds connections between over-parameterization, RPCA, and deep neural networks (Sept. 2020)
  • Recognized as one of the best reviewers (top 33%) at ICML’20 (Sept. 2020)
  • Our grant proposal (joint with PI Carlos Fernandez-Granda) ‘‘Mathematical Analysis of Super-resolution via Nonconvex Optimization and Machine Learning’’ has been awarded by NSF DMS (Aug. 2020)
  • Two review papers on nonconvex optimization are released: Paper 1, Paper 2 (Jul. 2020)
  • Gave a talk at TBSI-WOLT about my recent work on nonconvex optimization for learning low-complexity models: slides (Jun. 2020)
  • Two papers (Paper 1, Paper 2) have been accepted at ICLR’20, with one oral presentation (top 1.85%) (Dec. 2019)
  • New paper released that provides the first explicit convergence-rate guarantees for a family of Riemannian subgradient methods for optimizing nonconvex nonsmooth functions over the Stiefel manifold. (Nov. 2019)
  • One paper has been accepted at NeurIPS’19 as spotlight (top 3%) (Sept. 2019)
  • Recognized as one of the best reviewers at NeurIPS’19, and invited as a mentor to the first New in ML workshop at NeurIPS’19. (Sept. 2019)