I am Qing Qu, an assistant professor in the ECE Division of the Electrical Engineering and Computer Science Department, College of Engineering, University of Michigan – Ann Arbor. I am also affiliated with the Michigan Institute for Data Science (MIDAS), the Michigan Center for Applied and Interdisciplinary Mathematics (MCAIM), and the Michigan Institute for Computational Discovery and Engineering (MICDE).
I received my B.E. degree from Tsinghua University, Beijing, China, in 2011, and my Ph.D. degree from Columbia University in 2018, advised by Prof. John Wright. From 2018 to 2020, I was a Moore-Sloan Fellow at the NYU Center for Data Science. My work has been recognized by several awards, including a Microsoft Ph.D. Fellowship in machine learning, an NSF CAREER Award in 2022, and an Amazon AWS AI Award in 2023.
My research interests lie at the intersection of signal processing, data science, machine learning, and numerical optimization. Broadly, I develop computational methods for learning low-complexity models from high-dimensional data, drawing on tools from numerical optimization and high-dimensional geometry, with applications in imaging sciences and scientific discovery. Recently, my main focus has been on understanding deep representation learning and diffusion models through the lens of low-dimensional modeling.
Here is my Google Scholar Profile. Find me on LinkedIn and Twitter. In my spare time, I am an avid runner.
For prospective students: My group recruits 1–2 Ph.D. students per year, and I am looking for self-motivated students with aligned research interests. To help identify mutual interests, please fill out this form and send me a short email with your resume.
News
Recent Highlights:
- Tutorials on nonconvex optimization for deep representation learning (from ACDL’23):
- A new short course on Learning Nonlinear and Deep Low-Dimensional Representations from High-Dimensional Data.
- Program chair of our new Conference on Parsimony and Learning (CPAL).
- I am organizing the ECE Communications and Signal Processing Seminar at UMich every Thursday (Videos). Send me an email if you are interested in giving a talk in person.
Upcoming Events:
- Giving a tutorial on “Understanding Deep Representation Learning via Neural Collapse” at ICASSP’24 with Prof. Laura Balzano and Prof. Zhihui Zhu (Apr. 2024)
- Co-organizer of the Midwest Machine Learning Symposium 2024 (MMLS’24) (May 2024)
- Giving a one-day tutorial on “Learning Deep Low-dimensional Models from High-Dimensional Data: From Theory to Practice” at CVPR 2024. (Jun. 2024)
- Co-organizer of Computational Imaging Workshop at IMSI, UChicago (Aug. 2024)
- Teaching a high-school summer camp “AI Magic” (Aug. 2024)
Recent News:
- Paper Release: Decoupled Data Consistency with Diffusion Purification for Image Restoration has been submitted (Mar. 2024)
- Invited Talk at UC Davis Math Seminar (Mar. 2024)
- Contributed Talk at Information Theory and Applications (ITA) Workshop (Feb. 2024)
- Paper Acceptance: our paper Improving Efficiency of Diffusion Models via Multi-Stage Framework and Tailored Multi-Decoder Architectures has been accepted by CVPR (Feb. 2024)
- Paper Release: Analysis of Deep Image Prior and Exploiting Self-Guidance for Image Reconstruction has been submitted (Feb. 2024)
- Grant Approval: Received a gift grant from KLA for continued support. (Jan. 2024)
- Paper Acceptance: our paper Efficient Compression of Overparameterized Deep Models through Low-Dimensional Learning Dynamics has been accepted by AISTATS’24 (Jan. 2024)
- Paper Acceptance: our paper Solving Inverse Problems with Latent Diffusion Models via Hard Data Consistency has been accepted by ICLR’24 as a spotlight (Jan. 2024)
- Conference on Parsimony and Learning (CPAL) (Jan. 3rd–6th, 2024)
- Invited talk at IMS Young Mathematical Scientist Forum – Applied Math, National University of Singapore (NUS) (Jan. 8th – 11th, 2024)
Archived:
Recent Talks:
SLowDNN talk: Understanding Deep Neural Networks via Neural Collapse
CSP Seminar Talk: Invariant Low-Dimensional Subspaces in Gradient Descent for Learning Deep Networks
Funding Acknowledgement
Our group acknowledges generous support from the National Science Foundation (NSF), the Office of Naval Research (ONR), Amazon Research, KLA Corporation, MICDE, and MIDAS.