News
Qing Qu Receives 2025 Google Research Scholar Award for Developing Efficient Generative AI Models
Prof. Qu’s research aims to reduce the cost and complexity of fine-tuning and training large-scale foundation models, while advancing the accessibility and sustainability of high-performance generative AI.

AI symposium: Michigan Engineering speakers share how they use AI in research
Prof. Qing Qu was an invited speaker at the Michigan Institute for Data & AI in Society (MIDAS) symposium, held March 18–19, 2025. He shared methods to prevent generative AI models from creating images with nudity and the faces of real people, which could be combined into harmful deepfakes.

Siyi Chen awarded Predoctoral Fellowship to support research on Controllable and Interpretable AI models
Chen, a Ph.D. student in Electrical and Computer Engineering, is working to improve the usability, safety, and reliability of AI systems in real-world applications.

Can Yaras awarded Predoctoral Fellowship to support research on deep learning and generative AI models
Yaras, a Ph.D. student in Electrical and Computer Engineering, is working to better understand the inner workings of deep learning and provide better access to foundation models in AI.

Qing Qu receives U-M Chinese Heritage & Scholarship Junior Faculty Award
Prof. Qu joins the inaugural cohort of awardees recognized as emerging leaders in teaching, research, and service at U-M.

Fifteen papers by ECE researchers to be presented at the Conference on Neural Information Processing Systems
Topics of accepted ECE NeurIPS papers include diffusion models, large language models, multi-armed bandit models, and more.

ECE faculty design chips for efficient and accessible AI
Faculty specializing in architecture, hardware, and software innovation accelerate machine learning across a range of applications.

Can Yaras recognized for his research aimed at efficient algorithms for LLMs
Doctoral student Can Yaras wants to reduce the carbon footprint of AI.

Fourteen papers by ECE researchers to be presented at the International Conference on Machine Learning
Accepted papers for the ICML conference span topics including deep representation learning, language model fine-tuning, generative modeling, and more.

GenAI diffusion models learn to generate new content more consistently than expected
Award-winning research led by Prof. Qing Qu uncovered an intriguing phenomenon: diffusion models consistently produce nearly identical content starting from the same noise input, regardless of model architecture or training procedure.

News 2023
Improving generative AI models for real-world medical imaging
Professors Liyue Shen, Qing Qu, and Jeff Fessler are working to develop efficient diffusion models for a variety of practical scientific and medical applications.

Neural Collapse research seeks to advance mathematical understanding of deep learning
Led by Prof. Qing Qu, the project could influence the application of deep learning in areas such as machine learning, optimization, signal and image processing, and computer vision.

News
Qing Qu receives Amazon Research Award
Qu’s research project in the area of machine learning algorithms and theory is called “Principles of deep representation learning via neural collapse.” Awardees, who represent 54 universities in 14 countries, have access to Amazon public datasets, along with AWS AI/ML services and tools.

Miniature and durable spectrometer for wearable applications
A team led by P.C. Ku and Qing Qu has developed a miniature, paper-thin spectrometer measuring 0.16 mm² that can also withstand harsh environments.

Teaching Machine Learning in ECE
With new courses at the undergraduate and graduate levels, ECE is delivering state-of-the-art instruction in machine learning for students in ECE and across the University.

Qing Qu receives CAREER award to explore the foundations of machine learning and data science
His research develops computational methods for learning succinct representations from high-dimensional data.

Qing Qu uses data and machine learning to optimize the world
I was recognized as one of the best reviewers at NeurIPS’19 and invited as a mentor to the first New in ML workshop at NeurIPS.
One paper has been accepted at NeurIPS’19 as a spotlight (top 3%)
Our paper, titled A Nonconvex Approach for Exact and Efficient Multichannel Sparse Blind Deconvolution, has been accepted at NeurIPS’19 as a spotlight (top 3%). This is joint work with Xiao Li and Zhihui Zhu.

Two papers have been accepted at ICLR’20, with one oral presentation (top 1.85%)
Our paper, titled Geometric Analysis of Nonconvex Optimization Landscapes for Overcomplete Learning, has been accepted at ICLR’20 as an oral presentation (top 1.85%). Our paper, titled Short-and-Sparse Deconvolution – A Geometric Approach, has been accepted at ICLR’20 as a poster (acceptance rate 26.5%).

A new review paper has been submitted to IEEE Signal Processing Magazine
The paper, titled Finding the Sparsest Vectors in a Subspace: Theory, Algorithms, and Applications, reviews recent work on nonconvex optimization methods for finding the sparsest vectors in linear subspaces. This is joint work with Zhihui Zhu, Xiao Li, Manolis Tsakiris, John Wright, and Rene Vidal.

Invited to be a speaker and organizer for Efficient Tensor Representations for Learning and Computational Complexity
The workshop will be held May 17–21, 2021 at the Institute for Pure and Applied Mathematics (IPAM) on the UCLA campus. This workshop is part of a semester-long program on Tensor Methods and Emerging Applications to the Physical and Data Sciences.

New paper submission
The paper, titled Robust Recovery via Implicit Bias of Discrepant Learning Rates for Double Over-parameterization, has been submitted. This is joint work with Chong You, Zhihui Zhu, and Yi Ma.

A review paper on nonconvex optimization is released
The paper, titled From Symmetry to Geometry: Tractable Nonconvex Problems, reviews recent advances in nonconvex optimization from the perspective of geometric landscape analysis. This is joint work with Yuqian Zhang and John Wright.