
EECS 559: Optimization Methods for SIPML, Winter 2023

Course Instructor: Prof. Qing Qu

Teaching Assistant: Xiang Li, email: forkobe@umich.edu

Title: Optimization Methods for Signal & Image Processing and Machine Learning (SIPML)

Course Time: Mon/Wed 10:30 AM – 12:00 PM (Hybrid), Beyster 1670

Office Hour: Wed 1:00 PM – 2:30 PM (In-Person/Remote)

Enrollment is based on the ECE override system, with priority given to SIPML students. A previous version of this course taught by Prof. Jeffrey Fessler can be found here.

Prerequisite: EECS 545, EECS 551, or EECS 505 (aka EECS 598 “Computational Data Science”) is essential

Overview: This graduate-level course introduces optimization methods suitable for the large-scale problems arising in data science and machine learning applications. We will explore several widely used optimization algorithms for solving convex/nonconvex and smooth/nonsmooth problems appearing in SIPML, and study the efficacy of these methods, which include (sub)gradient methods, proximal methods, Nesterov’s accelerated methods, ADMM, quasi-Newton, trust-region, and cubic regularization methods, and (some of) their stochastic variants. If time allows, we will also introduce constrained optimization over Riemannian manifolds. Along the way, we will show how these methods apply to concrete problems ranging from inverse problems in signal processing (e.g., sparse recovery, phase retrieval, blind deconvolution, matrix completion) and unsupervised learning (e.g., dictionary learning, independent component analysis, nonnegative matrix factorization) to supervised learning (e.g., deep learning).

The course will involve extensive practical algorithm development, implementation, and investigation using MATLAB, Python, or Julia. Designing methods that scale to large SIPML applications will be emphasized, and students will be expected to learn and apply efficient coding practices.
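To give a flavor of the programming involved, here is a minimal sketch (in Python/NumPy; not course-provided code) of gradient descent with backtracking line search applied to a least-squares problem, one of the first methods covered below. All function names and parameter values are illustrative choices, not course specifications.

```python
import numpy as np

def gradient_descent_ls(A, b, x0, max_iter=500, tol=1e-8):
    """Gradient descent with Armijo backtracking for 0.5 * ||Ax - b||^2."""
    f = lambda z: 0.5 * np.linalg.norm(A @ z - b) ** 2
    x = x0.copy()
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)          # gradient of the least-squares objective
        if np.linalg.norm(grad) < tol:    # stop at (near-)stationarity
            break
        t = 1.0                           # trial step size
        # Backtracking: halve t until the Armijo sufficient-decrease condition holds.
        while f(x - t * grad) > f(x) - 0.5 * t * np.linalg.norm(grad) ** 2:
            t *= 0.5
        x = x - t * grad
    return x

# Usage on a small random instance; the gradient norm at the result should be ~0.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)
x_hat = gradient_descent_ls(A, b, np.zeros(10))
print(np.linalg.norm(A.T @ (A @ x_hat - b)))
```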

Course Materials: Slides and videos will be accessible via Canvas (TBA). Below is a tentative list of the algorithms covered in the course:

  • 1st-order methods for smooth optimization: gradient descent, conjugate gradient, line-search method, momentum (Nesterov’s accelerated) method;
  • 1st-order methods for nonsmooth optimization: subgradient method, proximal methods and their accelerated variants;
  • Large-scale 1st-order optimization: ADMM, Frank-Wolfe method, and stochastic/incremental gradient methods;
  • 2nd-order methods: Newton and quasi-Newton methods, trust-region method, cubic regularization method, and curvilinear search method;
  • Riemannian optimization (if time allows): optimization over matrix manifolds such as the sphere, Stiefel manifold, Grassmannian manifold, etc.

Every optimization method introduced will be motivated by at least one SIPML application. Students will implement and test these methods on those applications.
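As one example of this pairing, the sketch below (Python/NumPy, illustrative only, not course code) applies the proximal gradient method (ISTA) to sparse recovery via the LASSO, min_x 0.5*||Ax - b||^2 + lam*||x||_1, combining a nonsmooth method from the list above with a signal-processing application from the overview. The problem sizes and the value of lam are arbitrary choices for demonstration.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista(A, b, lam, max_iter=1000):
    """Proximal gradient (ISTA) with constant step 1/L, where L = ||A||_2^2."""
    L = np.linalg.norm(A, 2) ** 2         # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)          # gradient of 0.5 * ||Ax - b||^2
        x = soft_threshold(x - grad / L, lam / L)  # gradient step, then prox step
    return x

# Usage: recover a 5-sparse vector from 80 random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((80, 200))
x_true = np.zeros(200)
x_true[rng.choice(200, size=5, replace=False)] = 1.0
b = A @ x_true
x_hat = ista(A, b, lam=0.1)
print(np.linalg.norm(x_hat - x_true))     # small when recovery succeeds
```

Swapping the soft-thresholding step for another proximal operator adapts the same template to the other nonsmooth regularizers covered in the course.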

Assessment: (i) homework assignments (roughly biweekly, 40%), (ii) final (take-home) exam (40%), (iii) course project (15%), (iv) course participation (5%)

Textbook: We will recommend several books and articles, although we will not follow them closely.

Related courses:

  • EECS 600 (Function Space Methods for Systems Theory) is much more theoretical than this course: it deals with infinite-dimensional spaces, whereas EECS 559 focuses entirely on finite-dimensional problems. EECS 600 is also far more proof-oriented, though some proofs will be presented and expected in EECS 559 as well.
  • IOE 410 (Advanced Optimization Methods) focuses on discrete methods and seems aimed at undergraduates.
  • IOE 511/Math 562 (Continuous Optimization Methods) has some overlap in terms of the optimization methods. IOE 511 uses Matlab; EECS 559 focuses on SIPML applications.
  • IOE 611/Math 663 (Nonlinear Programming) covers important convex optimization principles. It uses the CVX package in Matlab, which does not scale to large problems. EECS 559 emphasizes large-scale SIPML applications.
  • STAT 608 (Optimization Methods in Statistics) covers many of the same methods as EECS 559. 
  • EECS 556 (Image Processing) introduces some applications (e.g., image deblurring) that are used as examples in EECS 559. There is thus some overlap with EECS 556, as well as with the other courses listed above, but it is fine for students to take this course along with any or all of EECS 556, EECS 600, and IOE 611.

Course Syllabus:

| Week | Date | Topic | Contents | Homework, Project & Finals |
|------|------|-------|----------|----------------------------|
| 1-1  | 1/02 | No Class | | |
| 1-2  | 1/04 | Introduction (Remote) | Course logistics & overview | |
| 2-1  | 1/09 | Optimization Basics | Introduction to mathematical optimization | |
| 2-2  | 1/11 | Optimization Basics | Sample examples & applications, mathematical background | HW 1 Release |
| 3-1  | 1/16 | Martin Luther King Day, No Class | | |
| 3-2  | 1/18 | Convex Smooth | Gradient descent method, line search | |
| 4-1  | 1/23 | Convex Smooth | Gradient descent method, line search | |
| 4-2  | 1/25 | Convex Smooth | Nesterov’s acceleration, Newton’s method | HW 1 Due, HW 2 Release |
| 5-1  | 1/30 | Convex Nonsmooth | Stochastic gradient descent | |
| 5-2  | 2/01 | Convex Nonsmooth | Introduction to nonsmooth problems, subgradient methods | |
| 6-1  | 2/06 | Convex Nonsmooth | Subgradient methods II | |
| 6-2  | 2/08 | Convex Nonsmooth | Smoothing & Moreau envelope | HW 2 Due, HW 3 Release |
| 7-1  | 2/13 | Convex Nonsmooth | Proximal gradient method | |
| 7-2  | 2/15 | Convex Nonsmooth | Accelerated proximal gradient & homotopy continuation | |
| 8-1  | 2/20 | Convex Nonsmooth | Augmented Lagrangian method | |
| 8-2  | 2/22 | Convex Nonsmooth | Alternating Direction Method of Multipliers (ADMM) | HW 3 Due, HW 4 Release |
| 9-1  | 2/27 | Spring Break, No Class | | |
| 9-2  | 3/01 | Spring Break, No Class | | |
| 10-1 | 3/06 | Convex Optimization | Alternating Direction Method of Multipliers (ADMM) II | |
| 10-2 | 3/08 | Convex Optimization | Frank-Wolfe methods | Project Proposal Due, Mar. 10 |
| 11-1 | 3/13 | Nonconvex Optimization | Introduction to nonconvex problems I | |
| 11-2 | 3/15 | Nonconvex Optimization | Introduction to nonconvex problems II | HW 4 Due, HW 5 Release |
| 12-1 | 3/20 | Nonconvex Optimization | Nonconvex problems & trust-region method | |
| 12-2 | 3/22 | Nonconvex Optimization | Trust-region method I | |
| 13-1 | 3/27 | Nonconvex Optimization | Trust-region method II | |
| 13-2 | 3/29 | Nonconvex Optimization | Cubic regularization method | HW 5 Due, HW 6 Release |
| 14-1 | 4/03 | Riemannian Optimization | Riemannian optimization I | |
| 14-2 | 4/05 | Riemannian Optimization | Riemannian optimization II | |
| 15-1 | 4/10 | Riemannian Optimization | Riemannian optimization III | |
| 15-2 | 4/12 | Guest Lecture I | TBA | TBA |
| 16-1 | 4/17 | Guest Lecture II | TBA | TBA |
| 16-2 | 4/19 | Exam Preparation, No Class | | |