
EECS 559: Optimization Methods for SIPML, Winter 2022

Course Instructor: Prof. Qing Qu

Teaching Assistant: Haonan Zhu, email: haonan@umich.edu

Title: Optimization Methods for Signal & Image Processing and Machine Learning (SIPML)

Course Time: Mon/Wed 10:30 AM – 12:00 PM (In-Person/Remote), 4 credit hours

Office Hour: Wed 1:00 PM – 2:30 PM (In-Person/Remote)

Enrollment is based on the ECE override system, with priority given to SIPML students. A previous version of the course, taught by Prof. Jeffrey Fessler, can be found here.

Prerequisites: EECS 545; EECS 551 or EECS 505 (also known as EECS 598, “Computational Data Science”) is essential

Overview: This graduate-level course introduces optimization methods suitable for large-scale problems arising in data science and machine learning applications. We will explore several widely used optimization algorithms for solving convex/nonconvex and smooth/nonsmooth problems appearing in SIPML. We will study the efficacy of these methods, which include (sub)gradient methods, proximal methods, Nesterov’s accelerated methods, ADMM, quasi-Newton, trust-region, cubic regularization methods, and (some of) their stochastic variants. If time allows, we will also introduce constrained optimization over Riemannian manifolds. Along the way, we will show how these methods can be applied to concrete problems ranging from inverse problems in signal processing (e.g., sparse recovery, phase retrieval, blind deconvolution, matrix completion) and unsupervised learning (e.g., dictionary learning, independent component analysis, nonnegative matrix factorization) to supervised learning (e.g., deep learning).

The course will involve extensive practical algorithm development, implementation, and investigation using MATLAB, Python, or Julia. Designing methods suitable for large-scale SIPML applications will be emphasized, and students will be expected to learn and apply efficient coding practices.
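To give a flavor of the kind of implementation work involved, here is a minimal Python sketch of gradient descent for a least-squares problem. The problem sizes and the 1/L step-size rule are illustrative choices, not course-specified values or course materials:

```python
import numpy as np

# Minimal gradient descent sketch for least squares: min_x 0.5 * ||A x - b||^2.
# All dimensions and iteration counts below are hypothetical, for illustration only.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 50))
x_true = rng.standard_normal(50)
b = A @ x_true

L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient (largest eigenvalue of A^T A)
x = np.zeros(50)
for k in range(500):
    grad = A.T @ (A @ x - b)         # gradient of 0.5 * ||A x - b||^2
    x -= grad / L                    # fixed step size 1/L
print(np.linalg.norm(x - x_true))    # recovery error; should be near zero here
```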

Course Materials: Slides and videos will be available via Canvas (TBA). Below is a tentative list of algorithms that will be covered in the course:

  • 1st-order methods for smooth optimization: gradient descent, conjugate gradient, line-search method, momentum (Nesterov’s accelerated) method;
  • 1st-order methods for nonsmooth optimization: subgradient methods, proximal methods, and their accelerated variants;
  • Large-scale 1st-order optimization: ADMM, Frank-Wolfe method, and stochastic/incremental gradient methods;
  • 2nd-order methods: Newton and quasi-Newton method, trust-region method, cubic regularization method, and curvilinear search method;
  • Riemannian optimization (if time allows): optimization over matrix manifolds such as the sphere, Stiefel manifold, Grassmannian manifold, etc.

Every optimization method introduced will be motivated by at least one SIPML application. Students will implement and test these methods on those applications.
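As one example of this pairing, sparse recovery via the LASSO is a natural companion to the proximal gradient method. The short Python sketch below shows the basic iteration (ISTA); the dimensions, sparsity level, and regularization weight lam are hypothetical values chosen purely for illustration, not the course's reference implementation:

```python
import numpy as np

# Proximal gradient (ISTA) sketch for sparse recovery via the LASSO:
#   min_x 0.5 * ||A x - b||^2 + lam * ||x||_1
rng = np.random.default_rng(1)
m, n, s = 100, 400, 10                           # measurements, dimension, sparsity (illustrative)
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
b = A @ x_true

lam = 0.01                                       # regularization weight (illustrative)
L = np.linalg.norm(A, 2) ** 2                    # Lipschitz constant; step size 1/L

def soft(z, t):
    """Soft-thresholding: the proximal operator of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

x = np.zeros(n)
for k in range(1000):
    x = soft(x - A.T @ (A @ x - b) / L, lam / L) # gradient step followed by prox step
print(np.linalg.norm(x - x_true))                # recovery error
```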

Assessment: (i) homework assignments (roughly every two weeks, 40%), (ii) final (take-home) exam (40%), (iii) course project (20%)

Textbook: We recommend the following books and articles, although we will not follow them closely.

Related courses:

  • EECS 600 (Function Space Methods for Systems Theory) is much more theoretical than this course because it deals with infinite-dimensional spaces, whereas EECS 559 focuses entirely on finite-dimensional problems. EECS 600 is also far more proof-oriented, but some proofs will be presented and expected in EECS 559 as well.
  • IOE 410 (Advanced Optimization Methods) focuses on discrete methods and seems aimed at undergraduates.
  • IOE 511/Math562 (Continuous Optimization Methods) has some overlap in terms of the optimization methods.  IOE 511 uses Matlab.  EECS 559 focuses on SIPML applications.
  • IOE 611/Math663 (Nonlinear Programming) covers important Convex Optimization principles. It uses the CVX package in Matlab that does not scale to large problems.  EECS 559 emphasizes large-scale SIPML applications.
  • STAT 608 (Optimization Methods in Statistics) covers many of the same methods as EECS 559. 
  • EECS 556 (Image Processing) introduces some applications (e.g., image deblurring) that are used as examples in EECS 559. So there is some overlap with EECS 556, as well as the other courses listed above, but it is fine for students to take this course along with any or all of EECS 556, EECS 600, and IOE 611.

Course Syllabus:

Week | Time | Topic | Contents | Homework, Project & Finals
Week-1-1 | 1/03 | No Class | |
Week-1-2 | 1/05 | Introduction | Course Logistics & Overview |
Week-2-1 | 1/10 | Optimization Basics | Introduction to mathematical optimization |
Week-2-2 | 1/12 | Optimization Basics | Sample examples & applications, mathematical background | HW 1 Release
Week-3-1 | 1/17 | Martin Luther King Day, No Class | |
Week-3-2 | 1/19 | Convex Smooth | Gradient descent method, line search |
Week-4-1 | 1/24 | Convex Smooth | Nesterov’s acceleration, Newton’s method |
Week-4-2 | 1/26 | Convex Smooth | Stochastic gradient descent | HW 1 Due, HW 2 Release
Week-5-1 | 1/31 | Convex Nonsmooth | Introduction to nonsmooth problems, subgradient methods |
Week-5-2 | 2/02 | Convex Nonsmooth | Smoothing & Moreau envelope |
Week-6-1 | 2/07 | Convex Nonsmooth | Proximal gradient method |
Week-6-2 | 2/09 | Convex Nonsmooth | Accelerated proximal gradient & homotopy continuation | HW 2 Due, HW 3 Release
Week-7-1 | 2/14 | Convex Nonsmooth | Augmented Lagrangian method |
Week-7-2 | 2/16 | Convex Nonsmooth | Alternating Direction Method of Multipliers (ADMM) I |
Week-8-1 | 2/21 | Convex Nonsmooth | Alternating Direction Method of Multipliers (ADMM) II |
Week-8-2 | 2/23 | Convex Nonsmooth | Frank-Wolfe methods | HW 3 Due, HW 4 Release
Week-9-1 | 2/28 | Spring Break, No Class | |
Week-9-2 | 3/02 | Spring Break, No Class | |
Week-10-1 | 3/07 | Nonconvex Optimization | Introduction to nonconvex problems | Project Proposal Due
Week-10-2 | 3/09 | Nonconvex Optimization | Trust-region method I |
Week-11-1 | 3/14 | Nonconvex Optimization | Trust-region method II |
Week-11-2 | 3/16 | Nonconvex Optimization | Cubic regularization method | HW 4 Due, HW 5 Release
Week-12-1 | 3/21 | Nonconvex Optimization | Curvilinear search methods |
Week-12-2 | 3/23 | Nonconvex Optimization | Noisy gradient descent |
Week-13-1 | 3/28 | Nonconvex Optimization | Noisy gradient descent |
Week-13-2 | 3/30 | Nonconvex Optimization | Nonsmooth nonconvex problems, subgradient method | HW 5 Due, HW 6 Release
Week-14-1 | 4/04 | Riemannian Optimization | Riemannian optimization basics |
Week-14-2 | 4/06 | Riemannian Optimization | Riemannian gradient descent method |
Week-15-1 | 4/11 | Riemannian Optimization | Riemannian trust-region methods |
Week-15-2 | 4/13 | Guest Lecture I | Optimization for Deep Learning (TBA) | HW 6 Due
Week-16-1 | 4/18 | Guest Lecture II | Optimization for Deep Learning (TBA) |
Week-16-2 | 4/20 | Exam Preparation, No Class | |