Course Calendar

Lec 01 (Introduction)

  • Reading: Chap. 1.
Introduction I
8/26

(slides)

Lec 02 (Sparse Model Overview)

  • Reading: Chap. 1, Sec. 2.1, and 2.2.
Introduction II
8/31

(slides)

Lec 03

  • Reading: Sec. 2.2 and 2.3.
Recovering a Sparse Solution & L1 Norm Relaxation
9/2

(slides)

Discussion 01

  • Reading: Chap. 1, 2, and Appendix A.
Demos of L0 norm and L1 norm recovery; review of linear algebra and statistics

Lec 04

  • Reading: Sec. 3.1 and 3.2.
Relaxing the Sparse Recovery Problem
9/7

(slides)

Lec 05

  • Reading: Sec. 3.3.
Convex Methods for Sparse Signal Recovery
9/9

(slides)

Lec 06

  • Reading: Sec. 3.4.
Matrices with Restricted Isometry Property
9/14

(slides)

Lec 07

  • Reading: Sec. 3.4.
Matrices with Restricted Isometry Property (Continued from Lec 06)
9/16

(slides)

Discussion 02

  • Reading: Appendix E of High-Dimensional Data Analysis, and Chapter 2 of High-Dimensional Statistics by Professor Wainwright.
A Brief Introduction to High-Dimensional Statistics

Lec 08

  • Reading: Sec. 3.5.
Matrices with Restricted Isometry Property (Noisy Observations or Approximate Sparsity)
9/21

(slides)

Lec 09

  • Reading: Sec. 3.6, 3.7, and Sec. 6.2 (optional).
Convex Methods for Sparse Signal Recovery (Phase Transition in Sparse Recovery)
9/23

(slides)

Lec 10

  • Reading: Sec. 4.1 - 4.3.
Convex Methods for Low-Rank Matrix Recovery (Random Measurements)
9/28

(slides)

Lec 11

  • Reading: Sec. 4.4 - 4.6.
Convex Methods for Low-Rank Matrix Recovery (Matrix Completion)
9/30

(slides)

Discussion 03

  • Reading: Appendix A.9.
Matrix Inequalities and Project Preparation

Lec 12

  • Reading: Sec. 5.1 - 5.3.
Decomposing Low-Rank and Sparse Matrices (Principal Component Pursuit)
10/5

(slides)

Lec 13

  • Reading: Sec. 5.1 - 5.3.
Proof of Robust Principal Component Analysis (RPCA)
10/7

(Jamboard link)

Lec 14

  • Reading: Sec. 8.1 - 8.3, Appendix B, C, and D.
Unconstrained Convex Optimization for Structured Data Recovery
10/12

(slides1, slides2)

Lec 15

  • Reading: Sec. 8.4 - 8.6.
Constrained Convex Optimization for Structured Data Recovery
10/14

(slides)

Discussion 04

Low-Rank Matrix Recovery and RPCA
10/15

Professor Yuxin Chen will talk about his recent work on low-rank matrix recovery and RPCA (slides).

Project Session

Project Proposal and Presentation
10/19

Lec 16

  • Reading: Sec. 7.1 - 7.3.
Nonconvex Methods for Low-Dimensional Models: Dictionary Learning
10/21

(slides)

Discussion 05

Nonconvex Optimization and Dictionary Learning
10/22

Professor Qing Qu will talk about his recent work on nonconvex optimization and dictionary learning (slides).

Lec 17

Dictionary Learning via L4 Norm Maximization
10/26

(slides)

Lec 18

  • Reading: Sec. 9.1 - 9.5.
Nonconvex Optimization for High-Dim Problems: First-Order Methods
10/28

(slides)

Lec 19

  • Reading: Sec. 9.6.
Nonconvex Optimization for High-Dim Problems: Fixed-Point Power Iteration
11/2

(slides)

Lec 20

  • Reading: Sec. 7.3.3 and Chap. 12.
Structured Nonlinear Low-Dimensional Models: Sparsity in Convolution and Deconvolution
11/4

(slides)

Discussion 06

  • Reading: TBD.
Nonconvex Optimization and Blind Deconvolution
11/5

Professor Yuqian Zhang will talk about her recent work on blind deconvolution.

Lec 21

  • Reading: Chap. 15.
Structured Nonlinear Low-Dimensional Models: Transform Invariant/Equivariant Low-Rank Texture
11/9

(slides)

Lec 22

Structured Nonlinear Low-Dimensional Models: Transform Invariant/Equivariant Low-Rank Texture I
11/16

(slides)

Lec 23

Structured Nonlinear Low-Dimensional Models: Transform Invariant/Equivariant Low-Rank Texture II
11/18

(slides)

Lec 24

SlowDNN workshop
11/23

(link)

Lec 25

  • Reading: LDR.
Closed-Loop Data Transcription to an LDR via Minimaxing Rate Reduction
11/30

(slides)

Lec 26

Deep Networks and the Multiple Manifold Problem (Guest Lecture, Professor John Wright)
12/2

(slides)

Lec 27

  • Reading: TBD.
The Hidden Convex Optimization Landscape of Deep Neural Networks (Guest Lecture, Professor Mert Pilanci)
12/3

(slides)