Improved complexities of conditional gradient-type methods with applications to robust matrix recovery problems

Dan Garber, Atara Kaplan, Shoham Sabach

Research output: Contribution to journal › Article › peer-review

Abstract

Motivated by robust matrix recovery problems such as Robust Principal Component Analysis, we consider a general optimization problem of minimizing a smooth and strongly convex loss function applied to the sum of two blocks of variables, where each block of variables is constrained or regularized individually. We study a conditional gradient-type method that leverages the special structure of the problem to obtain faster convergence rates than those attainable via standard methods, under a variety of assumptions. In particular, our method is appealing for matrix problems in which one of the blocks corresponds to a low-rank matrix, since it avoids the prohibitive full-rank singular value decompositions required by most standard methods. While our initial motivation comes from problems originating in statistics, our analysis does not impose any statistical assumptions on the data.
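A schematic formulation of the problem class the abstract describes (an illustration, not quoted from the paper): with f smooth and strongly convex, a nuclear-norm ball of radius tau promoting a low-rank block X, and, for instance, an l1-ball of radius s promoting a sparse block Y, as in Robust PCA,

    min_{X,Y} f(X + Y)   subject to   ||X||_* <= tau,   ||Y||_1 <= s.

The computational point emphasized above, that a conditional gradient (Frank-Wolfe) step over a nuclear-norm ball needs only the top singular pair of the gradient rather than a full-rank SVD, can be sketched as follows. This is a minimal generic Frank-Wolfe step under the assumptions stated in the comments, not the authors' improved method; the function names and the scipy dependency are choices made for this sketch.

```python
import numpy as np
from scipy.sparse.linalg import svds

def nuclear_lmo(grad, tau):
    """Linear minimization oracle over {X : ||X||_* <= tau}.

    argmin_{||X||_* <= tau} <grad, X> is the rank-one matrix
    -tau * u1 v1^T built from the top singular pair of grad,
    so no full SVD is required.
    """
    u, _, vt = svds(grad, k=1)  # top singular triplet only
    return -tau * np.outer(u[:, 0], vt[0, :])

def frank_wolfe_step(X, grad, tau, t):
    """One classical Frank-Wolfe update with step size 2/(t+2)."""
    S = nuclear_lmo(grad, tau)
    eta = 2.0 / (t + 2.0)
    return X + eta * (S - X)

# Usage sketch: one step on f(X) = 0.5 * ||X - M||_F^2 (hypothetical data).
M = np.random.randn(50, 40)
X = np.zeros_like(M)
X = frank_wolfe_step(X, grad=X - M, tau=10.0, t=0)
```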

Original language: English
Article number: 1
Pages (from-to): 185-208
Number of pages: 24
Journal: Mathematical Programming
Volume: 186
Issue number: 1-2
DOIs
State: Published - Mar 2021

Keywords

  • Conditional gradient method
  • Convex optimization
  • Frank–Wolfe algorithm
  • Low-rank matrix recovery
  • Low-rank optimization
  • Nuclear norm minimization
  • Robust PCA
  • Semidefinite programming

All Science Journal Classification (ASJC) codes

  • Software
  • General Mathematics
