
Low-rank extragradient methods for scalable semidefinite optimization

Dan Garber, Atara Kaplan

Research output: Contribution to journal › Article › peer-review

Abstract

We consider a class of important semidefinite optimization problems that involve a convex smooth or nonsmooth objective function and linear constraints. Focusing on high-dimensional settings with a low-rank solution that also satisfies a low-rank complementarity condition, we prove that the well-known Extragradient method, when initialized with a “warm-start”, converges with its standard convergence rate guarantees, using only efficient low-rank singular value decompositions to project onto the positive semidefinite cone. Supporting numerical evidence with a dataset of Max-Cut instances is provided.
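The core idea of the abstract can be illustrated in a short sketch: an extragradient iteration in which the Euclidean projection onto the positive semidefinite cone is replaced by a rank-r truncated eigendecomposition, which is exact whenever the true projection has rank at most r. This is a hedged illustration, not the paper's algorithm: the function names, the simple objective, and the fixed rank parameter are all assumptions made for the example.

```python
# Illustrative sketch (not the paper's code): one extragradient step for
# min f(X) over the PSD cone, with the projection computed from only the
# top-r eigenpairs. Exact when the projection's rank is at most r.
import numpy as np
from scipy.sparse.linalg import eigsh

def proj_psd_lowrank(A, r):
    """Approximate projection of symmetric A onto the PSD cone using
    only the r largest eigenpairs (a low-rank eigendecomposition)."""
    w, V = eigsh(A, k=r, which='LA')   # r largest algebraic eigenvalues
    w = np.maximum(w, 0.0)             # clip negative eigenvalues to zero
    return (V * w) @ V.T

def extragradient_step(X, grad, eta, r):
    """One extragradient iteration: extrapolate, then correct."""
    Y = proj_psd_lowrank(X - eta * grad(X), r)       # extrapolation point
    return proj_psd_lowrank(X - eta * grad(Y), r)    # corrected iterate

# Toy usage with f(X) = 0.5 * ||X - C||_F^2, whose minimizer over the
# PSD cone is C itself when C is PSD with rank r.
n, r = 20, 5
rng = np.random.default_rng(0)
B = rng.standard_normal((n, r))
C = B @ B.T                      # PSD, rank r
grad = lambda X: X - C
X = np.eye(n)                    # full-rank "warm-start" stand-in
for _ in range(200):
    X = extragradient_step(X, grad, eta=0.5, r=r)
print(np.linalg.norm(X - C))     # residual shrinks geometrically
```

Because every projection here costs only a rank-r eigendecomposition rather than a full one, each iteration stays cheap in high dimensions, which is the computational point the abstract makes.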

Original language: English
Article number: 107230
Journal: Operations Research Letters
Volume: 60
DOIs
State: Published - May 2025

Keywords

  • Low-rank matrix recovery
  • Low-rank optimization
  • Nonsmooth semidefinite optimization
  • Semidefinite programming

All Science Journal Classification (ASJC) codes

  • Software
  • Management Science and Operations Research
  • Industrial and Manufacturing Engineering
  • Applied Mathematics
