Abstract
We consider an important class of semidefinite optimization problems involving a convex, smooth or nonsmooth, objective function and linear constraints. Focusing on high-dimensional settings in which the solution is low-rank and additionally satisfies a low-rank complementarity condition, we prove that the well-known Extragradient method, when initialized with a "warm start", retains its standard convergence-rate guarantees while requiring only efficient low-rank singular value decompositions to project onto the positive semidefinite cone. Supporting numerical evidence on a dataset of Max-Cut instances is provided.
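The computational idea the abstract alludes to can be sketched as follows: a projected Extragradient iteration in which the Euclidean projection onto the PSD cone is carried out by a rank-r truncated eigendecomposition rather than a full one. The sketch below is a minimal illustration under simplifying assumptions, not the paper's algorithm: the objective, step size, and rank parameter are hypothetical, and the truncation is done here with a full `numpy.linalg.eigh` for brevity (a practical implementation would use a Lanczos-type solver to extract only the top-r eigenpairs).

```python
import numpy as np

def lowrank_psd_projection(Y, r):
    """Project symmetric Y onto the PSD cone, keeping at most rank r.

    For brevity this computes a full eigendecomposition and truncates;
    the efficiency claim in the abstract relies on computing only the
    top-r eigenpairs (e.g. via a Lanczos method) when r is small.
    """
    Y = 0.5 * (Y + Y.T)                        # symmetrize for numerical safety
    vals, vecs = np.linalg.eigh(Y)
    idx = np.argsort(vals)[::-1][:r]           # indices of the top-r eigenvalues
    pos = np.clip(vals[idx], 0.0, None)        # drop any negative part
    return (vecs[:, idx] * pos) @ vecs[:, idx].T

def extragradient(grad, X0, eta, steps, r):
    """Projected Extragradient: extrapolate to X_mid, then step from X_k
    using the gradient evaluated at X_mid; both steps use the low-rank
    PSD projection."""
    X = X0
    for _ in range(steps):
        X_mid = lowrank_psd_projection(X - eta * grad(X), r)
        X = lowrank_psd_projection(X - eta * grad(X_mid), r)
    return X

# Toy smooth instance (hypothetical): minimize 0.5*||X - M||_F^2 over the
# PSD cone; the exact minimizer is the PSD projection of M itself, so we
# can check the iterates against a known low-rank solution.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
M = 0.5 * (A + A.T)
r = int(np.sum(np.linalg.eigvalsh(M) > 0))     # rank of the true solution
X_star = lowrank_psd_projection(M, r)
X_hat = extragradient(lambda X: X - M, np.zeros_like(M), 0.5, 100, r)
print(np.linalg.norm(X_hat - X_star))          # distance to the true solution
```

Initializing `X0` at (or near) a low-rank point plays the role of the "warm start": if every iterate stays close to a rank-r solution, each projection only ever needs r eigenpairs.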
| Original language | English |
|---|---|
| Article number | 107230 |
| Journal | Operations Research Letters |
| Volume | 60 |
| State | Published - May 2025 |
Keywords
- Low-rank matrix recovery
- Low-rank optimization
- Nonsmooth semidefinite optimization
- Semidefinite programming
All Science Journal Classification (ASJC) codes
- Software
- Management Science and Operations Research
- Industrial and Manufacturing Engineering
- Applied Mathematics