On the convergence of projected-gradient methods with low-rank projections for smooth convex minimization over trace-norm balls and related problems

Research output: Contribution to journal › Article › peer-review

Abstract

Smooth convex minimization over the unit trace-norm ball is an important optimization problem in machine learning, signal processing, statistics, and other fields, underlying many tasks in which one wishes to recover a low-rank matrix from certain measurements. While first-order methods for convex optimization enjoy optimal convergence rates, in the worst case they require computing a full-rank SVD on each iteration in order to compute the Euclidean projection onto the trace-norm ball. These full-rank SVD computations, however, prohibit the application of such methods to large-scale problems. A simple and natural heuristic to reduce the computational cost of such methods is to approximate the Euclidean projection using only a low-rank SVD. This raises the question of whether, and under what conditions, this simple heuristic can indeed result in provable convergence to the optimal solution. In this paper we show that any optimal solution is the center of a Euclidean ball inside which the projected-gradient mapping admits a rank that is at most the multiplicity of the largest singular value of the gradient at this optimal point. Moreover, the radius of the ball scales with the spectral gap of this gradient. We show how this readily implies the local convergence (i.e., from a "warm-start" initialization) of standard first-order methods, such as the projected-gradient method and accelerated gradient methods, using only low-rank SVD computations. We also quantify the effect of "over-parameterization," i.e., using SVD computations of higher rank, on the radius of this ball, showing that it can increase dramatically with moderately larger rank. We extend our results to the settings of smooth convex minimization with trace-norm regularization and smooth convex optimization over bounded-trace positive semidefinite matrices. Our theoretical investigation is supported by concrete empirical evidence demonstrating the correct convergence of first-order methods with low-rank projections on the matrix completion task with real-world datasets.
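To make the heuristic discussed in the abstract concrete, below is a minimal NumPy/SciPy sketch of a rank-r approximate projection onto the unit trace-norm ball: only the top-r singular triplets are computed (here with scipy.sparse.linalg.svds), and the singular values are then projected onto the ℓ1 ball. The function names and the choice of SVD routine are illustrative assumptions, not the authors' implementation; the sketch computes the exact Euclidean projection precisely when the true projection has rank at most r, which is the regime the paper analyzes.

```python
import numpy as np
from scipy.sparse.linalg import svds


def project_l1_ball_nonneg(s, radius=1.0):
    """Euclidean projection of a nonnegative vector s (singular values)
    onto the l1 ball of the given radius, via the standard sort-based
    soft-thresholding rule."""
    if s.sum() <= radius:
        return s
    u = np.sort(s)[::-1]                      # sort descending
    css = np.cumsum(u)
    # Largest index rho with u[rho] > (css[rho] - radius) / (rho + 1).
    rho = np.nonzero(u * np.arange(1, len(u) + 1) > (css - radius))[0][-1]
    theta = (css[rho] - radius) / (rho + 1.0)  # common shrinkage threshold
    return np.maximum(s - theta, 0.0)


def low_rank_projection_tracenorm(X, r):
    """Approximate Euclidean projection of X onto the unit trace-norm ball
    using only a rank-r SVD (illustrative sketch). Exact whenever the true
    projection of X has rank at most r."""
    U, s, Vt = svds(X, k=r)                    # top-r singular triplets only
    s_proj = project_l1_ball_nonneg(s, radius=1.0)
    return (U * s_proj) @ Vt


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 100))
    P = low_rank_projection_tracenorm(X, r=10)
    # The result lies in the unit trace-norm ball:
    print("trace norm of projection:", np.linalg.svd(P, compute_uv=False).sum())
```

A projected-gradient step would then take the form X_next = low_rank_projection_tracenorm(X - eta * grad_f(X), r) for a step size eta and gradient oracle grad_f (both hypothetical names here); per the paper's result, this low-rank step is exact in a ball around an optimal solution whenever r is at least the multiplicity of the largest singular value of the gradient at that optimum.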

Original language: English
Article number: 1
Pages (from-to): 727-753
Number of pages: 27
Journal: SIAM Journal on Optimization
Volume: 31
Issue number: 1
State: Published - Feb 2021

Keywords

  • First-order methods
  • Large scale matrix optimization
  • Low-rank matrix optimization
  • Matrix completion
  • Nuclear norm optimization
  • Semidefinite optimization
  • Trace norm optimization

All Science Journal Classification (ASJC) codes

  • Software
  • Theoretical Computer Science
