A Faster Interior-Point Method for Sum-of-Squares Optimization

Shunhua Jiang, Bento Natura, Omri Weinstein

Research output: Contribution to journal › Article › peer-review

Abstract

We present a faster interior-point method for optimizing sum-of-squares (SOS) polynomials, which are a central tool in polynomial optimization and capture convex programming in the Lasserre hierarchy. Let p = ∑_i q_i² be an n-variate SOS polynomial of degree 2d. Denoting by L := C(n+d, d) and U := C(n+2d, 2d) the dimensions of the vector spaces in which the q_i and p live, respectively, our algorithm runs in time Õ(L·U^1.87). This is polynomially faster than state-of-the-art SOS and semidefinite programming solvers, which achieve runtime Õ(L^0.5 · min{U^2.37, L^4.24}). The centerpiece of our algorithm is a dynamic data structure for maintaining the inverse of the Hessian of the SOS barrier function under the polynomial interpolant basis, which efficiently extends to multivariate SOS optimization, and requires maintaining spectral approximations to low-rank perturbations of elementwise (Hadamard) products. This is the main challenge and departure from recent IPM breakthroughs using inverse maintenance, where low-rank updates to the slack matrix readily imply the same for the Hessian matrix.
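The dimensions L and U in the abstract are binomial coefficients counting the monomials of degree at most d and at most 2d in n variables. A minimal sketch of how they are computed and how the stated runtime exponents compare (the helper name `sos_dimensions` is ours, not from the paper):

```python
from math import comb

def sos_dimensions(n: int, d: int) -> tuple[int, int]:
    """Dimensions of the vector spaces from the abstract:
    L = C(n+d, d), the space containing each q_i (degree <= d),
    U = C(n+2d, 2d), the space containing p (degree <= 2d)."""
    L = comb(n + d, d)
    U = comb(n + 2 * d, 2 * d)
    return L, U

# Example: a bivariate (n = 2) SOS polynomial of degree 4 (d = 2).
L, U = sos_dimensions(2, 2)
print(L, U)  # prints 6 15

# Compare the leading terms of the two runtime bounds (ignoring
# polylog factors hidden by the tilde): the new bound L * U^1.87
# versus the prior bound L^0.5 * min(U^2.37, L^4.24).
new_bound = L * U**1.87
old_bound = L**0.5 * min(U**2.37, L**4.24)
print(new_bound < old_bound)
```

For fixed d and growing n, U grows polynomially faster than L, so the gap between U^1.87 and U^2.37 dominates and the new bound wins asymptotically.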

Original language: American English
Pages (from-to): 2843–2884
Number of pages: 42
Journal: Algorithmica
Volume: 85
Issue number: 9
State: Published - Sep 2023

Keywords

  • Convex optimization
  • Dynamic matrix inverse
  • Interior point methods
  • Sum-of-squares optimization

All Science Journal Classification (ASJC) codes

  • General Computer Science
  • Applied Mathematics
  • Computer Science Applications
