TY - GEN
T1 - A Faster Interior-Point Method for Sum-of-Squares Optimization
AU - Jiang, Shunhua
AU - Natura, Bento
AU - Weinstein, Omri
N1 - Publisher Copyright: © Shunhua Jiang, Bento Natura, and Omri Weinstein; licensed under Creative Commons License CC-BY 4.0
PY - 2022/7/1
Y1 - 2022/7/1
AB - We present a faster interior-point method for optimizing sum-of-squares (SOS) polynomials, which are a central tool in polynomial optimization and capture convex programming in the Lasserre hierarchy. Let p = ∑_i q_i² be an n-variate SOS polynomial of degree 2d. Denoting by L := (n+d choose d) and U := (n+2d choose 2d) the dimensions of the vector spaces in which the q_i's and p live respectively, our algorithm runs in time Õ(L·U^1.87). This is polynomially faster than state-of-the-art SOS and semidefinite programming solvers [16, 15, 27], which achieve runtime Õ(L^0.5 · min{U^2.37, L^4.24}). The centerpiece of our algorithm is a dynamic data structure for maintaining the inverse of the Hessian of the SOS barrier function under the polynomial interpolant basis [27], which efficiently extends to multivariate SOS optimization, and requires maintaining spectral approximations to low-rank perturbations of elementwise (Hadamard) products. This is the main challenge and departure from recent IPM breakthroughs using inverse maintenance, where low-rank updates to the slack matrix readily imply the same for the Hessian matrix.
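N1 - Illustration (a minimal sketch, not from the paper): the bounds quoted in the abstract can be sanity-checked numerically. The formulas L = C(n+d, d) and U = C(n+2d, 2d) and the exponents are taken from the abstract; the helper names (dims, new_bound, prior_bound) are hypothetical, and polylog factors hidden by the Õ notation are ignored.

from math import comb

def dims(n: int, d: int) -> tuple[int, int]:
    # L = C(n+d, d): dimension of the space the q_i live in.
    # U = C(n+2d, 2d): dimension of the space p lives in.
    return comb(n + d, d), comb(n + 2 * d, 2 * d)

def new_bound(L: int, U: int) -> float:
    # This paper's runtime bound: O~(L * U^1.87).
    return L * U ** 1.87

def prior_bound(L: int, U: int) -> float:
    # Prior solvers [16, 15, 27]: O~(L^0.5 * min{U^2.37, L^4.24}).
    return L ** 0.5 * min(U ** 2.37, L ** 4.24)

for n, d in [(5, 2), (10, 2), (10, 3)]:
    L, U = dims(n, d)
    ratio = new_bound(L, U) / prior_bound(L, U)
    print(f"n={n}, d={d}: L={L}, U={U}, new/prior ratio = {ratio:.3g}")

For example, at n=10, d=2 this gives L = 66 and U = 1001, and the new bound is already a constant factor below the prior one; the gap widens polynomially as n and d grow.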
KW - Dynamic Matrix Inverse
KW - Interior Point Methods
KW - Sum-of-squares Optimization
UR - http://www.scopus.com/inward/record.url?scp=85133487686&partnerID=8YFLogxK
DO - 10.4230/LIPIcs.ICALP.2022.79
M3 - Conference contribution
T3 - Leibniz International Proceedings in Informatics, LIPIcs
SP - 79:1-79:20
BT - 49th EATCS International Conference on Automata, Languages, and Programming, ICALP 2022
A2 - Bojanczyk, Mikolaj
A2 - Merelli, Emanuela
A2 - Woodruff, David P.
PB - Schloss Dagstuhl - Leibniz-Zentrum für Informatik GmbH, Dagstuhl Publishing
T2 - 49th EATCS International Conference on Automata, Languages, and Programming, ICALP 2022
Y2 - 4 July 2022 through 8 July 2022
ER -