Abstract
We introduce a new algorithmic framework for solving nonconvex optimization problems, called nested alternating minimization, which combines the classical alternating minimization technique with inner iterations of any optimization method. We provide a global convergence analysis of the new algorithmic framework to critical points of the problem at hand, which, to the best of our knowledge, is the first of its kind for nested methods in the nonconvex setting. Central to our global convergence analysis is a new extension of classical proof techniques to the nonconvex setting that allows for errors in the conditions. The power of the framework is illustrated with numerical experiments that show its superiority over existing methods.
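The following is a minimal illustrative sketch of the nested alternating minimization idea described in the abstract: alternate over blocks of variables, but replace each exact block minimization with a few inner iterations of a simple method (here, gradient descent). The objective, step size, and iteration counts are arbitrary choices for illustration and are not taken from the paper.

```python
import numpy as np

def nested_alternating_minimization(b, n_outer=100, n_inner=5, step=0.1):
    """Toy sketch: minimize the nonconvex coupled objective
    F(x, y) = 0.5 * ||x * y - b||^2 (elementwise product)
    by alternating over the x- and y-blocks, using a few inner
    gradient steps per block instead of an exact block minimizer."""
    x = np.ones_like(b)
    y = np.ones_like(b)
    for _ in range(n_outer):
        # inner iterations on the x-block with y held fixed
        for _ in range(n_inner):
            x -= step * y * (x * y - b)
        # inner iterations on the y-block with x held fixed
        for _ in range(n_inner):
            y -= step * x * (x * y - b)
    return x, y

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    b = rng.uniform(0.2, 1.0, size=20)
    x, y = nested_alternating_minimization(b)
    print("residual norm:", np.linalg.norm(x * y - b))
```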
| Original language | American English |
|---|---|
| Pages (from-to) | 53-77 |
| Number of pages | 25 |
| Journal | Mathematics of Operations Research |
| Volume | 48 |
| Issue number | 1 |
| DOIs | |
| State | Published - 2022 |
Keywords
- global convergence
- nested algorithms
- nonconvex and nonsmooth minimization
- nondescent methods
- nonsmooth Kurdyka-Łojasiewicz property
All Science Journal Classification (ASJC) codes
- Computer Science Applications
- General Mathematics
- Management Science and Operations Research