CONVEX BI-LEVEL OPTIMIZATION PROBLEMS WITH NONSMOOTH OUTER OBJECTIVE FUNCTION

Roey Merchav, Shoham Sabach

Research output: Contribution to journal › Article › peer-review

Abstract

In this paper, we propose the Bi-Sub-Gradient (Bi-SG) method, a generalization of the classical sub-gradient method to the setting of convex bi-level optimization problems. Bi-SG is a first-order method that is very easy to implement: each iteration requires only the computation of the associated proximal mapping or of a sub-gradient of the outer nonsmooth objective function, in addition to a proximal gradient step on the inner optimization problem. We show, under very mild assumptions, that Bi-SG tackles bi-level optimization problems and achieves sublinear rates in terms of both the inner and outer objective functions. Moreover, if the outer objective function is additionally strongly convex (while possibly remaining nonsmooth), the outer rate can be improved to a linear rate. Finally, we prove that the distance of the generated sequence to the set of optimal solutions of the bi-level problem converges to zero.
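To illustrate the flavor of such a scheme, here is a minimal sketch of a bi-level sub-gradient iteration in the spirit the abstract describes: a gradient step on a smooth inner objective followed by a sub-gradient step on a nonsmooth outer objective with a decaying stepsize. This is NOT the paper's exact Bi-SG update rule; the inner/outer problems, stepsize schedule `eta`, and iteration count below are illustrative assumptions.

```python
import numpy as np

# Hypothetical instance (not from the paper):
# Inner problem:  min_x 0.5 * ||A x - b||^2      (smooth, convex)
# Outer problem:  among inner minimizers, minimize ||x||_1  (nonsmooth, convex)

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))   # under-determined, so many inner minimizers
b = A @ rng.standard_normal(50)

L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the inner gradient
x = np.zeros(50)

for k in range(1, 2001):
    # gradient step on the smooth inner objective
    y = x - (1.0 / L) * (A.T @ (A @ x - b))
    # sub-gradient step on the outer objective ||x||_1 (sub-gradient: sign)
    eta = 1.0 / (L * k)             # assumed decaying stepsize schedule
    x = y - eta * np.sign(y)

inner_residual = np.linalg.norm(A @ x - b)
print(inner_residual)  # the decaying outer step lets x nearly solve the inner problem
```

As the outer stepsize decays, the outer perturbation vanishes relative to the inner gradient step, so the iterates approach inner optimality while being biased toward outer-optimal (here, sparse) solutions; the paper's analysis makes this trade-off quantitative with sublinear (or, under strong convexity, linear) rates.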

Original language: English
Pages (from-to): 3114-3142
Number of pages: 29
Journal: SIAM Journal on Optimization
Volume: 33
Issue number: 4
DOIs
State: Published - 2023

Keywords

  • bi-level optimization
  • convex problems
  • nonsmooth optimization
  • proximal mapping

All Science Journal Classification (ASJC) codes

  • Software
  • Theoretical Computer Science
  • Applied Mathematics

