A Universal Algorithm for Variational Inequalities Adaptive to Smoothness and Noise

Francis Bach, Kfir Y. Levy

Research output: Contribution to journal › Conference article › peer-review

Abstract

We consider variational inequalities arising from monotone operators, a setting that includes convex minimization and convex-concave saddle-point problems. We assume access to potentially noisy but unbiased values of the monotone operators, and assess convergence through a compatible gap function that corresponds to the standard optimality criteria in the aforementioned subcases. We present a universal algorithm for these inequalities based on the Mirror-Prox algorithm. Concretely, our algorithm simultaneously achieves the optimal rates for the smooth/non-smooth and noisy/noiseless settings. This is done without any prior knowledge of these properties, and in the general setup of arbitrary norms and compatible Bregman divergences. For convex minimization and convex-concave saddle-point problems, this leads to new adaptive algorithms. Our method relies on a novel yet simple adaptive choice of the step size, which can be seen as the appropriate extension of AdaGrad to constrained problems.
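The abstract's ingredients can be illustrated with a minimal sketch: a Euclidean Mirror-Prox (extragradient) loop whose step size shrinks with an AdaGrad-style accumulator, applied to the bilinear toy saddle-point problem min_x max_y xy on the unit ball. This is an illustration under assumptions, not the paper's algorithm: the accumulator form, the constants `D` and `eps`, and the Euclidean Bregman choice are placeholders, and the paper's actual step-size rule and analysis differ.

```python
import numpy as np

def F(z):
    # Monotone operator of the saddle function L(x, y) = x * y:
    # F(x, y) = (dL/dx, -dL/dy) = (y, -x).
    x, y = z
    return np.array([y, -x])

def project(z, radius=1.0):
    # Euclidean projection onto the ball of the given radius
    # (here the Bregman divergence is the squared Euclidean distance).
    n = np.linalg.norm(z)
    return z if n <= radius else z * (radius / n)

def adaptive_mirror_prox(z0, steps=2000, D=1.0, eps=1.0):
    # Mirror-Prox with an AdaGrad-style step size (assumed form):
    # eta_t = D / sqrt(eps + sum of squared operator differences so far).
    z = z0.astype(float).copy()
    avg = np.zeros_like(z)
    S = eps
    for _ in range(steps):
        eta = D / np.sqrt(S)
        g = F(z)
        w = project(z - eta * g)           # extrapolation step
        gw = F(w)
        z = project(z - eta * gw)          # main update with extrapolated operator value
        S += np.linalg.norm(gw - g) ** 2   # accumulate differences, AdaGrad-style
        avg += w
    return avg / steps                     # ergodic average, where the gap is evaluated

z_bar = adaptive_mirror_prox(np.array([0.8, -0.6]))
# z_bar approaches the saddle point (0, 0)
```

No smoothness constant or noise level is passed in: the accumulator alone scales the step size, which is the "universal" behavior the abstract describes.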

Original language: English
Pages (from-to): 164-194
Number of pages: 31
Journal: Proceedings of Machine Learning Research
Volume: 99
State: Published - 2019
Externally published: Yes
Event: 32nd Conference on Learning Theory, COLT 2019 - Phoenix, United States
Duration: 25 Jun 2019 – 28 Jun 2019
https://proceedings.mlr.press/v99

Keywords

  • online learning
  • convex optimization
  • first-order methods
  • minimax games
  • universal methods

All Science Journal Classification (ASJC) codes

  • Software
  • Artificial Intelligence
  • Control and Systems Engineering
  • Statistics and Probability
