Abstract
We consider variational inequalities arising from monotone operators, a setting that includes convex minimization and convex-concave saddle-point problems. We assume access to potentially noisy unbiased values of the monotone operators and assess convergence through a compatible gap function, which corresponds to the standard optimality criterion in the aforementioned subcases. We present a universal algorithm for these inequalities based on the Mirror-Prox algorithm. Concretely, our algorithm simultaneously achieves the optimal rates for the smooth/non-smooth and noisy/noiseless settings. This is done without any prior knowledge of these properties, and in the general setup of arbitrary norms and compatible Bregman divergences. For convex minimization and convex-concave saddle-point problems, this leads to new adaptive algorithms. Our method relies on a novel yet simple adaptive choice of the step size, which can be seen as the appropriate extension of AdaGrad to handle constrained problems.
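To make the flavor of the method concrete, the sketch below shows an extra-gradient (Mirror-Prox-style) loop with an AdaGrad-like adaptive step size in the simplest Euclidean setting, with projection onto a ball. This is a minimal illustration of the general idea only: the operator `F`, the radius, the regularizer `eps`, and the exact step-size rule are assumptions for the example and do not reproduce the paper's algorithm, constants, or guarantees.

```python
# Illustrative sketch (not the paper's exact algorithm): extra-gradient
# iterations whose step size is set AdaGrad-style from the accumulated
# squared norms of the observed operator values.
import numpy as np

def project_ball(x, radius=1.0):
    """Euclidean projection onto the ball of the given radius."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

def adaptive_extra_gradient(F, x0, n_steps=1000, radius=1.0, eps=1e-8):
    """Extra-gradient loop with an adaptive (AdaGrad-like) step size."""
    x = project_ball(np.asarray(x0, dtype=float), radius)
    sum_sq = 0.0                 # running sum of squared operator norms
    avg = np.zeros_like(x)       # running sum of extrapolation points
    for _ in range(n_steps):
        g = F(x)
        sum_sq += float(np.dot(g, g))
        eta = radius / np.sqrt(eps + sum_sq)     # adaptive step size
        y = project_ball(x - eta * g, radius)    # extrapolation step
        g_y = F(y)
        sum_sq += float(np.dot(g_y, g_y))
        x = project_ball(x - eta * g_y, radius)  # update step
        avg += y
    return avg / n_steps                         # averaged iterate

if __name__ == "__main__":
    # Example: monotone operator of the bilinear game min_u max_v u*v,
    # i.e. F(u, v) = (v, -u); the saddle point is (0, 0).
    F = lambda z: np.array([z[1], -z[0]])
    z_hat = adaptive_extra_gradient(F, x0=np.array([0.8, -0.5]))
    print(z_hat)  # averaged iterate, expected to be near (0, 0)
```

Averaging the extrapolation points and shrinking the step size with the observed operator magnitudes mirrors how AdaGrad-type rules avoid needing the smoothness or noise level in advance; the paper's method does this with general norms and Bregman divergences rather than the Euclidean projection used here.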
Original language | English |
---|---|
Pages (from-to) | 164-194 |
Number of pages | 31 |
Journal | Proceedings of Machine Learning Research |
Volume | 99 |
State | Published - 2019 |
Externally published | Yes |
Event | 32nd Conference on Learning Theory, COLT 2019, Phoenix, United States, 25 Jun 2019 → 28 Jun 2019, https://proceedings.mlr.press/v99 |
Keywords
- Online learning
- convex optimization
- first-order methods
- minimax games
- universal methods
All Science Journal Classification (ASJC) codes
- Software
- Artificial Intelligence
- Control and Systems Engineering
- Statistics and Probability