Abstract
The projection operation is a crucial step in applying Online Gradient Descent (OGD) and its stochastic counterpart, SGD. Unfortunately, in some cases projection is computationally demanding, which prevents us from applying OGD. In this work we focus on the special case where the constraint set is smooth and we have access to gradient and value oracles of the constraint function. Under these assumptions we design a new approximate projection operation that requires only logarithmically many calls to these oracles. We further show that combining OGD with this new approximate projection results in a projection-free variant that recovers the standard rates of the fully projected version. This applies to both convex and strongly convex online settings.
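The abstract does not spell out the projection procedure, but one way to see how an approximate projection can get away with only logarithmically many oracle calls is to bisect, using the value oracle, along the segment between a known interior point and the infeasible query point. The sketch below is an illustrative stand-in under that assumption, not the authors' algorithm; `approx_projection`, the interior point `x0`, the unit-ball constraint `g`, and the toy OGD loop are all hypothetical names introduced for the example.

```python
import numpy as np

def approx_projection(y, x0, g, eps=1e-6):
    """Approximately map y into {x : g(x) <= 0} by bisecting the segment
    [x0, y], where x0 is a known interior point with g(x0) < 0.

    Uses O(log(1/eps)) calls to the value oracle g -- an illustrative
    stand-in for the paper's approximate projection, not its method.
    """
    if g(y) <= 0:
        return y               # y is already feasible
    lo, hi = 0.0, 1.0          # x(t) = x0 + t*(y - x0); x(0) feasible, x(1) not
    while hi - lo > eps:
        mid = 0.5 * (lo + hi)
        if g(x0 + mid * (y - x0)) <= 0:
            lo = mid           # still feasible: move toward y
        else:
            hi = mid           # infeasible: pull back
    return x0 + lo * (y - x0)  # feasible point near the boundary crossing

# Toy OGD loop on the unit ball {x : ||x||^2 - 1 <= 0}; the "loss
# gradients" are random stand-ins, purely to exercise the projection.
g = lambda x: np.dot(x, x) - 1.0
x = x0 = np.zeros(3)
rng = np.random.default_rng(0)
for t in range(1, 101):
    grad = rng.standard_normal(3)                       # stand-in gradient
    x = approx_projection(x - grad / np.sqrt(t), x0, g)  # projected OGD step
```

Each projected step costs O(log(1/eps)) constraint evaluations rather than a full projection solve; the paper additionally exploits the gradient oracle and the smoothness of the set, which this value-oracle-only sketch omits.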
| Original language | English |
|---|---|
| Pages (from-to) | 1458-1466 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 89 |
| State | Published - 2020 |
| Externally published | Yes |
| Event | 22nd International Conference on Artificial Intelligence and Statistics (AISTATS 2019), Naha, Japan, 16 Apr 2019 → 18 Apr 2019 |
All Science Journal Classification (ASJC) codes
- Artificial Intelligence
- Statistics and Probability