Projection free online learning over smooth sets

Kfir Y. Levy, Andreas Krause

Research output: Contribution to journal › Conference article › peer-review

Abstract

The projection operation is a crucial step in applying Online Gradient Descent (OGD) and its stochastic version, SGD. Unfortunately, in some cases projection is computationally demanding and prevents us from applying OGD. In this work, we focus on the special case where the constraint set is smooth and we have access to gradient and value oracles of the constraint function. Under these assumptions we design a new approximate projection operation that requires only logarithmically many calls to these oracles. We further show that combining OGD with this new approximate projection yields a projection-free variant that recovers the standard rates of the fully projected version. This applies to both convex and strongly-convex online settings.
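To make the idea concrete, here is a minimal sketch of how an approximate projection using only a constraint value oracle might be combined with OGD. This is an illustration under simplifying assumptions (a set of the form {x : g(x) ≤ 0} with a known interior point, and bisection along a segment as the approximate projection), not the paper's exact operation; the function names are hypothetical.

```python
import numpy as np

def approx_projection(y, x0, g, eps=1e-6):
    """Approximately map y onto the set {x : g(x) <= 0} by bisecting
    along the segment from a known interior point x0 (g(x0) < 0) to y.
    Uses O(log(1/eps)) value-oracle calls -- a simplified illustration,
    not the exact operation from the paper."""
    if g(y) <= 0:          # y is already feasible; nothing to do
        return y
    lo, hi = 0.0, 1.0      # fraction of the way from x0 toward y
    while hi - lo > eps:
        mid = 0.5 * (lo + hi)
        if g(x0 + mid * (y - x0)) <= 0:
            lo = mid       # still feasible: move outward toward y
        else:
            hi = mid       # infeasible: pull back toward x0
    return x0 + lo * (y - x0)

def ogd_with_approx_projection(grad_fns, x0, g, eta=1.0):
    """Online Gradient Descent where the exact projection is replaced
    by the cheap approximate projection above. `grad_fns` yields the
    gradient oracle of the loss observed at each round."""
    x = x0
    iterates = []
    for t, grad in enumerate(grad_fns, start=1):
        y = x - (eta / np.sqrt(t)) * grad(x)   # standard OGD step
        x = approx_projection(y, x0, g)        # projection-free surrogate
        iterates.append(x)
    return iterates
```

For example, with the unit ball g(x) = ‖x‖² − 1 and interior point x0 = 0, the bisection maps the infeasible point (2, 0) to approximately (1, 0) on the boundary while calling g only logarithmically many times in 1/eps.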

Original language: English
Pages (from-to): 1458-1466
Journal: Proceedings of Machine Learning Research
Volume: 89
State: Published - 2020
Externally published: Yes
Event: 22nd International Conference on Artificial Intelligence and Statistics, AISTATS 2019 - Naha, Japan
Duration: 16 Apr 2019 - 18 Apr 2019

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Statistics and Probability
