Abstract
We show that a simple randomized sketch of the matrix multiplicative weight (MMW) update enjoys (in expectation) the same regret bounds as MMW, up to a small constant factor. Unlike MMW, where every step requires full matrix exponentiation, our steps require only a single product of the form $e^A b$, which the Lanczos method approximates efficiently. Our key technique is to view the sketch as a randomized mirror projection, and perform mirror descent analysis on the expected projection. Our sketch solves the online eigenvector problem, improving the best known complexity bounds by $\Omega(\sqrt{n})$. We also apply this sketch to semidefinite programming in saddle-point form, yielding a simple primal-dual scheme with guarantees matching the best in the literature.
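The abstract's computational primitive is the product $e^A b$ for a symmetric matrix $A$, which the Lanczos method approximates without forming $e^A$. As a minimal illustrative sketch (not the paper's algorithm), the standard approach builds a small Krylov subspace, exponentiates the resulting tridiagonal matrix, and lifts the result back; the function name and parameters below are hypothetical:

```python
import numpy as np

def lanczos_expm_apply(A, b, k=30):
    """Approximate exp(A) @ b for symmetric A via the Lanczos method.

    Builds a k-dimensional Krylov basis Q with tridiagonal projection T,
    then uses exp(A) b ~= ||b|| * Q @ exp(T) @ e1. Illustrative sketch only.
    """
    n = b.shape[0]
    k = min(k, n)
    Q = np.zeros((n, k))
    alpha = np.zeros(k)       # diagonal of T
    beta = np.zeros(max(k - 1, 0))  # off-diagonal of T
    Q[:, 0] = b / np.linalg.norm(b)
    for j in range(k):
        w = A @ Q[:, j]
        alpha[j] = Q[:, j] @ w
        # full reorthogonalization against the basis built so far,
        # for numerical stability (costlier than classic Lanczos)
        w -= Q[:, : j + 1] @ (Q[:, : j + 1].T @ w)
        if j < k - 1:
            beta[j] = np.linalg.norm(w)
            if beta[j] < 1e-12:  # Krylov subspace exhausted early
                k = j + 1
                Q, alpha, beta = Q[:, :k], alpha[:k], beta[: k - 1]
                break
            Q[:, j + 1] = w / beta[j]
    # Exponentiate the small tridiagonal T via its eigendecomposition.
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    evals, evecs = np.linalg.eigh(T)
    expT_e1 = evecs @ (np.exp(evals) * evecs[0])  # exp(T) @ e1
    return np.linalg.norm(b) * (Q @ expT_e1)
```

Each Lanczos step costs one matrix-vector product with $A$, so for sparse or structured $A$ the whole approximation runs far faster than a full matrix exponential.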
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the Thirty-Second Conference on Learning Theory (COLT) |
| Editors | Alina Beygelzimer, Daniel Hsu |
| Place of Publication | Phoenix, USA |
| Pages | 589-623 |
| Number of pages | 35 |
| Volume | 99 |
| State | Published - 1 Jan 2019 |
| Externally published | Yes |
| Event | 32nd Annual Conference on Learning Theory, COLT 2019 (conference number 32), Phoenix, United States; 25 Jun 2019 → 28 Jun 2019 |
Publication series
| Name | Proceedings of Machine Learning Research |
|---|---|
| Publisher | PMLR |
Conference
| Conference | 32nd Annual Conference on Learning Theory, COLT 2019 |
|---|---|
| Abbreviated title | COLT 2019 |
| Country/Territory | United States |
| City | Phoenix |
| Period | 25/06/19 → 28/06/19 |