Optimal distributed online prediction

Ofer Dekel, Ran Gilad-Bachrach, Ohad Shamir, Lin Xiao

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Online prediction methods are typically studied as serial algorithms running on a single processor. In this paper, we present the distributed mini-batch (DMB) framework, a method of converting a serial gradient-based online algorithm into a distributed algorithm, and prove an asymptotically optimal regret bound for smooth convex loss functions and stochastic examples. Our analysis explicitly takes into account communication latencies between computing nodes in a network. We also present robust variants, which are resilient to failures and node heterogeneity in an asynchronous distributed environment. Our method can also be used for distributed stochastic optimization, attaining an asymptotically linear speedup. Finally, we empirically demonstrate the merits of our approach on large-scale online prediction problems.
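
The abstract does not spell out the update rule, so the following single-process sketch illustrates the mini-batch idea it refers to: several workers compute gradients at the same shared point, their gradients are averaged (standing in for the network reduction step), and one update of the serial algorithm is applied per round. This is a minimal sketch assuming squared loss and plain gradient descent as the serial base algorithm; the paper's framework covers general gradient-based online updates, and all names and parameters below are illustrative, not taken from the paper.

```python
# Minimal, single-process sketch of the distributed mini-batch (DMB) idea.
# Assumptions (not from the paper): squared loss, vanilla gradient descent
# as the serial base algorithm, and illustrative values for num_workers,
# batch_size, rounds, and the learning rate.
import numpy as np

rng = np.random.default_rng(0)
dim, num_workers, batch_size, rounds, lr = 5, 4, 32, 200, 0.1
w_true = rng.normal(size=dim)   # hidden target generating the stream
w = np.zeros(dim)               # shared model, synchronized each round

def stochastic_grad(w, n):
    """Average gradient of the squared loss over n fresh stochastic examples."""
    X = rng.normal(size=(n, dim))
    y = X @ w_true + 0.1 * rng.normal(size=n)
    return X.T @ (X @ w - y) / n

for t in range(rounds):
    per_worker = batch_size // num_workers
    # Each "worker" accumulates gradients at the same w on its share of the
    # mini-batch; averaging them mimics the reduction over the network.
    g = np.mean([stochastic_grad(w, per_worker) for _ in range(num_workers)],
                axis=0)
    w -= lr * g                 # one serial-algorithm update per round

print("distance to target:", np.linalg.norm(w - w_true))
```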

Original language: English
Title of host publication: Proceedings of the 28th International Conference on Machine Learning, ICML 2011
Pages: 713-720
Number of pages: 8
State: Published - 2011
Externally published: Yes
Event: 28th International Conference on Machine Learning, ICML 2011 - Bellevue, WA, United States
Duration: 28 Jun 2011 - 2 Jul 2011

Publication series

Name: Proceedings of the 28th International Conference on Machine Learning, ICML 2011

Conference

Conference: 28th International Conference on Machine Learning, ICML 2011
Country/Territory: United States
City: Bellevue, WA
Period: 28/06/11 - 2/07/11

All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • Human-Computer Interaction
  • Education
