The Min-Max Complexity of Distributed Stochastic Convex Optimization with Intermittent Communication

Blake Woodworth, Brian Bullins, Ohad Shamir, Nathan Srebro

Research output: Contribution to journal › Conference article › peer-review

Abstract

We resolve the min-max complexity of distributed stochastic convex optimization (up to a log factor) in the intermittent communication setting, where M machines work in parallel over the course of R rounds of communication to optimize the objective, and during each round of communication, each machine may sequentially compute K stochastic gradient estimates. We present a novel lower bound with a matching upper bound that establishes an optimal algorithm.
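To make the intermittent communication setting concrete, here is a minimal, hypothetical sketch of Local SGD in that setting: M machines each take K sequential stochastic gradient steps per round, then average their iterates at each of the R communication rounds. This is only an illustration of the setting on a toy quadratic objective, not the optimal algorithm established in the paper; all names and parameter values below are assumptions for the example.

```python
import numpy as np

def stochastic_grad(x, rng, noise=0.1):
    # Unbiased estimate of grad f(x) = x for the toy objective f(x) = ||x||^2 / 2.
    return x + noise * rng.standard_normal(x.shape)

def local_sgd(M=4, R=10, K=5, lr=0.1, dim=3, seed=0):
    # M machines, R communication rounds, K local stochastic gradient steps per round.
    rng = np.random.default_rng(seed)
    x = np.ones(dim)  # shared starting point
    for _ in range(R):
        local_iterates = []
        for _ in range(M):  # each machine runs its K local steps (in parallel in practice)
            x_m = x.copy()
            for _ in range(K):
                x_m -= lr * stochastic_grad(x_m, rng)
            local_iterates.append(x_m)
        x = np.mean(local_iterates, axis=0)  # one round of communication: average iterates
    return x

x_final = local_sgd()
print(np.linalg.norm(x_final))  # converges toward the minimizer at the origin
```

The key structural point is that the M * K gradient computations within a round cannot depend on each other across machines; information is shared only at the R averaging steps, which is what drives the min-max complexity analyzed in the paper.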

Original language: English
Pages (from-to): 4386-4437
Number of pages: 52
Journal: Proceedings of Machine Learning Research
Volume: 134
State: Published - 2021
Event: 34th Conference on Learning Theory, COLT 2021 - Boulder, United States
Duration: 15 Aug 2021 - 19 Aug 2021

Keywords

  • Distributed Stochastic Convex Optimization
  • Oracle Complexity of Optimization

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability
