Better mini-batch algorithms via accelerated gradient methods

Andrew Cotter, Ohad Shamir, Nathan Srebro, Karthik Sridharan

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Mini-batch algorithms have been proposed as a way to speed up stochastic convex optimization. We study how such algorithms can be improved using accelerated gradient methods. We provide a novel analysis, which shows that standard gradient methods may sometimes be insufficient to obtain a significant speed-up, and we propose a novel accelerated gradient algorithm that addresses this deficiency, enjoys a uniformly superior guarantee, and works well in practice.
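
As a rough illustration of the setting the abstract describes, the sketch below applies Nesterov-style acceleration to mini-batch stochastic gradients on a synthetic least-squares problem. This is a generic template, not the specific accelerated algorithm proposed in the paper; the objective, data generator, step size, and momentum schedule are all illustrative assumptions.

```python
import numpy as np

# Hypothetical objective: f(w) = E[(x^T w - y)^2] under a synthetic
# linear model. This setup is illustrative, not taken from the paper.
rng = np.random.default_rng(0)
d = 10
w_true = rng.normal(size=d)

def sample_minibatch(b):
    """Draw a mini-batch of b (x, y) pairs from the synthetic model."""
    X = rng.normal(size=(b, d))
    y = X @ w_true + 0.1 * rng.normal(size=b)
    return X, y

def minibatch_accelerated_sgd(steps=500, b=32, lr=0.05):
    """Nesterov-style accelerated SGD with mini-batch gradients.

    Averaging the gradient over a mini-batch of size b reduces its
    variance, while the momentum (acceleration) sequence improves the
    dependence on the deterministic optimization term.
    """
    w = np.zeros(d)       # current iterate
    w_prev = np.zeros(d)  # previous iterate, used by the momentum term
    for t in range(1, steps + 1):
        beta = (t - 1) / (t + 2)          # standard Nesterov momentum schedule
        v = w + beta * (w - w_prev)       # look-ahead (extrapolation) point
        X, y = sample_minibatch(b)
        grad = 2 * X.T @ (X @ v - y) / b  # mini-batch gradient at look-ahead
        w_prev, w = w, v - lr * grad      # gradient step from the look-ahead
    return w

w_hat = minibatch_accelerated_sgd()
print("distance to w*:", np.linalg.norm(w_hat - w_true))
```

The key structural point the abstract alludes to is visible here: increasing b only shrinks the variance of `grad`, so without acceleration the per-iteration progress on the deterministic part of the error is unchanged, which limits the speed-up from larger mini-batches.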

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 24
Subtitle of host publication: 25th Annual Conference on Neural Information Processing Systems 2011, NIPS 2011
State: Published - 2011
Externally published: Yes
Event: 25th Annual Conference on Neural Information Processing Systems 2011, NIPS 2011 - Granada, Spain
Duration: 12 Dec 2011 - 14 Dec 2011

Publication series

Name: Advances in Neural Information Processing Systems 24: 25th Annual Conference on Neural Information Processing Systems 2011, NIPS 2011

Conference

Conference: 25th Annual Conference on Neural Information Processing Systems 2011, NIPS 2011
Country/Territory: Spain
City: Granada
Period: 12/12/11 - 14/12/11

All Science Journal Classification (ASJC) codes

  • Information Systems
