Accelerated mini-batch stochastic dual coordinate ascent

Shai Shalev-Shwartz, Tong Zhang

Research output: Contribution to journal · Conference article · peer-review

Abstract

Stochastic dual coordinate ascent (SDCA) is an effective technique for solving regularized loss minimization problems in machine learning. This paper considers an extension of SDCA under the mini-batch setting that is often used in practice. Our main contribution is to introduce an accelerated mini-batch version of SDCA and prove a fast convergence rate for this method. We discuss an implementation of our method over a parallel computing system, and compare the results both to vanilla stochastic dual coordinate ascent and to the accelerated deterministic gradient descent method of Nesterov [2007].
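
The abstract describes the method only at a high level. As a rough illustration of the non-accelerated mini-batch SDCA baseline that the paper extends, the sketch below (Python with NumPy) applies SDCA's closed-form dual coordinate update for the squared loss over mini-batches, dividing each step by the batch size as a conservative safeguard against simultaneous updates within a batch overshooting. The function name minibatch_sdca_ridge, the ridge/squared-loss instantiation, and the 1/batch_size scaling are illustrative assumptions; this is not the accelerated variant analyzed in the paper.

    import numpy as np

    def minibatch_sdca_ridge(X, y, lam=0.1, batch_size=8, epochs=20, seed=0):
        """Plain mini-batch SDCA sketch for ridge regression:
        min_w (1/n) * sum_i 0.5 * (x_i^T w - y_i)^2 + (lam/2) * ||w||^2.
        Illustrative only; not the accelerated method of the paper."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        alpha = np.zeros(n)   # dual variables, one per example
        w = np.zeros(d)       # primal vector, maintained as X^T alpha / (lam * n)
        sq_norms = np.einsum("ij,ij->i", X, X)  # precomputed ||x_i||^2

        for _ in range(epochs):
            # Assumes batch_size divides n, to keep the sketch simple.
            for batch in rng.permutation(n).reshape(-1, batch_size):
                # Closed-form dual step for squared loss; the extra 1/batch_size
                # factor is a conservative scaling so that the simultaneous
                # coordinate updates within one batch cannot overshoot.
                residual = y[batch] - X[batch] @ w - alpha[batch]
                delta = residual / (batch_size * (1.0 + sq_norms[batch] / (lam * n)))
                alpha[batch] += delta
                w += X[batch].T @ delta / (lam * n)
        return w

    # Example usage on synthetic data (all values illustrative):
    rng = np.random.default_rng(1)
    X = rng.standard_normal((256, 10))
    y = X @ rng.standard_normal(10) + 0.1 * rng.standard_normal(256)
    w_hat = minibatch_sdca_ridge(X, y, lam=0.01, batch_size=8)

Roughly speaking, the accelerated method of the paper layers a Nesterov-style extrapolation sequence on top of such batched dual updates to obtain its faster convergence rate.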

Original language: English (US)
Journal: Advances in Neural Information Processing Systems
State: Published - 2013
Externally published: Yes
Event: 27th Annual Conference on Neural Information Processing Systems, NIPS 2013 - Lake Tahoe, NV, United States
Duration: Dec 5, 2013 – Dec 10, 2013

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
