Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization

Shai Shalev-Shwartz, Tong Zhang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We introduce a proximal version of the stochastic dual coordinate ascent method and show how to accelerate the method using an inner-outer iteration procedure. We analyze the runtime of the framework and obtain rates that improve state-of-the-art results for various key machine learning optimization problems including SVM, logistic regression, ridge regression, Lasso, and multi-class SVM. Experiments validate our theoretical findings.
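To give a concrete sense of the inner solver that the paper's inner-outer procedure accelerates, here is a minimal sketch of plain (non-accelerated, non-proximal) stochastic dual coordinate ascent for ridge regression. This is an illustrative reconstruction, not the paper's accelerated algorithm: the function name `sdca_ridge` and all parameter choices are assumptions. For the squared loss, each dual coordinate update has the closed form shown in the comment.

```python
import numpy as np

def sdca_ridge(X, y, lam, epochs=500, seed=0):
    """Plain SDCA sketch for ridge regression:
        min_w (1/n) * sum_i 0.5*(x_i @ w - y_i)**2 + (lam/2)*||w||^2.
    Maintains the primal-dual link w = (1/(lam*n)) * sum_i alpha_i * x_i
    and updates one randomly chosen dual coordinate at a time.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.zeros(n)          # dual variables, one per example
    w = np.zeros(d)              # primal iterate kept in sync with alpha
    sqnorms = (X ** 2).sum(axis=1)
    for _ in range(epochs):
        for i in rng.permutation(n):
            # Closed-form maximization of the dual objective in alpha_i
            # for the squared loss phi_i(a) = 0.5*(a - y_i)^2.
            delta = (y[i] - X[i] @ w - alpha[i]) / (1.0 + sqnorms[i] / (lam * n))
            alpha[i] += delta
            w += delta * X[i] / (lam * n)
    return w
```

Since the ridge objective is smooth and strongly convex, the iterate converges to the closed-form solution `(X.T @ X / n + lam * I)^{-1} (X.T @ y / n)`, which gives an easy sanity check on a small synthetic problem.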

Original language: English (US)
Title of host publication: 31st International Conference on Machine Learning, ICML 2014
Publisher: International Machine Learning Society (IMLS)
Pages: 111-119
Number of pages: 9
ISBN (Electronic): 9781634393973
State: Published - 2014
Externally published: Yes
Event: 31st International Conference on Machine Learning, ICML 2014 - Beijing, China
Duration: Jun 21 2014 - Jun 26 2014

Publication series

Name: 31st International Conference on Machine Learning, ICML 2014
Volume: 1

Other

Other: 31st International Conference on Machine Learning, ICML 2014
Country/Territory: China
City: Beijing
Period: 6/21/14 - 6/26/14

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Networks and Communications
  • Software
