Stochastic primal-dual method for empirical risk minimization with O(1) per-iteration complexity

Conghui Tan, Shiqian Ma, Tong Zhang, Ji Liu

Research output: Contribution to journal › Conference article › peer-review

Abstract

Regularized empirical risk minimization problems with linear predictors appear frequently in machine learning. In this paper, we propose a new stochastic primal-dual method for this class of problems. Unlike existing methods, the proposed method requires only O(1) operations per iteration. We also develop a variance-reduced variant of the algorithm that converges linearly. Numerical experiments suggest that our methods are faster than existing ones, such as proximal SGD, SVRG, and SAGA, on high-dimensional problems.
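To make the O(1) per-iteration claim concrete, the sketch below shows one plausible doubly stochastic primal-dual update for the ridge-regression instance of regularized ERM, min_x (1/(2n))||Ax - b||^2 + (lam/2)||x||^2, via its saddle-point reformulation. Each iteration samples one example i and one coordinate j, touches a single matrix entry A[i, j], and updates only the dual variable y[i] and the primal coordinate x[j]. The function name spd_o1, the step sizes sigma and tau, and the closed-form proximal updates for the least-squares loss and the l2 regularizer are illustrative assumptions for this sketch, not the paper's exact algorithm or parameters.

```python
import numpy as np

def spd_o1(A, b, lam=0.1, sigma=0.5, tau=0.5, n_iters=200_000, seed=0):
    """Doubly stochastic primal-dual sketch for ridge regression.

    Assumed saddle-point form:
        min_x max_y (1/n) * sum_i [ y_i * <a_i, x> - phi_i*(y_i) ] + (lam/2)||x||^2,
    with phi_i(z) = (z - b_i)^2 / 2, hence phi_i*(y) = y^2 / 2 + b_i * y.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    y = np.zeros(n)
    for _ in range(n_iters):
        i = rng.integers(n)  # sample one example
        j = rng.integers(d)  # sample one coordinate
        # Dual step on y[i]: d * A[i, j] * x[j] is an unbiased estimate of
        # <a_i, x>; the prox of sigma * phi_i* has the closed form below.
        y[i] = (y[i] + sigma * d * A[i, j] * x[j] - sigma * b[i]) / (1.0 + sigma)
        # Primal step on x[j]: A[i, j] * y[i] is an unbiased estimate of
        # (1/n) * sum_i A[i, j] * y[i]; prox of tau * (lam/2) * x_j^2 below.
        x[j] = (x[j] - tau * A[i, j] * y[i]) / (1.0 + tau * lam)
    return x

# Illustrative usage on a small synthetic problem.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((100, 50))
    b = A @ rng.standard_normal(50) + 0.01 * rng.standard_normal(100)
    x_hat = spd_o1(A, b)
    obj = 0.5 / 100 * np.sum((A @ x_hat - b) ** 2) + 0.05 * np.sum(x_hat ** 2)
    print("objective:", obj)
```

Note that each loop iteration reads and writes a constant number of scalars regardless of n and d, which is the sense in which the per-iteration cost is O(1); the step sizes here are fixed for simplicity, whereas a careful implementation would tune them to the problem's dimensions and conditioning.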

Original language: English (US)
Pages (from-to): 8366-8375
Number of pages: 10
Journal: Advances in Neural Information Processing Systems
Volume: 2018-December
State: Published - 2018
Externally published: Yes
Event: 32nd Conference on Neural Information Processing Systems, NeurIPS 2018 - Montreal, Canada
Duration: Dec 2 2018 – Dec 8 2018

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
