Spider: Near-optimal non-convex optimization via stochastic path integrated differential estimator

Cong Fang, Chris Junchi Li, Zhouchen Lin, Tong Zhang

Research output: Contribution to journal › Conference article › peer-review

Abstract

In this paper, we propose a new technique named Stochastic Path-Integrated Differential EstimatoR (SPIDER), which can be used to track many deterministic quantities of interest with significantly reduced computational cost. Combining SPIDER with the method of normalized gradient descent, we propose SPIDER-SFO, which solves non-convex stochastic optimization problems using stochastic gradients only. We provide several error-bound results on its convergence rates. Specifically, we prove that the SPIDER-SFO algorithm achieves a gradient computation cost of O(min(n^{1/2}ε^{-2}, ε^{-3})) to find an ε-approximate first-order stationary point. In addition, we prove that SPIDER-SFO nearly matches the algorithmic lower bound for finding stationary points under the gradient Lipschitz assumption in the finite-sum setting. Our SPIDER technique can be further applied to find an (ε, O(ε^{0.5}))-approximate second-order stationary point at a gradient computation cost of Õ(min(n^{1/2}ε^{-2} + ε^{-2.5}, ε^{-3})).
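For readers who want the shape of the algorithm, below is a minimal Python/NumPy sketch of the SPIDER-SFO loop described in the abstract: a periodically refreshed gradient estimate corrected by path-integrated gradient differences, followed by a normalized gradient descent step. The `grad_fn` interface, the batch sizes `b1` and `b2`, the refresh period `q`, the step size `eta`, and the iteration budget are illustrative assumptions, not the tuned constants from the paper, whose analysis couples these parameters to ε, the Lipschitz constant L, and n.

```python
import numpy as np

def spider_sfo(grad_fn, x0, n, q=None, b1=None, b2=None,
               eta=0.01, iters=1000, rng=None):
    """Minimal sketch of SPIDER-SFO (Fang et al., 2018).

    grad_fn(x, idx) -> average gradient of the component functions f_i
    over the index batch `idx` (assumed interface, not from the paper).
    All batch sizes and the step size here are illustrative placeholders.
    """
    rng = np.random.default_rng() if rng is None else rng
    q = int(np.sqrt(n)) if q is None else q      # estimator refresh period
    b1 = n if b1 is None else b1                 # large batch at refresh steps
    b2 = int(np.sqrt(n)) if b2 is None else b2   # small batch per inner step

    x_prev = x = np.asarray(x0, dtype=float)
    v = None
    for k in range(iters):
        if k % q == 0:
            # Refresh: (near-)full gradient estimate on a large batch.
            idx = rng.choice(n, size=b1, replace=False)
            v = grad_fn(x, idx)
        else:
            # Path-integrated update: correct the previous estimate with
            # a gradient difference evaluated on a small batch.
            idx = rng.choice(n, size=b2, replace=False)
            v = grad_fn(x, idx) - grad_fn(x_prev, idx) + v
        x_prev = x
        # Normalized gradient descent step.
        x = x - eta * v / (np.linalg.norm(v) + 1e-12)
    return x
```

The choice q ≈ √n and |S2| ≈ √n mirrors the parameter regime in the paper that balances the cost of the periodic large-batch refreshes against the cheap difference steps, which is what yields the O(min(n^{1/2}ε^{-2}, ε^{-3})) gradient computation cost in the finite-sum setting.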

Original language: English (US)
Pages (from-to): 689-699
Number of pages: 11
Journal: Advances in Neural Information Processing Systems
Volume: 2018-December
State: Published - 2018
Externally published: Yes
Event: 32nd Conference on Neural Information Processing Systems, NeurIPS 2018 - Montreal, Canada
Duration: Dec 2 2018 - Dec 8 2018

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
