Proximal gradient method for nonsmooth optimization over the Stiefel manifold

Shixiang Chen, Shiqian Ma, Anthony Man-Cho So, Tong Zhang

Research output: Contribution to journal › Article › peer-review

Abstract

We consider optimization problems over the Stiefel manifold whose objective function is the sum of a smooth function and a nonsmooth function. Existing methods for solving this kind of problem can be classified into three categories. Algorithms in the first category rely on information of the subgradients of the objective function and thus tend to converge slowly in practice. Algorithms in the second category are proximal point algorithms, which involve subproblems that can be as difficult as the original problem. Algorithms in the third category are based on operator-splitting techniques, but they usually lack rigorous convergence guarantees. In this paper, we propose a retraction-based proximal gradient method for solving this class of problems. We prove that the proposed method globally converges to a stationary point. Iteration complexity for obtaining an ε-stationary solution is also analyzed. Numerical results on solving sparse PCA and compressed modes problems are reported to demonstrate the advantages of the proposed method.
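To illustrate the setting, the following is a minimal sketch of a retraction-based proximal gradient iteration for a sparse PCA objective on the Stiefel manifold. It is not the authors' ManPG algorithm: their method solves a proximal subproblem restricted to the tangent space (e.g., via a semismooth Newton solver), whereas this sketch uses the common simplification of an unconstrained soft-thresholding step followed by a QR retraction. The smooth term f(X) = -tr(XᵀAX), the penalty weight `mu`, and the step size are illustrative assumptions.

```python
import numpy as np

def soft_threshold(Z, tau):
    """Proximal operator of tau * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(Z) * np.maximum(np.abs(Z) - tau, 0.0)

def qr_retraction(Y):
    """Retract an n x p matrix onto the Stiefel manifold via thin QR."""
    Q, R = np.linalg.qr(Y)
    # Fix column signs so R has a nonnegative diagonal (continuous retraction).
    return Q * np.sign(np.sign(np.diag(R)) + 0.5)

def prox_grad_stiefel(A, p, mu=0.1, step=0.01, iters=200, seed=0):
    """Simplified proximal-gradient sketch for sparse PCA on St(n, p):
    minimize -tr(X^T A X) + mu * ||X||_1  subject to  X^T X = I.
    NOT the paper's ManPG method; tangent-space subproblem is replaced
    by soft-thresholding plus a QR retraction for illustration only.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    X = qr_retraction(rng.standard_normal((n, p)))  # random feasible start
    for _ in range(iters):
        grad = -2.0 * A @ X                    # Euclidean gradient of -tr(X^T A X)
        Y = soft_threshold(X - step * grad, step * mu)
        X = qr_retraction(Y)                   # pull the iterate back to St(n, p)
    return X
```

The QR retraction keeps every iterate exactly feasible (XᵀX = I up to floating-point error), which is the role retractions play in the method described in the abstract.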

Original language: English (US)
Pages (from-to): 210-239
Number of pages: 30
Journal: SIAM Journal on Optimization
Volume: 30
Issue number: 1
DOIs
State: Published - 2020
Externally published: Yes

Keywords

  • Compressed modes
  • Iteration complexity
  • Manifold optimization
  • Nonsmooth
  • Proximal gradient method
  • Semismooth Newton method
  • Sparse PCA
  • Stiefel manifold

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science
  • Applied Mathematics
