Nonsmooth Optimization over the Stiefel Manifold and Beyond: Proximal Gradient Method and Recent Variants

Shixiang Chen, Shiqian Ma, Anthony Man-Cho So, Tong Zhang

Research output: Contribution to journal › Article › peer-review

Abstract

We consider optimization problems over the Stiefel manifold whose objective is the sum of a smooth function and a nonsmooth function. Existing methods for solving this class of problems either converge slowly in practice, involve subproblems that can be as difficult as the original problem, or lack rigorous convergence guarantees. In this paper, we propose a manifold proximal gradient method (ManPG) for solving this class of problems. We prove that the proposed method converges globally to a stationary point and establish its iteration complexity for obtaining an ε-stationary point. Furthermore, we present numerical results on the sparse PCA and compressed modes problems to demonstrate the advantages of the proposed method. We also discuss some recent advances related to ManPG for Riemannian optimization with nonsmooth objective functions.
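To make the problem class concrete, the sketch below sets up the sparse PCA instance mentioned in the abstract: minimize F(X) = -tr(XᵀAᵀAX) + μ‖X‖₁ over the Stiefel manifold St(n, r) = {X : XᵀX = I}, together with a QR-based retraction of the kind manifold methods use to stay feasible. This is an illustrative setup only (the data matrix, μ, and the helper names are assumptions, not from the paper), not an implementation of ManPG itself.

```python
import numpy as np

def objective(X, A, mu):
    # Smooth part: negative explained variance of the r components in X;
    # nonsmooth part: l1 penalty encouraging sparse loadings.
    return -np.trace(X.T @ A.T @ A @ X) + mu * np.abs(X).sum()

def qr_retraction(X, D):
    # Retract the ambient step X + D back onto the Stiefel manifold via
    # a thin QR factorization; the sign fix makes R's diagonal positive
    # so the retraction is uniquely defined.
    Q, R = np.linalg.qr(X + D)
    return Q * np.sign(np.diag(R))

rng = np.random.default_rng(0)
n, r, mu = 8, 2, 0.5           # illustrative sizes and penalty weight
A = rng.standard_normal((20, n))
X = qr_retraction(rng.standard_normal((n, r)), 0.0)  # random feasible point
assert np.allclose(X.T @ X, np.eye(r), atol=1e-10)   # X lies on St(n, r)
print(objective(X, A, mu))
```

ManPG would, at each iterate X, solve a proximal subproblem over the tangent space to obtain a descent direction D and then apply a retraction like the one above; the subproblem solver (e.g., the semismooth Newton method listed in the keywords) is the nontrivial part and is omitted here.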

Original language: English (US)
Pages (from-to): 319-352
Number of pages: 34
Journal: SIAM Review
Volume: 66
Issue number: 2
DOIs
State: Published - 2024
Externally published: Yes

Keywords

  • Stiefel manifold
  • iteration complexity
  • manifold optimization
  • nonsmooth
  • proximal gradient method
  • semismooth Newton method
  • stochastic algorithms
  • zeroth-order algorithms

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computational Mathematics
  • Applied Mathematics
