GMRES-accelerated ADMM for quadratic objectives

Richard Y. Zhang, Jacob K. White

Research output: Contribution to journal › Article › peer-review


We consider the sequence acceleration problem for the alternating direction method of multipliers (ADMM) applied to a class of equality-constrained problems with strongly convex quadratic objectives, which frequently arise as the Newton subproblem of interior-point methods. Within this context, the ADMM update equations are linear, the iterates are confined within a Krylov subspace, and the generalized minimal residual (GMRES) algorithm is optimal in its ability to accelerate convergence. The basic ADMM method solves a κ-conditioned problem in O(√κ) iterations. We give theoretical justification and numerical evidence that the GMRES-accelerated variant consistently solves the same problem in O(κ^(1/4)) iterations for an order-of-magnitude reduction in iterations, despite a worst-case bound of O(√κ) iterations. The method is shown to be competitive against standard preconditioned Krylov subspace methods for saddle-point problems. The method is embedded within SeDuMi, a popular open-source solver for conic optimization written in MATLAB, and used to solve many large-scale semidefinite programs with error that decreases like O(1/k^2), instead of O(1/k), where k is the iteration index.
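The key observation above is that when the ADMM update is affine, z_{k+1} = T z_k + c, its fixed point solves the linear system (I − T) z = c, which a Krylov method such as GMRES can attack directly. The following sketch illustrates that idea on a synthetic contractive map T and vector c (stand-ins, not the paper's actual operator), comparing plain fixed-point iteration with GMRES:

```python
# Hedged sketch: if a linear ADMM iteration is z_{k+1} = T z_k + c,
# the fixed point satisfies (I - T) z = c, so GMRES can be used to
# accelerate convergence. T and c below are synthetic stand-ins.
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

rng = np.random.default_rng(0)
n = 50

# Build a contractive affine map (spectral radius 0.9) mimicking
# a linear ADMM iteration.
M = rng.standard_normal((n, n))
T = 0.9 * M / np.abs(np.linalg.eigvals(M)).max()
c = rng.standard_normal(n)

# Plain fixed-point iteration (the role of basic ADMM here).
z_fp = np.zeros(n)
for _ in range(200):
    z_fp = T @ z_fp + c

# GMRES applied to (I - T) z = c reaches the same fixed point,
# typically in far fewer matrix-vector products.
op = LinearOperator((n, n), matvec=lambda v: v - T @ v)
z_gmres, info = gmres(op, c)

print(info)                                   # 0 on successful convergence
print(np.linalg.norm(z_fp - z_gmres, np.inf)) # small: same fixed point
```

This only demonstrates the acceleration mechanism; the paper's contribution concerns the specific structure of the ADMM operator for quadratic objectives and the resulting O(κ^(1/4)) behavior, which this toy example does not reproduce.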

Original language: English (US)
Pages (from-to): 3025-3056
Number of pages: 32
Journal: SIAM Journal on Optimization
Issue number: 4
State: Published - Jan 1 2018
Externally published: Yes


Keywords

  • ADMM
  • Alternating direction
  • Augmented Lagrangian
  • Krylov subspace
  • Method of multipliers
  • Sequence acceleration

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science

