TY - JOUR
T1 - Communication-Efficient Curvature-Aided Primal-Dual Algorithms for Decentralized Optimization
AU - Li, Yichuan
AU - Voulgaris, Petros G.
AU - Stipanović, Dušan M.
AU - Freris, Nikolaos M.
N1 - The work of Nikolaos M. Freris was supported by the Ministry of Science and Technology of China under Grant 2019YFB2102200. The work of Yichuan Li and Petros G. Voulgaris was supported by the National Science Foundation under Grant CCF-1717154, Grant CPS-1932529, and Grant CMMI-2137764. Recommended by Associate Editor Y. Wang.
PY - 2023/11/1
Y1 - 2023/11/1
N2 - This article presents a family of algorithms for decentralized convex composite problems. We consider the setting of a network of agents that cooperatively minimize a global objective function composed of a sum of local functions plus a regularizer. Through the use of intermediate consensus variables, we remove the need for inner communication loops between agents when computing curvature-guided updates. A general scheme is presented that unifies the analysis for a variety of update choices, including gradient descent, Newton updates, and Broyden-Fletcher-Goldfarb-Shanno (BFGS) updates. Our analysis establishes sublinear convergence rates for convex objective functions with Lipschitz continuous gradients, as well as linear convergence rates when the local functions are further assumed to be strongly convex. Moreover, we explicitly characterize the acceleration due to curvature information. Last but not least, we present an asynchronous implementation of the proposed algorithms, which removes the need for a central clock, with linear convergence rates established in expectation under strongly convex objectives. We ascertain the effectiveness of the proposed methods with numerical experiments on benchmark datasets.
AB - This article presents a family of algorithms for decentralized convex composite problems. We consider the setting of a network of agents that cooperatively minimize a global objective function composed of a sum of local functions plus a regularizer. Through the use of intermediate consensus variables, we remove the need for inner communication loops between agents when computing curvature-guided updates. A general scheme is presented that unifies the analysis for a variety of update choices, including gradient descent, Newton updates, and Broyden-Fletcher-Goldfarb-Shanno (BFGS) updates. Our analysis establishes sublinear convergence rates for convex objective functions with Lipschitz continuous gradients, as well as linear convergence rates when the local functions are further assumed to be strongly convex. Moreover, we explicitly characterize the acceleration due to curvature information. Last but not least, we present an asynchronous implementation of the proposed algorithms, which removes the need for a central clock, with linear convergence rates established in expectation under strongly convex objectives. We ascertain the effectiveness of the proposed methods with numerical experiments on benchmark datasets.
KW - Asynchronous algorithms
KW - control
KW - decentralized optimization
KW - network analysis
KW - primal-dual algorithms
UR - http://www.scopus.com/inward/record.url?scp=85149378463&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85149378463&partnerID=8YFLogxK
U2 - 10.1109/TAC.2023.3244904
DO - 10.1109/TAC.2023.3244904
M3 - Article
AN - SCOPUS:85149378463
SN - 0018-9286
VL - 68
SP - 6573
EP - 6588
JO - IEEE Transactions on Automatic Control
JF - IEEE Transactions on Automatic Control
IS - 11
ER -