The role of local steps in local SGD

Tiancheng Qin, S. Rasoul Etesami, César A. Uribe

Research output: Contribution to journal › Article › peer-review

Abstract

We consider the distributed stochastic optimization problem in which n agents want to minimize a global function given by the sum of the agents' local functions, and we focus on the heterogeneous setting where the agents' local functions are defined over non-i.i.d. datasets. We study the Local SGD method, in which agents perform a number of local stochastic gradient steps and occasionally communicate with a central node to improve their local optimization tasks. We analyze the effect of local steps on the convergence rate and the communication complexity of Local SGD. In particular, instead of assuming a fixed number of local steps across all communication rounds, we allow the number of local steps during the jth communication round, $H_j$, to be arbitrary and to vary across rounds. Our main contribution is to characterize the convergence rate of Local SGD as a function of $\{H_j\}_{j=1}^{R}$ under various settings of strongly convex, convex, and nonconvex local functions, where R is the total number of communication rounds. Based on this characterization, we provide sufficient conditions on the sequence $\{H_j\}_{j=1}^{R}$ under which Local SGD achieves linear speedup with respect to the number of workers. Furthermore, we propose a new communication strategy with increasing local steps that is superior to constant local steps for strongly convex local functions. On the other hand, for convex and nonconvex local functions, we argue that fixed local steps are the best communication strategy for Local SGD and recover state-of-the-art convergence rate results. Finally, we justify our theoretical results through extensive numerical experiments.
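To make the setup concrete, below is a minimal sketch of Local SGD with a per-round number of local steps H_j, as described in the abstract. The quadratic local objectives, the gradient-noise model, the constant step size, and the example increasing schedule H_j = j are illustrative assumptions for this sketch, not the paper's actual experimental setup or proposed schedule.

```python
# Minimal Local SGD sketch with a varying number of local steps H_j per round.
# All problem data below (quadratic local functions, noise level, step size,
# and the schedule H_j = j) are assumptions made for illustration only.
import numpy as np

rng = np.random.default_rng(0)

n_agents, dim, R = 8, 10, 20          # workers, dimension, communication rounds
H = [j + 1 for j in range(R)]         # example of an increasing schedule H_j
eta = 0.05                            # constant step size (assumption)

# Heterogeneous (non-i.i.d.) quadratic local functions f_i(x) = 0.5 * ||x - b_i||^2
b = rng.normal(size=(n_agents, dim))

def local_sgd_step(x, b_i):
    """One stochastic gradient step on f_i, with additive gradient noise."""
    grad = (x - b_i) + 0.1 * rng.normal(size=x.shape)
    return x - eta * grad

x_global = np.zeros(dim)
for j in range(R):
    local_iterates = []
    for i in range(n_agents):
        x_i = x_global.copy()
        for _ in range(H[j]):         # H_j local steps before communicating
            x_i = local_sgd_step(x_i, b[i])
        local_iterates.append(x_i)
    # Communication round: the central node averages the agents' iterates
    x_global = np.mean(local_iterates, axis=0)

# For this toy problem the global minimizer is the mean of the b_i's.
print("distance to optimum:", np.linalg.norm(x_global - b.mean(axis=0)))
```

Replacing the schedule H with a constant list (e.g., H = [5] * R) recovers the usual fixed-local-steps variant that the paper compares against.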

Original language: English (US)
Journal: Optimization Methods and Software
Early online date: Aug 7, 2023
DOIs
State: E-pub ahead of print - Aug 7, 2023
Externally published: Yes

Keywords

  • Federated learning
  • distributed optimization
  • local SGD

ASJC Scopus subject areas

  • Software
  • Control and Optimization
  • Applied Mathematics
