ZenoPS: A Distributed Learning System Integrating Communication Efficiency and Security

Research output: Contribution to journal › Article › peer-review

Abstract

Distributed machine learning is primarily motivated by the promise of increased computational power for accelerating training and by the need to mitigate privacy concerns. Unlike machine learning on a single device, distributed machine learning requires collaboration and communication among the devices. This creates several new challenges: (1) the heavy communication overhead can be a bottleneck that slows down training, and (2) unreliable communication and weaker control over the remote entities make the distributed system vulnerable to systematic failures and malicious attacks. This paper presents a variant of stochastic gradient descent (SGD) with improved communication efficiency and security in distributed environments. Our contributions include (1) a new technique called error reset that adapts both infrequent synchronization and message compression for communication reduction in synchronous and asynchronous training, (2) new score-based approaches for validating the updates, and (3) the integration of error reset with score-based validation. The proposed system provides communication reduction, both synchronous and asynchronous training, Byzantine tolerance, and local privacy preservation. We evaluate our techniques both theoretically and empirically.
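The abstract names two mechanisms, error reset for reducing communication and score-based validation for filtering suspicious updates, without spelling out their details here. The Python sketch below illustrates the general flavor of both ideas under stated assumptions: the top-k compressor, the reset schedule, the scoring formula, and all names (`Worker`, `validation_score`, `rho`) are illustrative choices, not the paper's actual implementation.

```python
import numpy as np

def topk_compress(update, k):
    """Keep only the k largest-magnitude entries (a common sparsifying compressor)."""
    compressed = np.zeros_like(update)
    idx = np.argpartition(np.abs(update), -k)[-k:]
    compressed[idx] = update[idx]
    return compressed

class Worker:
    """Hypothetical worker that sends compressed updates and keeps a local error buffer."""
    def __init__(self, dim, k):
        self.error = np.zeros(dim)  # residual dropped by the compressor in earlier rounds
        self.k = k

    def push(self, local_update):
        # Fold the previous residual into the new update, compress, and reset the
        # error buffer to whatever the compressor dropped this round.
        corrected = local_update + self.error
        message = topk_compress(corrected, self.k)
        self.error = corrected - message
        return message

def validation_score(update, model, val_loss, lr, rho):
    """Score a candidate update by its estimated loss decrease on a small trusted
    validation batch, minus a penalty on the update magnitude (Zeno-style)."""
    return val_loss(model) - val_loss(model - lr * update) - rho * np.linalg.norm(update) ** 2

def validate(updates, model, val_loss, lr, rho=1e-3):
    """Accept only candidate updates whose score is non-negative."""
    return [u for u in updates if validation_score(u, model, val_loss, lr, rho) >= 0]
```

In this sketch the server would apply only the updates surviving `validate`, so a Byzantine worker that submits an update which does not decrease the validation loss is simply discarded, while the error buffer keeps compression from discarding information permanently.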

Original language: English (US)
Article number: 233
Journal: Algorithms
Volume: 15
Issue number: 7
DOIs
State: Published - Jul 2022

Keywords

  • distributed
  • privacy
  • security
  • communication
  • SGD

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Numerical Analysis
  • Computational Theory and Mathematics
  • Computational Mathematics
