Differentially Private Gossip Gradient Descent

Yang Liu, Ji Liu, Tamer Basar

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


In this paper, we study the problem of learning a linear regression model distributively over a network of N interconnected agents, where each agent can deploy an online learning algorithm to adaptively learn the regression model from its private data. The goal is to devise a distributed algorithm, under the constraint that each agent can communicate only with its neighbors as specified by a connected communication graph, that enables all N agents to converge to the true model with performance comparable to that of conventional centralized algorithms. We propose a differentially private distributed algorithm, called private gossip gradient descent, and establish $\epsilon$-differential privacy and $O\left(\sqrt{\frac{\log^2 t}{\epsilon(1-\lambda_2)Nt}}\right)$ convergence, where $\lambda_2$ is the second largest eigenvalue of the expected gossip matrix corresponding to the communication graph.
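The abstract's idea can be illustrated with a minimal sketch: agents on a connected graph repeatedly (1) gossip, averaging their local estimates with a random neighbor, and (2) take a local gradient step on private data, with bounded-sensitivity gradients perturbed by Laplace noise for differential privacy. The code below is a toy illustration under stated assumptions (ring graph, unit gradient clipping, per-release budget `eps`, diminishing step size); it is not the authors' algorithm or their privacy accounting.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (all values hypothetical): N agents, d-dimensional linear model.
N, d, T = 10, 3, 2000
theta_true = rng.normal(size=d)

# Ring communication graph: agent i can gossip with i-1 and i+1 (mod N).
neighbors = {i: [(i - 1) % N, (i + 1) % N] for i in range(N)}

theta = np.zeros((N, d))   # each agent's local estimate
eps = 1.0                  # assumed privacy budget per gradient release

for t in range(1, T + 1):
    # 1) Gossip step: a uniformly random agent averages with a random neighbor.
    i = int(rng.integers(N))
    j = int(rng.choice(neighbors[i]))
    avg = (theta[i] + theta[j]) / 2.0
    theta[i] = avg
    theta[j] = avg

    # 2) Private local gradient step at agent i on a fresh private sample.
    x = rng.normal(size=d)
    y = x @ theta_true + 0.1 * rng.normal()
    grad = (theta[i] @ x - y) * x
    grad = np.clip(grad, -1.0, 1.0)               # bound the sensitivity
    noise = rng.laplace(scale=2.0 / eps, size=d)  # Laplace noise for eps-DP
    theta[i] = theta[i] - (1.0 / np.sqrt(t)) * (grad + noise)

# Distance of the network-average estimate from the true model.
err = float(np.linalg.norm(theta.mean(axis=0) - theta_true))
```

The diminishing step size 1/√t mirrors the √(1/t)-type rate in the abstract; the connectivity of the gossip graph (captured by λ₂ in the paper's bound) governs how quickly local estimates mix.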

Original language: English (US)
Title of host publication: 2018 IEEE Conference on Decision and Control, CDC 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 6
ISBN (Electronic): 9781538613955
State: Published - Jul 2 2018
Externally published: Yes
Event: 57th IEEE Conference on Decision and Control, CDC 2018 - Miami, United States
Duration: Dec 17 2018 - Dec 19 2018

Publication series

Name: Proceedings of the IEEE Conference on Decision and Control
ISSN (Print): 0743-1546
ISSN (Electronic): 2576-2370


Conference: 57th IEEE Conference on Decision and Control, CDC 2018
Country/Territory: United States

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Modeling and Simulation
  • Control and Optimization


