TY - GEN

T1 - Gossip gradient descent

AU - Liu, Yang

AU - Liu, Ji

AU - Başar, Tamer

PY - 2018/1/1

Y1 - 2018/1/1

N2 - We consider the problem of learning a linear regression model distributively with a network of N interconnected agents that receive private streaming data. Each agent can deploy an online learning algorithm, e.g., stochastic gradient descent, to adaptively learn the regression model from its received private data. The goal is to devise an algorithm for each agent, under the constraint that each agent can communicate only with its neighbors over a communication graph, that enables every agent to converge to the true model with performance comparable to that of the traditional centralized solution. We propose an algorithm called gossip gradient descent and establish (Equation presented) convergence in expectation and in mean square, where λ2 is the second largest eigenvalue of the expected gossip matrix corresponding to the underlying communication graph. For the case when agents are privacy sensitive, we propose a differentially private variant of the algorithm, which achieves ε-differential privacy and (Equation presented) convergence.

AB - We consider the problem of learning a linear regression model distributively with a network of N interconnected agents that receive private streaming data. Each agent can deploy an online learning algorithm, e.g., stochastic gradient descent, to adaptively learn the regression model from its received private data. The goal is to devise an algorithm for each agent, under the constraint that each agent can communicate only with its neighbors over a communication graph, that enables every agent to converge to the true model with performance comparable to that of the traditional centralized solution. We propose an algorithm called gossip gradient descent and establish (Equation presented) convergence in expectation and in mean square, where λ2 is the second largest eigenvalue of the expected gossip matrix corresponding to the underlying communication graph. For the case when agents are privacy sensitive, we propose a differentially private variant of the algorithm, which achieves ε-differential privacy and (Equation presented) convergence.

UR - http://www.scopus.com/inward/record.url?scp=85054743913&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85054743913&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:85054743913

SN - 9781510868083

T3 - Proceedings of the International Joint Conference on Autonomous Agents and Multiagent Systems, AAMAS

SP - 1995

EP - 1997

BT - 17th International Conference on Autonomous Agents and Multiagent Systems, AAMAS 2018

PB - International Foundation for Autonomous Agents and Multiagent Systems (IFAAMAS)

T2 - 17th International Conference on Autonomous Agents and Multiagent Systems, AAMAS 2018

Y2 - 10 July 2018 through 15 July 2018

ER -