Optimizing the Collaboration Structure in Cross-Silo Federated Learning

Wenxuan Bao, Haohan Wang, Jun Wu, Jingrui He

Research output: Contribution to journal › Conference article › peer-review

Abstract

In federated learning (FL), multiple clients collaborate to train machine learning models together while keeping their data decentralized. Despite utilizing more training data, FL can suffer from negative transfer: the global FL model may perform even worse than models trained with local data only. In this paper, we propose FEDCOLLAB, a novel FL framework that alleviates negative transfer by clustering clients into non-overlapping coalitions based on their distribution distances and data quantities. As a result, each client collaborates only with clients that have similar data distributions, and tends to collaborate with more clients when it has less data of its own. We evaluate our framework with a variety of datasets, models, and types of non-IIDness. Our results demonstrate that FEDCOLLAB effectively mitigates negative transfer across a wide range of FL algorithms and consistently outperforms other clustered FL algorithms.
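The abstract does not spell out the paper's actual solver, so the following is purely an illustrative sketch of the coalition-forming idea it describes: partition clients into non-overlapping groups using two signals, pairwise distribution distance and per-client data quantity, so that data-poor clients tolerate larger distances and thus end up in larger coalitions. The function name `form_coalitions`, the greedy ordering, and the quantity-adjusted threshold below are assumptions made for this example, not FedCollab's optimization procedure.

```python
import numpy as np

def form_coalitions(distances, data_sizes, base_threshold=0.3, quantity_weight=0.1):
    """Greedily partition clients into non-overlapping coalitions.

    Hypothetical sketch (not the paper's optimizer): a client joins an
    existing coalition if its mean distribution distance to the members
    is below a per-client threshold. Clients with less data get a looser
    threshold, so they tend to collaborate with more clients.
    """
    n = len(data_sizes)
    total = sum(data_sizes)
    # Looser threshold for data-poor clients: they gain more from collaboration.
    thresholds = [
        base_threshold + quantity_weight * (1 - data_sizes[i] / total)
        for i in range(n)
    ]
    coalitions = []
    # Process data-rich clients first so they anchor the coalitions.
    for i in sorted(range(n), key=lambda c: -data_sizes[c]):
        best, best_dist = None, float("inf")
        for coalition in coalitions:
            d = np.mean([distances[i][j] for j in coalition])
            if d < thresholds[i] and d < best_dist:
                best, best_dist = coalition, d
        if best is not None:
            best.append(i)
        else:
            coalitions.append([i])
    return coalitions

# Toy example: 4 clients drawn from two underlying distributions.
D = np.array([[0.0, 0.1, 0.8, 0.9],
              [0.1, 0.0, 0.7, 0.8],
              [0.8, 0.7, 0.0, 0.1],
              [0.9, 0.8, 0.1, 0.0]])
sizes = [1000, 200, 800, 150]
print(form_coalitions(D, sizes))  # -> [[0, 1], [2, 3]]
```

In this toy run, the two small clients (1 and 3) each join the distributionally closest data-rich client, yielding two coalitions that match the underlying distributions; after grouping, a standard FL algorithm would be run separately within each coalition.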

Original language: English (US)
Pages (from-to): 1718-1736
Number of pages: 19
Journal: Proceedings of Machine Learning Research
Volume: 202
State: Published - 2023
Event: 40th International Conference on Machine Learning, ICML 2023 - Honolulu, United States
Duration: Jul 23, 2023 – Jul 29, 2023

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability
