Abstract

This paper introduces a federated learning framework that enables over-the-air computation via digital communications, using a new joint source-channel coding scheme. Without relying on channel state information at the devices, the scheme employs lattice codes both to quantize the model parameters and to exploit interference among the devices. We propose a novel receiver structure at the server, designed to reliably decode an integer combination of the quantized model parameters as a lattice point for aggregation. We derive a convergence bound for the proposed scheme and offer design remarks. Based on this analysis, we suggest an aggregation metric and a corresponding algorithm for determining effective integer coefficients for the aggregation in each communication round. Our results show that, regardless of channel dynamics and data heterogeneity, the proposed scheme consistently achieves higher learning accuracy across a wide range of parameters and markedly outperforms other over-the-air methods.
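The core idea in the abstract, that devices quantize their updates onto a lattice and the server decodes an integer combination of those lattice points for aggregation, can be illustrated with a minimal sketch. This is not the paper's actual scheme: it uses the scaled integer lattice (step · Z^n) as a simple stand-in for the structured lattice codes, and the integer coefficients and helper names are hypothetical.

```python
import numpy as np

def lattice_quantize(w, step):
    """Nearest-neighbor quantizer onto the toy lattice step * Z^n
    (a stand-in for the paper's lattice codebook)."""
    return step * np.round(w / step)

def decode_integer_combination(quantized, coeffs):
    """Server-side aggregation sketch: any integer combination of
    lattice points is itself a lattice point, so the superimposed
    signal can be decoded as one, then normalized into an aggregate."""
    combo = sum(a * q for a, q in zip(coeffs, quantized))
    return combo / sum(coeffs)

rng = np.random.default_rng(0)
step = 0.1
updates = [rng.normal(size=4) for _ in range(3)]   # toy model updates
quantized = [lattice_quantize(w, step) for w in updates]
coeffs = [1, 2, 1]  # hypothetical integer coefficients for one round
aggregate = decode_integer_combination(quantized, coeffs)
```

The closure of the lattice under integer combinations is what makes the decoding step well defined: the sum the server sees over the air is guaranteed to be a valid lattice point.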

Original language: English (US)
Pages (from-to): 5213-5227
Number of pages: 15
Journal: IEEE Transactions on Signal Processing
Volume: 72
DOIs
State: Published - 2024

Keywords

  • digital communications
  • Federated learning
  • lattice codes
  • machine learning
  • over-the-air computation

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering

Fingerprint

Dive into the research topics of 'Compute-Update Federated Learning: A Lattice Coding Approach'. Together they form a unique fingerprint.
