Distributed Boosting Classifiers over Noisy Channels

Yongjune Kim, Yuval Cassuto, Lav R. Varshney

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


We present a principled framework for allocating resources when realizing boosting algorithms on substrates with communication noise. Boosting classifiers (e.g., AdaBoost) make a final decision via a weighted vote over the local decisions of many base classifiers (weak classifiers). When the base classifiers' outputs are communicated over noisy channels, the noisy outputs degrade the final classification accuracy. We show that this degradation can be effectively reduced by allocating more system resources to the more important base classifiers, and we formulate the corresponding resource optimization problems in terms of importance metrics for boosting. Moreover, we show that the optimized noisy boosting classifiers can be more robust than bagging to noise during inference (the test stage). We provide numerical evidence demonstrating the benefits of our approach.
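The core idea in the abstract can be illustrated with a small simulation. The sketch below is not the paper's method; it is a minimal toy model under stated assumptions: five weak classifiers that all vote for the true label, AdaBoost-style weights chosen for illustration, and per-channel bit-flip probabilities as a stand-in for "system resources." Both allocations spend the same average flip probability (0.2), but the "matched" allocation gives cleaner channels to higher-weight classifiers.

```python
import random

def noisy_weighted_vote(votes, weights, flip_probs, rng):
    """Weighted majority vote; each vote's sign flips with its channel's error probability."""
    total = 0.0
    for v, w, p in zip(votes, weights, flip_probs):
        if rng.random() < p:
            v = -v  # channel error flips the transmitted local decision
        total += w * v
    return 1 if total >= 0 else -1

# Hypothetical weights for 5 weak classifiers (all voting +1, the true label).
weights = [0.9, 0.7, 0.5, 0.3, 0.1]
uniform = [0.2] * 5                    # equal resources on every channel
matched = [0.05, 0.1, 0.2, 0.3, 0.35]  # lower noise for more important classifiers

def error_rate(flip_probs, trials=20000, seed=0):
    rng = random.Random(seed)
    errors = sum(
        noisy_weighted_vote([1] * 5, weights, flip_probs, rng) != 1
        for _ in range(trials)
    )
    return errors / trials

print(f"uniform allocation error rate: {error_rate(uniform):.3f}")
print(f"matched allocation error rate: {error_rate(matched):.3f}")
```

With the same total noise budget, the matched allocation yields a noticeably lower final-vote error rate, consistent with the abstract's claim that protecting important base classifiers reduces the accuracy degradation.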

Original language: English (US)
Title of host publication: Conference Record of the 54th Asilomar Conference on Signals, Systems and Computers, ACSSC 2020
Editors: Michael B. Matthews
Publisher: IEEE Computer Society
Number of pages: 6
ISBN (Electronic): 9780738131269
State: Published - Nov 1 2020
Event: 54th Asilomar Conference on Signals, Systems and Computers, ACSSC 2020 - Pacific Grove, United States
Duration: Nov 1 2020 - Nov 5 2020

Publication series

Name: Conference Record - Asilomar Conference on Signals, Systems and Computers
ISSN (Print): 1058-6393


Conference: 54th Asilomar Conference on Signals, Systems and Computers, ACSSC 2020
Country/Territory: United States
City: Pacific Grove

ASJC Scopus subject areas

  • Signal Processing
  • Computer Networks and Communications


