Abstract

We present a principled framework to address resource allocation for realizing boosting algorithms on substrates with communication noise. Boosting classifiers (e.g., AdaBoost) make a final decision via a weighted vote over the local decisions of many base (weak) classifiers. If the base classifiers' outputs are communicated over noisy channels, the noise degrades the final classification accuracy. We show that this degradation can be effectively reduced by allocating more system resources to more important base classifiers. We formulate resource optimization problems in terms of importance metrics for boosting. Moreover, we show that the optimized noisy boosting classifiers can be more robust than bagging to noise during inference (the test stage). We provide numerical evidence to demonstrate the benefits of our approach.
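
The abstract describes the mechanism only at a high level, so the NumPy sketch below is purely illustrative and not the paper's optimization: it simulates an AdaBoost-style weighted vote whose ±1 weak decisions are flipped by binary symmetric channels, and compares a uniform error-probability budget against a hypothetical allocation that gives cleaner channels to higher-weight (more important) classifiers. All quantities (the weights `alpha`, the flip probabilities, the budget, and the inverse-weight allocation rule) are assumptions made for this example.

```python
# Minimal sketch (assumed setup, not the paper's method): weighted ±1 vote with
# weak decisions sent over binary symmetric channels, under two ways of spreading
# a fixed error-probability budget across the channels.
import numpy as np

rng = np.random.default_rng(0)

M = 25                                   # number of weak classifiers (hypothetical)
alpha = rng.uniform(0.1, 1.0, size=M)    # stand-in AdaBoost vote weights / importances
h = np.sign(rng.normal(size=(1000, M)))  # placeholder ±1 weak-classifier outputs
y_clean = np.sign(h @ alpha)             # noiseless weighted-vote decisions (reference)

def noisy_vote(flip_prob):
    """Flip each weak decision independently with its channel's error probability."""
    flips = rng.random(h.shape) < flip_prob          # broadcast per-classifier flip probs
    return np.sign(np.where(flips, -h, h) @ alpha)

budget = 0.1 * M                         # total error-probability "budget" to allocate
p_uniform = np.full(M, budget / M)       # every channel gets the same error probability
# give higher-weight classifiers cleaner channels (lower flip probability)
p_skewed = np.minimum(budget * (1 / alpha) / np.sum(1 / alpha), 0.49)

for name, p in [("uniform", p_uniform), ("importance-weighted", p_skewed)]:
    agree = np.mean(noisy_vote(p) == y_clean)
    print(f"{name:>20s} allocation: agreement with noiseless vote = {agree:.3f}")
```

The inverse-weight rule above is only one simple way to skew a budget toward important classifiers; the paper instead formulates the allocation as resource optimization problems driven by importance metrics for boosting.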

Original language: English (US)
Title of host publication: Conference Record of the 54th Asilomar Conference on Signals, Systems and Computers, ACSSC 2020
Editors: Michael B. Matthews
Publisher: IEEE Computer Society
Pages: 1491-1496
Number of pages: 6
ISBN (Electronic): 9780738131269
DOIs
State: Published - Nov 1 2020
Event: 54th Asilomar Conference on Signals, Systems and Computers, ACSSC 2020 - Pacific Grove, United States
Duration: Nov 1 2020 - Nov 5 2020

Publication series

Name: Conference Record - Asilomar Conference on Signals, Systems and Computers
Volume: 2020-November
ISSN (Print): 1058-6393

Conference

Conference: 54th Asilomar Conference on Signals, Systems and Computers, ACSSC 2020
Country/Territory: United States
City: Pacific Grove
Period: 11/1/20 - 11/5/20

ASJC Scopus subject areas

  • Signal Processing
  • Computer Networks and Communications
