Distributed Boosting Classifiers over Noisy Channels

Yongjune Kim, Yuval Cassuto, Lav R. Varshney

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

We present a principled framework for resource allocation when boosting algorithms are realized on substrates with communication noise. Boosting classifiers (e.g., AdaBoost) make a final decision via a weighted vote over the local decisions of many base classifiers (weak classifiers). When the base classifiers' outputs are communicated over noisy channels, the noisy outputs degrade the final classification accuracy. We show that this degradation can be effectively reduced by allocating more system resources to the more important base classifiers, and we formulate the corresponding resource optimization problems in terms of importance metrics for boosting. Moreover, we show that the optimized noisy boosting classifiers can be more robust than bagging to noise during inference (test stage). We provide numerical evidence to demonstrate the benefits of our approach.
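The abstract's core idea can be illustrated with a small simulation. The sketch below is a hypothetical toy model, not the paper's formulation: each base classifier's binary decision crosses a binary symmetric channel with its own flip probability, and the ensemble outputs the sign of the weighted vote. The classifier weights, flip probabilities, and the "noise budget" split are all illustrative assumptions; the sketch only shows that, for the same total flip probability, giving cleaner channels to the higher-weight classifiers preserves more accuracy than a uniform allocation.

```python
import random

def noisy_vote(decisions, alphas, flip_probs, rng):
    """Weighted majority vote when each local decision in {-1, +1}
    may be flipped by its binary symmetric channel."""
    total = 0.0
    for h, a, p in zip(decisions, alphas, flip_probs):
        if rng.random() < p:
            h = -h  # channel noise flips this local decision
        total += a * h
    return 1 if total >= 0 else -1

def accuracy(true_label, decisions, alphas, flip_probs, trials=2000, seed=0):
    """Monte Carlo estimate of the ensemble's accuracy under channel noise."""
    rng = random.Random(seed)
    hits = sum(noisy_vote(decisions, alphas, flip_probs, rng) == true_label
               for _ in range(trials))
    return hits / trials

# Toy ensemble: every base classifier votes correctly (+1) on this input,
# with weights (alphas) of decreasing importance.
alphas = [2.0, 1.0, 0.5, 0.5, 0.25]
decisions = [1, 1, 1, 1, 1]

# Two allocations with the same total flip probability (1.0):
# uniform channels vs. cleaner channels for the high-weight classifiers.
uniform = [0.20, 0.20, 0.20, 0.20, 0.20]
importance_aware = [0.05, 0.15, 0.25, 0.25, 0.30]

print("uniform:", accuracy(1, decisions, alphas, uniform))
print("importance-aware:", accuracy(1, decisions, alphas, importance_aware))
```

With these weights the ensemble is wrong only when the flipped classifiers carry more than half of the total weight, so protecting the weight-2.0 classifier matters far more than protecting the weight-0.25 one; the importance-aware allocation yields a visibly higher estimated accuracy in this toy setting.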

Original language: English
Title of host publication: Conference Record of the 54th Asilomar Conference on Signals, Systems and Computers, ACSSC 2020
Editors: Michael B. Matthews
Pages: 1491-1496
Number of pages: 6
ISBN (Electronic): 9780738131269
DOIs
State: Published - 1 Nov 2020
Event: 54th Asilomar Conference on Signals, Systems and Computers, ACSSC 2020 - Pacific Grove, United States
Duration: 1 Nov 2020 to 5 Nov 2020

Publication series

Name: Conference Record - Asilomar Conference on Signals, Systems and Computers
Volume: 2020-November

Conference

Conference: 54th Asilomar Conference on Signals, Systems and Computers, ACSSC 2020
Country/Territory: United States
City: Pacific Grove
Period: 1 Nov 2020 to 5 Nov 2020

All Science Journal Classification (ASJC) codes

  • Signal Processing
  • Computer Networks and Communications
