Learning Additive Noise Channels: Generalization Bounds and Algorithms

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

An additive noise channel is considered, in which the noise distribution is unknown and is not known to belong to any parametric family. The problem of designing a codebook and a generalized minimal-distance decoder (parameterized by a covariance matrix) based on samples of the noise is considered. High-probability generalization bounds are provided for the error-probability loss function, as well as for a hinge-type surrogate loss function. A stochastic-gradient-based alternating-minimization algorithm for the latter loss function is presented. Bounds on the average empirical error and the generalization error are provided for a Gibbs-based algorithm that gradually expurgates codewords from a large initial codebook to obtain a smaller codebook with improved error probability.
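The generalized minimal-distance decoder mentioned in the abstract can be illustrated with a small sketch. A Mahalanobis-style distance is one natural way to parameterize such a decoder by a covariance matrix; the function name, toy codebook, and decoding rule below are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def md_decode(y, codebook, cov):
    """Generalized minimum-distance decoding (illustrative sketch):
    return the index of the codeword x minimizing the quadratic form
    (y - x)^T cov^{-1} (y - x). The paper's parameterization may differ."""
    prec = np.linalg.inv(cov)          # precision matrix cov^{-1}
    diffs = codebook - y               # shape (M, n): y - x_i for each codeword
    dists = np.einsum('ij,jk,ik->i', diffs, prec, diffs)
    return int(np.argmin(dists))

# Toy usage: two codewords in R^2, identity covariance (plain Euclidean decoding).
codebook = np.array([[1.0, 0.0], [0.0, 1.0]])
y = np.array([0.9, 0.1])
print(md_decode(y, codebook, np.eye(2)))  # → 0
```

With a non-identity covariance the decoder re-weights the coordinates, which is what makes the covariance matrix a learnable parameter of the decoding rule.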

Original language: English
Title of host publication: 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 2586-2591
Number of pages: 6
ISBN (Electronic): 9781728164328
DOIs
State: Published - Jun 2020
Externally published: Yes
Event: 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Los Angeles, United States
Duration: 21 Jul 2020 – 26 Jul 2020

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings
Volume: 2020-June

Conference

Conference: 2020 IEEE International Symposium on Information Theory, ISIT 2020
Country/Territory: United States
City: Los Angeles
Period: 21/07/20 – 26/07/20

Keywords

  • Gibbs algorithm
  • additive noise channel
  • expurgation
  • generalization bounds
  • statistical learning
  • stochastic gradient descent

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Information Systems
  • Modelling and Simulation
  • Applied Mathematics
