Information-theoretic applications of the logarithmic probability comparison bound

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

A well-known technique for assessing probabilities of rare events (used, e.g., in the sphere-packing bound) is to find a reference measure under which the event of interest has probability of order one, and to estimate the probability in question using the Kullback-Leibler divergence (KLD). A recent method, proposed in [2], can be viewed as an extension of this idea in which the probability under the reference measure may itself decay exponentially, and the Rényi divergence (RD) is used instead. We demonstrate the usefulness of this approach in various information-theoretic settings. For channel coding, we provide a method for obtaining matched, mismatched and robust error exponent bounds, as well as new results for a variety of particular channel models. Other applications we address include rate-distortion coding and the problem of guessing.
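The two divergences named in the abstract are easy to illustrate numerically. The sketch below (not from the paper; the function names and example distributions are ours) computes the KLD and the Rényi divergence of order α for discrete distributions, showing that the RD recovers the KLD as α → 1 and dominates it for α > 1:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p||q) in nats, for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def renyi_divergence(p, q, alpha):
    """Renyi divergence of order alpha (alpha > 0, alpha != 1) in nats.

    Tends to the KL divergence as alpha -> 1, and is nondecreasing in alpha.
    """
    s = sum(pi**alpha * qi**(1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1)

# Example: a fair coin measured against a biased reference measure.
p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))             # ln(5/3) ~ 0.511 nats
print(renyi_divergence(p, q, 0.999))   # close to the KL value
print(renyi_divergence(p, q, 2.0))     # larger: RD is monotone in alpha
```

For α near 1 the two quantities agree to a few decimal places, which is the sense in which the RD-based bound generalizes the classical KLD-based change-of-measure argument.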

Original language: English
Title of host publication: Proceedings - 2015 IEEE International Symposium on Information Theory, ISIT 2015
Pages: 735-739
Number of pages: 5
ISBN (Electronic): 9781467377041
DOIs
State: Published - 28 Sep 2015
Event: IEEE International Symposium on Information Theory, ISIT 2015 - Hong Kong, Hong Kong
Duration: 14 Jun 2015 - 19 Jun 2015

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings
Volume: 2015-June

Conference

Conference: IEEE International Symposium on Information Theory, ISIT 2015
Country/Territory: Hong Kong
City: Hong Kong
Period: 14/06/15 - 19/06/15

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Information Systems
  • Modelling and Simulation
  • Applied Mathematics
