On the Rényi Divergence, Joint Range of Relative Entropies, and a Channel Coding Theorem

Research output: Contribution to journal › Article › peer-review

Abstract

This paper starts by considering the minimization of the Rényi divergence subject to a constraint on the total variation distance. Based on the solution of this optimization problem, the exact locus of the points (D(Q||P1), D(Q||P2)) is determined when P1, P2, and Q are arbitrary probability measures which are mutually absolutely continuous, and the total variation distance between P1 and P2 is not below a given value. It is further shown that all the points of this convex region are attained by probability measures which are defined on a binary alphabet. This characterization yields a geometric interpretation of the minimal Chernoff information subject to a constraint on the variational distance. This paper also derives an exponential upper bound on the performance of binary linear block codes (or code ensembles) under maximum-likelihood decoding. Its derivation relies on the Gallager bounding technique, and it reproduces the Shulman-Feder bound as a special case. The bound is expressed in terms of the Rényi divergence from the normalized distance spectrum of the code (or the average distance spectrum of the ensemble) to the binomially distributed distance spectrum of the capacity-achieving ensemble of random block codes. This exponential bound provides a quantitative measure of the degradation in performance of binary linear block codes (or code ensembles) as a function of the deviation of their distance spectra from the binomial distribution. An efficient use of this bound is considered.
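The abstract relates three quantities: the Rényi divergence, the relative entropy, and the total variation distance, with distributions on a binary alphabet playing a special role in the joint-range characterization. As a point of reference only, the following is a minimal Python sketch that evaluates these quantities for binary-alphabet distributions using their standard textbook definitions, not anything derived in the paper; the base-2 logarithm and all function names are choices of this sketch.

# Illustrative sketch (not from the paper): standard definitions of the Rényi
# divergence, relative entropy (KL divergence), and total variation distance,
# evaluated on a binary alphabet. Base-2 logarithms are an assumption here.
import numpy as np

def renyi_divergence(p, q, alpha):
    """D_alpha(P||Q) = log2(sum_x p(x)^alpha * q(x)^(1-alpha)) / (alpha - 1), alpha != 1."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.log2(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

def kl_divergence(p, q):
    """Relative entropy D(P||Q) = sum_x p(x) log2(p(x)/q(x)); the alpha -> 1 limit."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0  # assumes Q > 0 wherever P > 0 (mutual absolute continuity)
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

def total_variation(p, q):
    """Total variation distance: (1/2) * sum_x |p(x) - q(x)|."""
    return 0.5 * np.sum(np.abs(np.asarray(p, float) - np.asarray(q, float)))

if __name__ == "__main__":
    # Binary-alphabet example: P1 and P2 with a prescribed total variation distance,
    # and a third measure Q; the pair (D(Q||P1), D(Q||P2)) is one point of the
    # joint range studied in the paper.
    P1, P2, Q = [0.2, 0.8], [0.6, 0.4], [0.4, 0.6]
    print("TV(P1, P2)    =", total_variation(P1, P2))        # 0.4
    print("D(Q||P1)      =", kl_divergence(Q, P1))
    print("D(Q||P2)      =", kl_divergence(Q, P2))
    print("D_0.5(Q||P1)  =", renyi_divergence(Q, P1, 0.5))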

Original language: English
Article number: 7339482
Pages (from-to): 23-34
Number of pages: 12
Journal: IEEE Transactions on Information Theory
Volume: 62
Issue number: 1
DOIs
State: Published - 1 Jan 2016

Keywords

  • Chernoff information
  • Rényi divergence
  • distance spectrum
  • error exponent
  • maximum-likelihood decoding
  • relative entropy
  • total variation distance

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences
