F-Divergence Inequalities

Igal Sason, Sergio Verdú

Research output: Contribution to journal › Article › peer-review

Abstract

This paper develops systematic approaches to obtain f-divergence inequalities, dealing with pairs of probability measures defined on arbitrary alphabets. Functional domination is one such approach, where special emphasis is placed on finding the best possible constant upper bounding a ratio of f-divergences. Another approach used for the derivation of bounds among f-divergences relies on moment inequalities and the logarithmic-convexity property, which results in tight bounds on the relative entropy and Bhattacharyya distance in terms of χ² divergences. A rich variety of bounds are shown to hold under boundedness assumptions on the relative information. Special attention is devoted to the total variation distance and its relation to the relative information and relative entropy, including 'reverse Pinsker inequalities,' as well as to the Eγ divergence, which generalizes the total variation distance. Pinsker's inequality is extended to this type of f-divergence, a result which leads to an inequality linking the relative entropy and relative information spectrum. Integral expressions of the Rényi divergence in terms of the relative information spectrum are derived, leading to bounds on the Rényi divergence in terms of either the variational distance or relative entropy.
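As a minimal numerical sketch (not code from the paper), the snippet below checks two classical relations of the kind the abstract surveys: Pinsker's inequality, D(P‖Q) ≥ 2·δ(P,Q)² with δ the total variation distance (in nats), and the χ²-based upper bound D(P‖Q) ≤ log(1 + χ²(P‖Q)). The distributions and function names are illustrative assumptions.

```python
# Minimal sketch: verify Pinsker's inequality and a chi^2 upper bound
# on the relative entropy for a pair of small discrete distributions.
# The distributions below are illustrative, not data from the paper.
import numpy as np

def kl_divergence(p, q):
    """Relative entropy D(P||Q) in nats for finite discrete distributions."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def total_variation(p, q):
    """Total variation distance: half the L1 distance between P and Q."""
    return 0.5 * float(np.sum(np.abs(p - q)))

def chi2_divergence(p, q):
    """chi^2 divergence: sum over the alphabet of (p - q)^2 / q."""
    return float(np.sum((p - q) ** 2 / q))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

d = kl_divergence(p, q)
tv = total_variation(p, q)
chi2 = chi2_divergence(p, q)

assert d >= 2 * tv**2       # Pinsker's inequality: D >= 2 * delta^2
assert d <= np.log1p(chi2)  # Jensen-type bound: D <= log(1 + chi^2)
print(f"D = {d:.6f}, 2*TV^2 = {2 * tv**2:.6f}, log(1+chi2) = {np.log1p(chi2):.6f}")
```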

Original language: English
Article number: 7552457
Pages (from-to): 5973-6006
Number of pages: 34
Journal: IEEE Transactions on Information Theory
Volume: 62
Issue number: 11
DOIs
State: Published - Nov 2016

Keywords

  • Pinsker's inequality
  • relative entropy
  • Rényi divergence
  • f-divergence
  • relative information
  • total variation distance

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences
