On relations between the relative entropy and χ²-divergence, generalizations and applications

Tomohiro Nishiyama, Igal Sason

Research output: Contribution to journal › Article › peer-review

Abstract

The relative entropy and the chi-squared divergence are fundamental divergence measures in information theory and statistics. This paper is focused on a study of integral relations between the two divergences, the implications of these relations, their information-theoretic applications, and some generalizations pertaining to the rich class of f-divergences. Applications that are studied in this paper refer to lossless compression, the method of types and large deviations, strong data-processing inequalities, bounds on contraction coefficients and maximal correlation, and the convergence rate to stationarity of a type of discrete-time Markov chains.
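For orientation, the "integral relations" mentioned in the abstract are identities that express the relative entropy as an integral of chi-squared divergences taken along a convex mixture path between the two distributions. A representative identity of this form, stated here as a sketch under standard discrete definitions (the paper's own theorem statements and numbering are not reproduced), is:

    \[
    D(P \| Q) \;=\; \int_0^1 \frac{\chi^2\!\big(P \,\big\|\, (1-t)P + tQ\big)}{t}\, \mathrm{d}t,
    \qquad\text{where}\quad
    D(P \| Q) = \sum_x p(x)\log\frac{p(x)}{q(x)},
    \quad
    \chi^2(P \| Q) = \sum_x \frac{(p(x) - q(x))^2}{q(x)}.
    \]

The identity follows by differentiating t ↦ D(P ‖ (1−t)P + tQ) in t and integrating back from 0 to 1. A minimal numerical check in Python (the distributions p and q below are arbitrary illustrative choices, not data from the paper):

    import numpy as np

    def kl(p, q):
        # relative entropy D(P||Q) in nats, discrete case
        return float(np.sum(p * np.log(p / q)))

    def chi2(p, q):
        # chi-squared divergence chi^2(P||Q), discrete case
        return float(np.sum((p - q) ** 2 / q))

    # illustrative distributions (an assumption for this sketch)
    p = np.array([0.6, 0.3, 0.1])
    q = np.array([0.2, 0.5, 0.3])

    # the integrand chi^2(P || (1-t)P + tQ) / t stays bounded as t -> 0,
    # since chi^2 along the mixture path vanishes like t^2
    ts = np.linspace(1e-6, 1.0, 200001)
    vals = np.array([chi2(p, (1 - t) * p + t * q) / t for t in ts])

    print(kl(p, q))            # direct evaluation of D(P||Q)
    print(np.trapz(vals, ts))  # integral form; the two values should agree closely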

Original language: English
Article number: 563
Journal: Entropy
Volume: 22
Issue number: 5
State: Published - 1 May 2020

Keywords

  • Chi-squared divergence
  • f-divergences
  • Information contraction
  • Large deviations
  • Maximal correlation
  • Markov chains
  • Method of types
  • Relative entropy
  • Strong data-processing inequalities

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Mathematical Physics
  • Physics and Astronomy (miscellaneous)
  • Electrical and Electronic Engineering

