Abstract
The relative entropy and the chi-squared divergence are fundamental divergence measures in information theory and statistics. This paper studies integral relations between the two divergences, the implications of these relations, their information-theoretic applications, and some generalizations pertaining to the rich class of f-divergences. The applications studied here include lossless compression, the method of types and large deviations, strong data-processing inequalities, bounds on contraction coefficients and maximal correlation, and the rate of convergence to stationarity of a class of discrete-time Markov chains.
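For concreteness: for discrete distributions P and Q (with q(x) > 0 wherever p(x) > 0), the relative entropy is D(P‖Q) = Σ_x p(x) log(p(x)/q(x)) and the chi-squared divergence is χ²(P‖Q) = Σ_x (p(x) − q(x))²/q(x). One integral identity of the kind studied in the paper, verifiable by direct calculation, is D(P‖Q) = ∫₀¹ χ²(P ‖ (1−t)P + tQ) / t dt (in nats). The sketch below is a minimal numerical check of this identity, assuming NumPy and SciPy are available; it is an illustration of the relation, not the paper's own implementation, and the example distributions are arbitrary.

```python
import numpy as np
from scipy.integrate import quad

def kl_divergence(p, q):
    """Relative entropy D(P||Q) in nats, for discrete distributions
    with q(x) > 0 wherever p(x) > 0."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def chi2_divergence(p, q):
    """Chi-squared divergence: sum over x of (p(x) - q(x))^2 / q(x)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum((p - q) ** 2 / q))

# Arbitrary example distributions (hypothetical, for illustration only)
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.5, 0.3])

def integrand(t):
    # chi^2(P || (1-t)P + tQ) scales like t^2, so the integrand
    # vanishes as t -> 0; guard the endpoint explicitly.
    if t == 0.0:
        return 0.0
    return chi2_divergence(p, (1 - t) * p + t * q) / t

integral, _ = quad(integrand, 0, 1)

print(kl_divergence(p, q))  # direct computation of D(P||Q)
print(integral)             # same value via the chi-squared integral
```

Both printed values agree up to quadrature error, so the relative entropy is recovered exactly as an integral of chi-squared divergences along the segment from P to Q.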
| Original language | English |
| --- | --- |
| Article number | 563 |
| Journal | Entropy |
| Volume | 22 |
| Issue number | 5 |
| DOIs | |
| State | Published - 1 May 2020 |
Keywords
- Chi-squared divergence
- f-divergences
- Information contraction
- Large deviations
- Maximal correlation
- Markov chains
- Method of types
- Relative entropy
- Strong data-processing inequalities
All Science Journal Classification (ASJC) codes
- Information Systems
- Mathematical Physics
- Physics and Astronomy (miscellaneous)
- Electrical and Electronic Engineering