Abstract
We introduce an axiomatic approach to entropies and relative entropies that relies only on minimal information-theoretic axioms, namely monotonicity under mixing and data-processing as well as additivity for product distributions. We find that these axioms induce sufficient structure to establish continuity in the interior of the probability simplex and meaningful upper and lower bounds; for example, we find that every relative entropy satisfying these axioms must lie between the Rényi divergences of order 0 and ∞. We further show simple conditions for positive definiteness of such relative entropies and a characterisation in terms of a variant of relative trumping. Our main result is a one-to-one correspondence between entropies and relative entropies.
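The two Rényi divergences bounding the abstract's claim can be written out explicitly. The following is a sketch in standard notation (the paper's own conventions may differ), for probability distributions $P$ and $Q$ on a finite alphabet $\mathcal{X}$:

```latex
% Order-0 Rényi divergence (support-based):
D_0(P \| Q) \;=\; -\log \sum_{x \,:\, P(x) > 0} Q(x)

% Order-infinity Rényi divergence (max-divergence):
D_\infty(P \| Q) \;=\; \log \max_{x \,:\, Q(x) > 0} \frac{P(x)}{Q(x)}

% The stated bound: any relative entropy D satisfying the axioms obeys
D_0(P \| Q) \;\le\; \mathbb{D}(P \| Q) \;\le\; D_\infty(P \| Q)
```

For instance, the Kullback–Leibler divergence, which satisfies monotonicity under data-processing and additivity, indeed lies between these two extremes for all pairs $(P, Q)$.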
| Original language | English |
|---|---|
| Pages (from-to) | 6313-6327 |
| Number of pages | 15 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 67 |
| Issue number | 10 |
| DOIs | |
| State | Published - Oct 2021 |
| Externally published | Yes |
Keywords
- Information theory
- entropy
All Science Journal Classification (ASJC) codes
- Information Systems
- Computer Science Applications
- Library and Information Sciences