Abstract
This paper provides upper and lower bounds on the optimal guessing moments of a random variable taking values in a finite set when side information may be available. These moments quantify the number of guesses required to correctly identify the unknown object and, like Arikan's bounds, they are expressed in terms of the Arimoto-Rényi conditional entropy. Although Arikan's bounds are asymptotically tight, the bounds in this paper improve on them significantly in the non-asymptotic regime. Relationships between moments of the optimal guessing function and the MAP error probability are also established, characterizing the exact locus of their attainable values. The bounds on optimal guessing moments serve to improve non-asymptotic bounds on the cumulant generating function of the codeword lengths for fixed-to-variable optimal lossless source coding without prefix constraints. Non-asymptotic bounds on the reliability function of discrete memoryless sources are derived as well. Relying on these techniques, lower bounds on the cumulant generating function of the codeword lengths are derived, by means of the smooth Rényi entropy, for source codes that allow decoding errors.
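For orientation, and as an editorial sketch rather than part of the published record: the Arikan bounds referenced above relate the ρ-th moment of the optimal (MAP-ordered) guessing function G*(X|Y) to the Arimoto-Rényi conditional entropy H_α(X|Y) of order α = 1/(1+ρ), with M denoting the size of the alphabet of X. A commonly cited form of those 1996 bounds is

```latex
% Sketch of Arikan's classical bounds (context only; the paper above derives
% tighter non-asymptotic bounds). Notation assumed here:
%   G^*(X|Y)        : optimal (MAP-ordered) guessing function
%   M               : size of the alphabet of X
%   H_\alpha(X|Y)   : Arimoto-Rényi conditional entropy of order \alpha
\[
  (1+\ln M)^{-\rho}\,
  \exp\!\Bigl(\rho\, H_{\frac{1}{1+\rho}}(X \mid Y)\Bigr)
  \;\le\;
  \mathbb{E}\!\left[G^{*}(X \mid Y)^{\rho}\right]
  \;\le\;
  \exp\!\Bigl(\rho\, H_{\frac{1}{1+\rho}}(X \mid Y)\Bigr),
  \qquad \rho > 0.
\]
```

The two sides differ by the factor (1 + ln M)^ρ, which vanishes on the exponential scale as the alphabet grows; this is why the gap matters mainly for finite alphabets, the non-asymptotic regime addressed by the paper.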
| Original language | English |
| --- | --- |
| Pages (from-to) | 4323-4346 |
| Number of pages | 24 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 64 |
| Issue number | 6 |
| DOIs | |
| State | Published - Jun 2018 |
Keywords
- Arimoto-Rényi conditional entropy
- Cumulant generating function
- M-ary hypothesis testing
- Rényi divergence
- Rényi entropy
- guessing moments
- lossless source coding
- smooth Rényi entropy
All Science Journal Classification (ASJC) codes
- Information Systems
- Computer Science Applications
- Library and Information Sciences