Abstract
This work provides tight bounds on the Rényi entropy of a
function of a discrete random variable with a finite number of possible
values, where the considered function is not one-to-one. To that end,
a tight lower bound on the Rényi entropy of a discrete random variable
with a finite support is derived as a function of the size of the support,
and the ratio of the maximal to minimal probability masses. This work
was inspired by the recently published paper by Cicalese et al., which is
focused on the Shannon entropy, and it strengthens and generalizes the
results of that paper to Rényi entropies of arbitrary positive orders. In
view of these generalized bounds and the works by Arikan and Campbell,
non-asymptotic bounds are derived for guessing moments and lossless data
compression of discrete memoryless sources. This talk is based on the
recently published paper: I. Sason, "Tight bounds on the Rényi entropy
via majorization with applications to guessing and compression," Entropy
(special issue on Probabilistic Methods in Information Theory, Hypothesis
Testing, and Coding), vol. 20, no. 12, paper 896, pp. 1–25, November
2018.
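For context, recall the standard definition of the quantity the abstract bounds: the Rényi entropy of order $\alpha$ of a discrete random variable $X$ with probability masses $p_1, \ldots, p_n$ is

$$
H_\alpha(X) = \frac{1}{1-\alpha} \, \log \sum_{i=1}^{n} p_i^{\alpha},
\qquad \alpha > 0, \; \alpha \neq 1,
$$

which recovers the Shannon entropy $H(X) = -\sum_{i=1}^{n} p_i \log p_i$ in the limit $\alpha \to 1$. The bounds in the talk are stated in terms of the support size $n$ and the ratio $\max_i p_i / \min_i p_i$.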
| Original language | American English |
|---|---|
| State | Published - 2019 |
| Event | Prague Stochastics, Prague, 19 Aug 2019 → 23 Aug 2019, http://simu0292.utia.cas.cz/pragstoch2019/ |
Conference

| Conference | Prague Stochastics |
|---|---|
| City | Prague |
| Period | 19/08/19 → 23/08/19 |
| Internet address | http://simu0292.utia.cas.cz/pragstoch2019/ |