On Rényi Entropy Power Inequalities

Eshed Ram, Igal Sason

Research output: Contribution to journal › Article › peer-review

Abstract

This paper gives improved Rényi entropy power inequalities (R-EPIs). Consider a sum $S_n = \sum_{k=1}^{n} X_k$ of $n$ independent continuous random vectors taking values in $\mathbb{R}^d$, and let $\alpha \in [1, \infty]$. An R-EPI provides a lower bound on the order-$\alpha$ Rényi entropy power of $S_n$ that, up to a multiplicative constant (which may depend in general on $n$, $\alpha$, $d$), is equal to the sum of the order-$\alpha$ Rényi entropy powers of the $n$ random vectors $\{X_k\}_{k=1}^{n}$. For $\alpha = 1$, the R-EPI coincides with the well-known entropy power inequality by Shannon. The first improved R-EPI is obtained by tightening the recent R-EPI by Bobkov and Chistyakov, which relies on the sharpened Young's inequality. A further improvement of the R-EPI also relies on convex optimization and results on rank-one modification of a real-valued diagonal matrix.
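As a hedged illustration (not taken from the paper), the sketch below checks the stated form of an R-EPI, $N_\alpha(S_n) \ge c \sum_{k=1}^{n} N_\alpha(X_k)$ with a constant $c = c(n, \alpha, d)$, on independent Gaussian vectors, for which the order-$\alpha$ Rényi entropy has a closed form. It assumes the normalization $N_\alpha(X) = \exp\big(\tfrac{2}{d} h_\alpha(X)\big)$; conventions that include an extra $1/(2\pi e)$ factor rescale both sides identically. For Gaussians the two sides coincide, so the check only illustrates the shape of the bound, not the constants derived in the paper.

```python
# Minimal numerical sketch (illustrative only, not the paper's method):
# compare N_alpha(S_n) with sum_k N_alpha(X_k) for independent Gaussian
# vectors X_k ~ N(0, sigma_k^2 I_d), where S_n = X_1 + ... + X_n.
# Assumed normalization: N_alpha(X) = exp(2 * h_alpha(X) / d).

import math

def renyi_entropy_gaussian(sigma2: float, d: int, alpha: float) -> float:
    """Order-alpha Renyi (differential) entropy of N(0, sigma2 * I_d)."""
    if alpha == 1.0:  # Shannon limit of the Renyi entropy
        return 0.5 * d * math.log(2 * math.pi * math.e * sigma2)
    return 0.5 * d * math.log(2 * math.pi * sigma2) \
        + 0.5 * d * math.log(alpha) / (alpha - 1)

def renyi_entropy_power(h_alpha: float, d: int) -> float:
    """Entropy power N_alpha = exp(2 * h_alpha / d) under the assumed normalization."""
    return math.exp(2 * h_alpha / d)

d = 3
variances = [0.5, 1.0, 2.5]      # independent X_k ~ N(0, sigma_k^2 I_d)
sum_variance = sum(variances)    # S_n is Gaussian with the summed covariance

for alpha in (1.0, 2.0, 5.0):
    lhs = renyi_entropy_power(renyi_entropy_gaussian(sum_variance, d, alpha), d)
    rhs = sum(renyi_entropy_power(renyi_entropy_gaussian(s2, d, alpha), d)
              for s2 in variances)
    # For Gaussians both sides are equal, so any R-EPI with constant c <= 1 holds here.
    print(f"alpha={alpha}: N_alpha(S_n)={lhs:.4f}, sum_k N_alpha(X_k)={rhs:.4f}")
```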

Original language: English
Article number: 7587398
Pages (from-to): 6800-6815
Number of pages: 16
Journal: IEEE Transactions on Information Theory
Volume: 62
Issue number: 12
DOIs
State: Published - Dec 2016

Keywords

  • Rényi entropy
  • Rényi entropy power
  • entropy power inequality

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences
