Mutual information, relative entropy, and estimation in the Poisson channel

Rami Atar, Tsachy Weissman

Research output: Contribution to journal › Article › peer-review

Abstract

Let X be a nonnegative random variable and let the conditional distribution of a random variable Y, given X, be Poisson(γ·X), for a parameter γ ≥ 0. We identify a natural loss function such that: 1) the derivative of the mutual information between X and Y with respect to γ is equal to the minimum mean loss in estimating X based on Y, regardless of the distribution of X; 2) when X ∼ P is estimated based on Y by a mismatched estimator that would have minimized the expected loss had X ∼ Q, the integral over all values of γ of the excess mean loss is equal to the relative entropy between P and Q. For a continuous-time setting where X is a nonnegative stochastic process and the conditional law of Y, given X, is that of a non-homogeneous Poisson process with intensity function γ·X, under the same loss function: 1) the minimum mean loss in causal filtering when γ = γ₀ is equal to the expected value of the minimum mean loss in noncausal filtering (smoothing) achieved with a channel whose parameter γ is uniformly distributed between 0 and γ₀. Bridging the two quantities is the mutual information between X and Y; 2) this relationship between the mean losses in causal and noncausal filtering also holds when the filters employed are mismatched, i.e., optimized assuming a law on X which is not the true one. Bridging the two quantities in this case is the sum of the mutual information and the relative entropy between the true and the mismatched distributions of Y. Thus, relative entropy quantifies the excess estimation loss due to mismatch in this setting. These results parallel those recently found for the Gaussian channel: the I-MMSE relationship of Guo et al., the relative entropy and mismatched estimation relationship of Verdú, and the relationship between causal and noncausal mismatched estimation of Weissman.
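
The scalar result 1) above lends itself to a quick numerical check. The following is a minimal sketch, not code from the paper: it assumes the loss ℓ(x, x̂) = x log(x/x̂) − x + x̂ (a natural candidate consistent with the known Poisson-channel derivative formula dI/dγ = E[X log X] − E[X̂ log X̂], where X̂ = E[X|Y]) and a hypothetical two-point prior on X, then compares a finite-difference estimate of dI/dγ against the minimum mean loss E[ℓ(X, E[X|Y])].

import math

XS = [1.0, 3.0]          # support of X (hypothetical toy prior)
PX = [0.5, 0.5]          # P(X = x)
YMAX = 200               # truncation of the Poisson output alphabet

def pois(y, lam):
    """Poisson(lam) pmf at y, computed in log space for stability."""
    return math.exp(-lam + y * math.log(lam) - math.lgamma(y + 1))

def mutual_info(gamma):
    """I(X; Y) in nats, for Y | X = x ~ Poisson(gamma * x)."""
    total = 0.0
    for y in range(YMAX):
        py = sum(p * pois(y, gamma * x) for x, p in zip(XS, PX))
        for x, p in zip(XS, PX):
            pyx = pois(y, gamma * x)
            if pyx > 0 and py > 0:
                total += p * pyx * math.log(pyx / py)
    return total

def min_mean_loss(gamma):
    """E[l(X, E[X|Y])] with l(x, xhat) = x log(x/xhat) - x + xhat."""
    total = 0.0
    for y in range(YMAX):
        joint = [p * pois(y, gamma * x) for x, p in zip(XS, PX)]
        py = sum(joint)
        if py == 0:
            continue
        xhat = sum(j * x for j, x in zip(joint, XS)) / py   # E[X | Y = y]
        total += sum(j * (x * math.log(x / xhat) - x + xhat)
                     for j, x in zip(joint, XS))
    return total

gamma, h = 2.0, 1e-4
di = (mutual_info(gamma + h) - mutual_info(gamma - h)) / (2 * h)
print(f"dI/dgamma ≈ {di:.6f}")
print(f"min loss  = {min_mean_loss(gamma):.6f}")  # should agree with dI/dgamma

Under these assumptions the two printed values agree to several decimal places, illustrating the claimed identity; the choice of prior and of γ is arbitrary, since the result holds regardless of the distribution of X.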

Original language: English
Article number: 6134671
Pages (from-to): 1302-1318
Number of pages: 17
Journal: IEEE Transactions on Information Theory
Volume: 58
Issue number: 3
State: Published - Mar 2012

Keywords

  • Causal estimation
  • Girsanov transformation
  • I-MMSE
  • Poisson channel
  • Shannon theory
  • divergence
  • mismatched estimation
  • mutual information
  • nonlinear filtering
  • point process
  • relative entropy
  • statistics

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences
