Mutual information, relative entropy, and estimation in the Poisson channel

Rami Atar, Tsachy Weissman

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Let X be a non-negative random variable and let the conditional distribution of a random variable Y, given X, be Poisson(γ · X), for a parameter γ ≥ 0. We identify a natural loss function such that:

  • The derivative of the mutual information between X and Y with respect to γ is equal to the minimum mean loss in estimating X based on Y, regardless of the distribution of X.
  • When X ∼ P is estimated based on Y by a mismatched estimator that would have minimized the expected loss had X ∼ Q, the integral over all values of γ of the excess mean loss is equal to the relative entropy between P and Q.

For a continuous-time setting where X^T = {X_t, 0 ≤ t ≤ T} is a non-negative stochastic process and the conditional law of Y^T = {Y_t, 0 ≤ t ≤ T}, given X^T, is that of a non-homogeneous Poisson process with intensity function γ · X^T, under the same loss function:

  • The minimum mean loss in causal filtering when γ = γ0 is equal to the expected value of the minimum mean loss in non-causal filtering (smoothing) achieved with a channel whose parameter γ is uniformly distributed between 0 and γ0. Bridging the two quantities is the mutual information between X^T and Y^T.
  • This relationship between the mean losses in causal and non-causal filtering also holds when the filters employed are mismatched, i.e., optimized assuming a law on X^T which is not the true one. Bridging the two quantities in this case is the sum of the mutual information and the relative entropy between the true and the mismatched distribution of X^T. Thus, relative entropy quantifies the excess estimation loss due to mismatch in this setting.

These results parallel those recently found for the Gaussian channel: the I-MMSE relationship of Guo, Shamai, and Verdú; the relative entropy and mismatched estimation relationship of Verdú; and the relationship between causal and non-causal mismatched estimation of Weissman.
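
For reference, the two scalar relationships above can be written compactly. This is a sketch, not taken verbatim from the paper: the explicit form of the loss function ℓ below is an assumption consistent with the abstract's claims, and X̂_P, X̂_Q denote the estimators that minimize the expected loss when X ∼ P and X ∼ Q, respectively.

```latex
% Assumed form of the natural loss function (a Bregman-type divergence):
\ell(x,\hat{x}) = x\log\frac{x}{\hat{x}} - x + \hat{x}
% Derivative of mutual information equals minimum mean loss:
\frac{\mathrm{d}}{\mathrm{d}\gamma}\, I(X; Y_\gamma)
  = \mathbb{E}\big[\ell\big(X,\ \mathbb{E}[X \mid Y_\gamma]\big)\big]
% Integrated excess loss of the mismatched estimator equals relative entropy:
\int_0^{\infty} \Big( \mathbb{E}\big[\ell\big(X, \hat{X}_Q(Y_\gamma)\big)\big]
  - \mathbb{E}\big[\ell\big(X, \hat{X}_P(Y_\gamma)\big)\big] \Big)\,\mathrm{d}\gamma
  = D(P \,\|\, Q), \qquad X \sim P
```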
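The derivative identity can also be checked numerically for a toy input. The sketch below is illustrative only: it assumes the loss function written above and a two-point input X ∈ {1, 2}, and it compares a finite-difference estimate of dI/dγ against the minimum mean loss at the same γ.

```python
import numpy as np
from scipy.stats import poisson

# Toy check of dI/dgamma = E[l(X, E[X|Y])] for the assumed loss
# l(x, xhat) = x*log(x/xhat) - x + xhat, with Y | X ~ Poisson(gamma * X).
xs = np.array([1.0, 2.0])   # support of X
px = np.array([0.5, 0.5])   # P(X = x)
ys = np.arange(200)         # truncated output alphabet; tail mass is negligible

def channel(gamma):
    """P(y|x) (rows indexed by x) and marginal P(y), on outputs with P(y) > 0."""
    pyx = np.array([poisson.pmf(ys, gamma * x) for x in xs])
    py = px @ pyx
    keep = py > 0
    return pyx[:, keep], py[keep]

def mutual_information(gamma):
    pyx, py = channel(gamma)
    ratio = np.where(pyx > 0, pyx / py, 1.0)  # convention 0 * log 0 = 0
    return float(np.sum(px[:, None] * pyx * np.log(ratio)))

def min_mean_loss(gamma):
    pyx, py = channel(gamma)
    xhat = (px * xs) @ pyx / py               # conditional mean E[X | Y = y]
    loss = xs[:, None] * np.log(xs[:, None] / xhat) - xs[:, None] + xhat
    return float(np.sum(px[:, None] * pyx * loss))

gamma, h = 2.0, 1e-4
d_info = (mutual_information(gamma + h) - mutual_information(gamma - h)) / (2 * h)
print(f"dI/dgamma ~ {d_info:.6f}")
print(f"min loss  = {min_mean_loss(gamma):.6f}")  # the two should agree closely
```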

Original language: English
Title of host publication: 2011 IEEE International Symposium on Information Theory Proceedings, ISIT 2011
Pages: 708-712
Number of pages: 5
DOIs
State: Published - 2011
Event: 2011 IEEE International Symposium on Information Theory Proceedings, ISIT 2011 - St. Petersburg, Russian Federation
Duration: 31 Jul 2011 – 5 Aug 2011

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings

Conference

Conference: 2011 IEEE International Symposium on Information Theory Proceedings, ISIT 2011
Country/Territory: Russian Federation
City: St. Petersburg
Period: 31/07/11 – 5/08/11

Keywords

  • Causal estimation
  • Divergence
  • Girsanov transformation
  • I-MMSE
  • Mismatched estimation
  • Mutual information
  • Nonlinear filtering
  • Point processes
  • Poisson channel
  • Relative entropy
  • Shannon theory
  • Statistics

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Information Systems
  • Modelling and Simulation
  • Applied Mathematics
