TY - JOUR
T1 - Data processing theorems and the second law of thermodynamics
AU - Merhav, Neri
N1 - Funding Information: Manuscript received July 12, 2010; revised March 18, 2011; accepted April 27, 2011. Date of current version July 29, 2011. This work was supported by the Israel Science Foundation (ISF) under Grant 208/08. The author is with the Department of Electrical Engineering, Technion, Israel Institute of Technology, Haifa, 32000, Israel (e-mail: [email protected]). Communicated by P. O. Vontobel, Associate Editor for Coding Techniques. Color versions of one or more of the figures in this paper are available online at http://ieeexplore.ieee.org. Digital Object Identifier 10.1109/TIT.2011.2159052
PY - 2011/8
Y1 - 2011/8
AB - We draw relationships between the generalized data processing theorems of Zakai and Ziv (1973 and 1975) and the dynamical version of the second law of thermodynamics, a.k.a. the Boltzmann H-Theorem, which asserts that the Shannon entropy, H(X_t), pertaining to a finite-state Markov process {X_t}, is monotonically nondecreasing as a function of time t, provided that the steady-state distribution of this process is uniform across the state space (which is the case when the process designates an isolated system). It turns out that both the generalized data processing theorems and the Boltzmann H-Theorem can be viewed as special cases of a more general principle concerning the monotonicity (in time) of a certain generalized information measure applied to a Markov process. This gives rise to a new look at the generalized data processing theorem, which suggests exploiting certain degrees of freedom that may lead to better bounds for a given choice of the convex function that defines the generalized mutual information. Indeed, we demonstrate an example of a joint source-channel coding setup in which this idea yields an improved lower bound on the distortion, relative to both the 1973 Ziv-Zakai lower bound and the lower bound obtained from the ordinary data processing theorem.
KW - Convexity
KW - H-theorem
KW - data processing inequality
KW - detailed balance
KW - perspective function
KW - thermodynamics
UR - http://www.scopus.com/inward/record.url?scp=79960994624&partnerID=8YFLogxK
U2 - 10.1109/TIT.2011.2159052
DO - 10.1109/TIT.2011.2159052
M3 - Article
SN - 0018-9448
VL - 57
SP - 4926
EP - 4939
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 8
M1 - 5961833
ER -