TY - JOUR
T1 - Data-processing inequalities based on a certain structured class of information measures with application to estimation theory
AU - Merhav, Neri
N1 - Funding Information: Manuscript received September 25, 2011; revised April 12, 2012; accepted April 13, 2012. Date of publication May 1, 2012; date of current version July 10, 2012. This work was supported by the Israeli Science Foundation under Grant 208/08. The author is with the Department of Electrical Engineering, Technion—Israel Institute of Technology, Haifa 32000, Israel (e-mail: [email protected]). Communicated by D. Palomar, Associate Editor for Shannon Theory. Digital Object Identifier 10.1109/TIT.2012.2197175
PY - 2012
Y1 - 2012
N2 - We study data-processing inequalities that are derived from a certain class of generalized information measures, where a series of convex functions and multiplicative likelihood ratios is nested alternately. While these information measures can be viewed as a special case of the most general Zakai-Ziv generalized information measure, this special nested structure calls for attention and motivates our study. Specifically, a certain choice of the convex functions leads to an information measure that extends the notion of the Bhattacharyya distance (or the Chernoff divergence): While the ordinary Bhattacharyya distance is based on the (weighted) geometric mean of two replicas of the channel's conditional distribution, the more general information measure allows an arbitrary number of such replicas. We apply the data-processing inequality induced by this information measure to a detailed study of lower bounds of parameter estimation under additive white Gaussian noise (AWGN) and show that in certain cases, tighter bounds can be obtained by using more than two replicas. While the resulting lower bound may not compete favorably with the best bounds available for the ordinary AWGN channel, the advantage of the new lower bound, relative to the other bounds, becomes significant in the presence of channel uncertainty, like unknown fading. This different behavior in the presence of channel uncertainty is explained by the convexity property of the information measure.
KW - Bhattacharyya distance
KW - Chernoff divergence
KW - Gallager function
KW - data-processing inequality
KW - fading
KW - parameter estimation
UR - http://www.scopus.com/inward/record.url?scp=84863898151&partnerID=8YFLogxK
U2 - 10.1109/TIT.2012.2197175
DO - 10.1109/TIT.2012.2197175
M3 - Article
SN - 0018-9448
VL - 58
SP - 5287
EP - 5301
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 8
M1 - 6193211
ER -