TY - GEN
T1 - Rate Distortion via Constrained Estimated Mutual Information Minimization
AU - Tsur, Dor
AU - Huleihel, Bashar
AU - Permuter, Haim
N1 - Publisher Copyright: © 2023 IEEE.
PY - 2023/1/1
Y1 - 2023/1/1
N2 - This paper proposes a novel methodology for estimating the rate-distortion function (RDF) in both continuous and discrete reconstruction spaces. The approach is input-space agnostic and requires no prior knowledge of the source distribution or the distortion function, i.e., it treats them as "black box" models. Our method is therefore a general solution to the RDF estimation problem. The approach leverages neural estimation and optimization of information measures to optimize a generative model of the input distribution. In continuous spaces we learn a sample-generating model, while for discrete spaces we propose a PMF model. Formal guarantees of the proposed method are explored and implementation details are discussed. We demonstrate the performance on both high-dimensional and large-alphabet synthetic data. This work has the potential to contribute to the fields of data compression and machine learning through the development of provably consistent and competitive compressors optimized for the fundamental limit of the RDF.
AB - This paper proposes a novel methodology for estimating the rate-distortion function (RDF) in both continuous and discrete reconstruction spaces. The approach is input-space agnostic and requires no prior knowledge of the source distribution or the distortion function, i.e., it treats them as "black box" models. Our method is therefore a general solution to the RDF estimation problem. The approach leverages neural estimation and optimization of information measures to optimize a generative model of the input distribution. In continuous spaces we learn a sample-generating model, while for discrete spaces we propose a PMF model. Formal guarantees of the proposed method are explored and implementation details are discussed. We demonstrate the performance on both high-dimensional and large-alphabet synthetic data. This work has the potential to contribute to the fields of data compression and machine learning through the development of provably consistent and competitive compressors optimized for the fundamental limit of the RDF.
UR - http://www.scopus.com/inward/record.url?scp=85171482040&partnerID=8YFLogxK
U2 - 10.1109/ISIT54713.2023.10206867
DO - 10.1109/ISIT54713.2023.10206867
M3 - Conference contribution
T3 - IEEE International Symposium on Information Theory - Proceedings
SP - 695
EP - 700
BT - 2023 IEEE International Symposium on Information Theory, ISIT 2023
T2 - 2023 IEEE International Symposium on Information Theory, ISIT 2023
Y2 - 25 June 2023 through 30 June 2023
ER -