TY - GEN
T1 - The Input and Output Entropies of the k-Deletion/Insertion Channel with Small Radii
AU - Singhvi, Shubhransh
AU - Sabary, Omer
AU - Bar-Lev, Daniella
AU - Yaakobi, Eitan
N1 - Publisher Copyright: © 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - The channel output entropy of a transmitted word is the entropy of the possible channel outputs, and similarly, the input entropy of a received word is the entropy of all possible transmitted words. The goal of this work is to study these entropy values for the k-deletion and k-insertion channels, in which exactly k symbols are deleted from, or inserted into, the transmitted word, respectively. If all possible words are transmitted with the same probability, then studying the input and output entropies is equivalent. For both the 1-insertion and 1-deletion channels, it is proved that among all words with a fixed number of runs, the input entropy is minimized for words with a skewed distribution of their run lengths and maximized for words with a balanced distribution of their run lengths. Among our results, we establish a conjecture by Atashpendar et al. which claims that for the binary 1-deletion channel, the input entropy is maximized for the alternating words. For the 2-deletion channel, it is proved that constant words with a single run minimize the input entropy.
AB - The channel output entropy of a transmitted word is the entropy of the possible channel outputs, and similarly, the input entropy of a received word is the entropy of all possible transmitted words. The goal of this work is to study these entropy values for the k-deletion and k-insertion channels, in which exactly k symbols are deleted from, or inserted into, the transmitted word, respectively. If all possible words are transmitted with the same probability, then studying the input and output entropies is equivalent. For both the 1-insertion and 1-deletion channels, it is proved that among all words with a fixed number of runs, the input entropy is minimized for words with a skewed distribution of their run lengths and maximized for words with a balanced distribution of their run lengths. Among our results, we establish a conjecture by Atashpendar et al. which claims that for the binary 1-deletion channel, the input entropy is maximized for the alternating words. For the 2-deletion channel, it is proved that constant words with a single run minimize the input entropy.
UR - http://www.scopus.com/inward/record.url?scp=85144595320&partnerID=8YFLogxK
U2 - 10.1109/ITW54588.2022.9965878
DO - 10.1109/ITW54588.2022.9965878
M3 - Conference contribution
T3 - 2022 IEEE Information Theory Workshop, ITW 2022
SP - 564
EP - 569
BT - 2022 IEEE Information Theory Workshop, ITW 2022
T2 - 2022 IEEE Information Theory Workshop, ITW 2022
Y2 - 1 November 2022 through 9 November 2022
ER -