Information Rates over Multi-View Channels

V. Arvind Rameshwar, Nir Weinberger

Research output: Contribution to journal › Article › peer-review

Abstract

We investigate the fundamental limits of reliable communication over multi-view channels, in which the channel output consists of a large number of independent noisy views of a transmitted symbol. We first consider the setting of multi-view discrete memoryless channels and then extend our results to general multi-view channels (using multi-letter formulas). We show that the channel capacity and dispersion of such multi-view channels converge exponentially fast in the number of views to the entropy and varentropy of the input distribution, respectively. We identify the exact rate of convergence as the smallest Chernoff information between two conditional distributions of the output, conditioned on unequal inputs. For the special case of the deletion channel, we compute upper bounds on this Chernoff information. Finally, we present a new channel model, which we term the Poisson approximation channel (of possible independent interest), whose capacity closely approximates the capacity of the multi-view binary symmetric channel for any fixed number of views.
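The convergence rate in the abstract is governed by the Chernoff information between the two output distributions induced by unequal inputs. As a minimal illustration (not taken from the paper itself), the sketch below computes this quantity by a grid search over the exponent parameter for a binary symmetric channel with an assumed crossover probability `p = 0.1`, and checks it against the known closed form for the symmetric case:

```python
import math

def chernoff_information(P, Q, grid=10000):
    """Chernoff information C(P, Q) = max over lam in (0, 1) of
    -log( sum_x P(x)^lam * Q(x)^(1 - lam) ),
    approximated here by a simple grid search over lam."""
    best = 0.0
    for i in range(1, grid):
        lam = i / grid
        s = sum(p ** lam * q ** (1 - lam) for p, q in zip(P, Q))
        best = max(best, -math.log(s))
    return best

# BSC with crossover probability p: the two conditional output
# distributions (given input 0 and input 1) are reflections of each other.
p = 0.1
P = [1 - p, p]  # output distribution given input 0
Q = [p, 1 - p]  # output distribution given input 1

c = chernoff_information(P, Q)

# For the BSC the optimal exponent is lam = 1/2 (Bhattacharyya bound is
# tight by symmetry), giving the closed form -log(2 * sqrt(p * (1 - p))).
closed_form = -math.log(2 * math.sqrt(p * (1 - p)))
```

Since the grid includes `lam = 1/2` exactly, the search recovers the symmetric-case value; for asymmetric distribution pairs the grid search gives an approximation whose accuracy improves with the grid resolution.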

Original language: English
Pages (from-to): 847-861
Number of pages: 15
Journal: IEEE Transactions on Information Theory
Volume: 71
Issue number: 2
DOIs
State: Published - 2025

Keywords

  • Chernoff information
  • DNA-based data storage
  • Multi-view channels

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences
