Novel lower bounds on the entropy rate of binary hidden Markov processes

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Recently, Samorodnitsky proved a strengthened version of Mrs. Gerber's Lemma, in which the output entropy of a binary symmetric channel is bounded in terms of the average entropy of the input projected onto a random subset of coordinates. Here, this result is applied to derive novel lower bounds on the entropy rate of binary hidden Markov processes. For symmetric underlying Markov processes, our bound improves upon the best known bound in the very noisy regime. The nonsymmetric case is also considered, and explicit bounds are derived for Markov processes that satisfy the (1, ∞)-RLL constraint.
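For context, a minimal sketch of the standard objects involved, as they are commonly defined in the literature (the notation below is illustrative and not taken verbatim from the paper): a binary hidden Markov process here is a stationary binary Markov chain observed through a binary symmetric channel with crossover probability p, and the classical Mrs. Gerber's Lemma of Wyner and Ziv is the bound that Samorodnitsky's result strengthens.

    Y_i = X_i \oplus Z_i, \qquad Z_i \sim \mathrm{Bern}(p)\ \text{i.i.d., independent of } (X_i)
    \bar{H}(Y) = \lim_{n \to \infty} \tfrac{1}{n} H(Y_1, \dots, Y_n) \qquad \text{(entropy rate)}
    \tfrac{1}{n} H(X^n \oplus Z^n) \;\ge\; h\!\left( h^{-1}\!\left(\tfrac{1}{n} H(X^n)\right) \star p \right), \qquad a \star p = a(1-p) + (1-a)p \qquad \text{(Mrs. Gerber's Lemma)}

where h(·) is the binary entropy function and h^{-1}(·) its inverse on [0, 1/2]. The (1, ∞)-RLL constraint mentioned in the abstract restricts the underlying Markov chain to sequences with no two consecutive 1s.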

Original language: English
Title of host publication: Proceedings - ISIT 2016; 2016 IEEE International Symposium on Information Theory
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 690-694
Number of pages: 5
ISBN (Electronic): 9781509018062
DOIs
State: Published - 10 Aug 2016
Externally published: Yes
Event: 2016 IEEE International Symposium on Information Theory, ISIT 2016 - Barcelona, Spain
Duration: 10 Jul 2016 – 15 Jul 2016

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings
Volume: 2016-August

Conference

Conference: 2016 IEEE International Symposium on Information Theory, ISIT 2016
Country/Territory: Spain
City: Barcelona
Period: 10/07/16 – 15/07/16

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Information Systems
  • Modelling and Simulation
  • Applied Mathematics
