Local-optimality guarantees for optimal decoding based on paths

Nissim Halabi, Guy Even

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

This paper presents a unified analysis framework that captures recent advances in the study of local-optimality characterizations for codes on graphs. These local-optimality characterizations are based on combinatorial structures embedded in the Tanner graph of the code. Local-optimality implies both maximum-likelihood (ML) optimality and linear-programming (LP) decoding optimality. Also, an iterative message-passing decoding algorithm is guaranteed to find the unique locally-optimal codeword, if one exists. We demonstrate this proof technique by considering a definition of local optimality that is based on the simplest combinatorial structures in Tanner graphs, namely, paths of length h. We apply the technique of local optimality to a family of Tanner codes. Inverse polynomial bounds in the code length are proved on the word error probability of LP-decoding for this family of Tanner codes.
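To make the objects in the abstract concrete, the sketch below (illustrative only, not the paper's algorithm) builds the Tanner graph of a small binary linear code from a parity-check matrix and performs brute-force maximum-likelihood decoding over a binary symmetric channel; local-optimality certificates of the kind studied in the paper guarantee agreement with this ML decoder. The `[7,4]` Hamming code used here is an assumed toy example.

```python
from itertools import product

# Parity-check matrix H of the [7,4] Hamming code (toy example).
H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def tanner_graph(H):
    """Adjacency lists of the Tanner graph: variable node i is joined
    to check node j iff H[j][i] == 1."""
    variables = {i: [j for j, row in enumerate(H) if row[i]]
                 for i in range(len(H[0]))}
    checks = {j: [i for i, bit in enumerate(row) if bit]
              for j, row in enumerate(H)}
    return variables, checks

def is_codeword(H, x):
    """x is a codeword iff every parity check is satisfied (H x = 0 mod 2)."""
    return all(sum(h * xi for h, xi in zip(row, x)) % 2 == 0 for row in H)

def ml_decode_bsc(H, y):
    """Brute-force ML decoding over a BSC: return the codeword closest
    to the received word y in Hamming distance."""
    n = len(H[0])
    best, best_dist = None, n + 1
    for x in product((0, 1), repeat=n):
        if is_codeword(H, x):
            d = sum(a != b for a, b in zip(x, y))
            if d < best_dist:
                best, best_dist = list(x), d
    return best

# Codeword 1010101 with its last bit flipped; ML decoding corrects it.
received = [1, 0, 1, 0, 1, 0, 0]
print(ml_decode_bsc(H, received))  # → [1, 0, 1, 0, 1, 0, 1]
```

Brute-force search is exponential in the code length; the point of local-optimality certificates, and of the LP and message-passing decoders the paper analyzes, is to reach the ML answer without this enumeration.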

Original language: English
Title of host publication: 2012 7th International Symposium on Turbo Codes and Iterative Information Processing, ISTC 2012
Pages: 205-209
Number of pages: 5
DOIs
State: Published - 2012
Event: 2012 7th International Symposium on Turbo Codes and Iterative Information Processing, ISTC 2012 - Gothenburg, Sweden
Duration: 27 Aug 2012 - 31 Aug 2012

Publication series

Name: International Symposium on Turbo Codes and Iterative Information Processing, ISTC

Conference

Conference: 2012 7th International Symposium on Turbo Codes and Iterative Information Processing, ISTC 2012
Country/Territory: Sweden
City: Gothenburg
Period: 27/08/12 - 31/08/12

All Science Journal Classification (ASJC) codes

  • Computational Theory and Mathematics
  • Information Systems
  • Theoretical Computer Science
