Upper bounds on the relative entropy and Rényi divergence as a function of total variation distance for finite alphabets

Igal Sason, Sergio Verdú

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

A new upper bound on the relative entropy is derived as a function of the total variation distance for probability measures defined on a common finite alphabet. The bound improves upon a bound previously reported by Csiszár and Talata. It is further extended to an upper bound on the Rényi divergence of an arbitrary non-negative order (including ∞) as a function of the total variation distance.
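For context on the quantities the abstract relates, the following is a minimal Python sketch that computes the total variation distance and the relative entropy on a common finite alphabet and checks Pinsker's classical inequality (the lower-bound direction mentioned in the keywords). This is an illustration of the definitions only, not the paper's new upper bound; the distributions `p` and `q` are arbitrary examples.

```python
import numpy as np

def total_variation(p, q):
    # Total variation distance: (1/2) * sum_i |p_i - q_i|
    return 0.5 * np.sum(np.abs(p - q))

def relative_entropy(p, q):
    # D(P||Q) in nats; terms with p_i = 0 contribute 0.
    # Assumes q_i > 0 wherever p_i > 0 (otherwise D is infinite).
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Example distributions on a three-letter alphabet.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

tv = total_variation(p, q)
d = relative_entropy(p, q)

# Pinsker's inequality: D(P||Q) >= 2 * tv(P,Q)^2 (in nats).
assert d >= 2 * tv**2
```

The paper's contribution goes in the opposite direction: it upper-bounds the relative entropy (and, more generally, the Rényi divergence) in terms of the total variation distance, which is only possible on finite alphabets under suitable conditions on the measures.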

Original language: English
Title of host publication: ITW 2015 - 2015 IEEE Information Theory Workshop
Pages: 214-218
Number of pages: 5
ISBN (Electronic): 9781467378529
DOIs
State: Published - 17 Dec 2015
Event: IEEE Information Theory Workshop, ITW 2015 - Jeju Island, Korea, Republic of
Duration: 11 Oct 2015 - 15 Oct 2015

Publication series

Name: ITW 2015 - 2015 IEEE Information Theory Workshop

Conference

Conference: IEEE Information Theory Workshop, ITW 2015
Country/Territory: Korea, Republic of
City: Jeju Island
Period: 11/10/15 - 15/10/15

Keywords

  • Pinsker's inequality
  • Rényi divergence
  • relative entropy
  • relative information
  • total variation distance

All Science Journal Classification (ASJC) codes

  • Information Systems
