Concentration of Measure Inequalities and Their Communication and Information-Theoretic Applications

Maxim Raginsky, Igal Sason

Research output: Contribution to journal › Article › peer-review

Abstract

During the last two decades, concentration of measure has been a subject of various exciting developments in convex geometry, functional analysis, statistical physics, high-dimensional statistics, probability theory, information theory, communications and coding theory, computer science, and learning theory. One common theme which emerges in these fields is probabilistic stability: complicated, nonlinear functions of a large number of independent or weakly dependent random variables often tend to concentrate sharply around their expected values. Information theory plays a key role in the derivation of concentration inequalities: the entropy method and the approach based on transportation-cost inequalities are two major information-theoretic paths toward proving concentration.
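
For illustration, a canonical instance of this concentration phenomenon is McDiarmid's bounded-differences inequality, which can be derived by the martingale method via the Azuma-Hoeffding inequality (a minimal sketch of the statement, with the symbols f, X_i, c_i and t introduced here only for this example): if X_1, ..., X_n are independent random variables and f satisfies |f(x) - f(x')| <= c_i whenever x and x' differ only in the i-th coordinate, then

\[
\Pr\bigl\{\bigl|f(X_1,\dots,X_n) - \mathbb{E}[f(X_1,\dots,X_n)]\bigr| \ge t\bigr\}
\;\le\; 2\exp\!\left(-\frac{2t^2}{\sum_{i=1}^{n} c_i^2}\right), \qquad t > 0.
\]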
This brief survey is based on a recent monograph by the authors in Foundations and Trends in Communications and Information Theory, and on a tutorial given by the authors at ISIT 2015. It introduces information theorists to three main techniques for deriving concentration inequalities: the martingale method, the entropy method, and transportation-cost inequalities. Some applications in information theory, communications, and coding theory are used to illustrate the main ideas.
Original language: American English
Pages (from-to): 24-35
Journal: IEEE Information Theory Society Newsletter
Volume: 65
Issue number: 4
State: Published - 2015
