Concentration of measure inequalities in information theory, communications, and coding

Maxim Raginsky, Igal Sason

Research output: Contribution to journal › Article

Abstract

During the last two decades, concentration inequalities have been the subject of exciting developments in various areas, including convex geometry, functional analysis, statistical physics, high-dimensional statistics, pure and applied probability theory (e.g., concentration of measure phenomena in random graphs, random matrices, and percolation), information theory, theoretical computer science, and learning theory. This monograph focuses on some of the key modern mathematical tools that are used for the derivation of concentration inequalities, on their links to information theory, and on their various applications to communications and coding. In addition to being a survey, this monograph includes new results derived by the authors. The first part of the monograph introduces classical concentration inequalities for martingales, as well as some recent refinements and extensions. The power and versatility of the martingale approach are exemplified in the context of codes defined on graphs and iterative decoding algorithms, as well as codes for wireless communication. The second part of the monograph introduces the entropy method, an information-theoretic technique for deriving concentration inequalities. The basic ingredients of the entropy method are discussed first in the context of logarithmic Sobolev inequalities, which underlie the so-called functional approach to concentration of measure, and then from a complementary information-theoretic viewpoint based on transportation-cost inequalities and probability in metric spaces. Some representative results on concentration for dependent random variables are briefly summarized, with emphasis on their connections to the entropy method. Finally, we discuss several applications of the entropy method to problems in communications and coding, including strong converses, empirical distributions of good channel codes, and an information-theoretic converse for concentration of measure.
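
As a point of reference, a representative classical martingale result of the kind surveyed in the first part is the Azuma–Hoeffding inequality: if $\{X_k\}_{k=0}^{n}$ is a martingale with bounded differences $|X_k - X_{k-1}| \le d_k$ almost surely, then

$$\Pr\bigl(|X_n - X_0| \ge r\bigr) \le 2\exp\!\left(-\frac{r^2}{2\sum_{k=1}^{n} d_k^2}\right), \qquad r \ge 0.$$

A representative ingredient of the entropy method treated in the second part is the Gaussian logarithmic Sobolev inequality, $\mathrm{Ent}_{\gamma_n}(f^2) \le 2\,\mathbb{E}_{\gamma_n}\bigl[\|\nabla f\|^2\bigr]$ for the standard Gaussian measure $\gamma_n$ on $\mathbb{R}^n$ and smooth $f$, which, via the Herbst argument, yields the Gaussian concentration bound $\Pr(f - \mathbb{E}f \ge r) \le \exp\!\bigl(-r^2/(2L^2)\bigr)$ for every $L$-Lipschitz function $f$ and all $r \ge 0$.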

Original language: English (US)
Pages (from-to): 1-250
Number of pages: 250
Journal: Foundations and Trends in Communications and Information Theory
Volume: 10
Issue number: 1-2
DOIs
State: Published - Nov 29 2013

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Applied Mathematics
