Smoothing Brascamp-Lieb Inequalities and Strong Converses of Coding Theorems

Jingbo Liu, Thomas A. Courtade, Paul Cuff, Sergio Verdú

Research output: Contribution to journal › Article › peer-review


The Brascamp-Lieb inequality in functional analysis can be viewed as a measure of the 'uncorrelatedness' of a joint probability distribution. We define the smooth Brascamp-Lieb (BL) divergence as the infimum of the best constant in the Brascamp-Lieb inequality under a perturbation of the joint probability distribution. An information-spectrum upper bound on the smooth BL divergence is proved, using properties of the subgradient of a certain convex functional. In particular, in the i.i.d. setting, this infimum converges to the best constant in a certain mutual information inequality. We then derive new single-shot converse bounds for the omniscient-helper common randomness generation problem and the Gray-Wyner source coding problem in terms of the smooth BL divergence, where the proof relies on the functional formulation of the Brascamp-Lieb inequality. Exact second-order rates are thus obtained in the stationary memoryless setting with nonvanishing error probability. These results offer rare instances of strong converses and second-order converses for continuous sources when the rate region involves auxiliary random variables.
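For readers unfamiliar with the inequality named in the abstract, a standard statement of the Euclidean Brascamp-Lieb inequality and its entropic dual (due to Carlen and Cordero-Erausquin) is sketched below; this is classical background, not a formula taken from the paper itself, and the notation ($B_j$, $c_j$, $C$) is chosen here for illustration:

```latex
% Brascamp-Lieb inequality: for linear maps B_j : R^n -> R^{n_j},
% exponents c_j >= 0, and nonnegative integrable functions f_j,
\int_{\mathbb{R}^n} \prod_{j=1}^m f_j(B_j x)^{c_j}\, dx
  \;\le\; C \prod_{j=1}^m \left( \int_{\mathbb{R}^{n_j}} f_j \right)^{c_j}.

% Entropic dual: the best (smallest) constant C satisfies
\log C \;=\; \sup_{X} \left[ \sum_{j=1}^m c_j\, h(B_j X) \;-\; h(X) \right],
% where h(.) is differential entropy and the supremum runs over
% random vectors X on R^n with finite differential entropies.
```

The paper's smooth BL divergence can be read against this background: it perturbs the underlying joint distribution and tracks how the best constant degrades, which is what enables the single-shot converse bounds described in the abstract.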

Original language: English (US)
Article number: 8896920
Pages (from-to): 704-721
Number of pages: 18
Journal: IEEE Transactions on Information Theory
Issue number: 2
State: Published - Feb 2020
Externally published: Yes

Keywords
  • Brascamp-Lieb inequality
  • coding theorems
  • common randomness
  • finite blocklength
  • Gray-Wyner network
  • hypercontractivity
  • Shannon theory
  • strong converse

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences

