Abstract
The Brascamp-Lieb inequality in functional analysis can be viewed as a measure of the 'uncorrelatedness' of a joint probability distribution. We define the smooth Brascamp-Lieb (BL) divergence as the infimum of the best constant in the Brascamp-Lieb inequality under a perturbation of the joint probability distribution. An information spectrum upper bound on the smooth BL divergence is proved, using properties of the subgradient of a certain convex functional. In particular, in the i.i.d. setting, this infimum converges to the best constant in a certain mutual information inequality. We then derive new single-shot converse bounds for the omniscient helper common randomness generation problem and the Gray-Wyner source coding problem in terms of the smooth BL divergence, where the proof relies on the functional formulation of the Brascamp-Lieb inequality. Exact second-order rates are thus obtained in the stationary memoryless and nonvanishing error setting. These results offer rare instances of strong converses and second-order converses for continuous sources when the rate region involves auxiliary random variables.
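For context, the classical Euclidean form of the Brascamp-Lieb inequality (a standard statement, not taken from this article) can be sketched as follows: given linear surjections $B_j \colon \mathbb{R}^n \to \mathbb{R}^{n_j}$ and exponents $c_j \ge 0$, the best constant $C$ is the smallest value for which the bound below holds for all nonnegative integrable $f_j$.

```latex
% Classical Brascamp-Lieb inequality (standard form; notation is
% illustrative and may differ from the article's conventions):
%   B_j : R^n -> R^{n_j}  linear surjections,  c_j >= 0 exponents.
\[
  \int_{\mathbb{R}^n} \prod_{j=1}^{m} f_j(B_j x)^{c_j} \, dx
  \;\le\;
  C \prod_{j=1}^{m} \left( \int_{\mathbb{R}^{n_j}} f_j(y) \, dy \right)^{c_j}.
\]
% The abstract's "best constant" refers to the least such C; the smooth
% BL divergence perturbs the underlying joint distribution before
% taking this best constant.
```

The abstract's probabilistic reading treats the best constant as a functional of a joint distribution, which is what the smoothing operation perturbs.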
| Original language | English (US) |
|---|---|
| Article number | 8896920 |
| Pages (from-to) | 704-721 |
| Number of pages | 18 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 66 |
| Issue number | 2 |
| State | Published - Feb 2020 |
| Externally published | Yes |
Keywords
- Brascamp-Lieb inequality
- coding theorems
- common randomness
- finite blocklength
- Gray-Wyner network
- hypercontractivity
- Shannon theory
- strong converse
ASJC Scopus subject areas
- Information Systems
- Computer Science Applications
- Library and Information Sciences