Abstract
It is shown that, under suitable regularity conditions, differential entropy is O(√n)-Lipschitz as a function of probability distributions on ℝⁿ with respect to the quadratic Wasserstein distance. Under similar conditions, (discrete) Shannon entropy is shown to be O(n)-Lipschitz in distributions over the product space with respect to Ornstein's d̄-distance (the Wasserstein distance corresponding to the Hamming distance). These results, together with Talagrand's and Marton's transportation-information inequalities, allow one to replace the unknown multi-user interference with its independent identically distributed approximations. As an application, a new outer bound for the two-user Gaussian interference channel is proved, which, in particular, settles the missing corner point problem of Costa (1985).
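For orientation, the sketch below gives a schematic rendering of the quantities named in the abstract: the quadratic Wasserstein distance, the general form of the Lipschitz continuity statement, and the standard Talagrand and Marton transportation-information inequalities. The constant C and the exact regularity conditions are as stated in the paper itself; the continuity bound here is only an informal paraphrase, not the paper's precise theorem.

```latex
% Quadratic (order-2) Wasserstein distance between distributions P, Q on R^n,
% taken over all couplings of a pair (X, Y) with marginals P and Q:
\[
  W_2(P,Q) \;=\; \inf_{X \sim P,\; Y \sim Q} \bigl( \mathbb{E}\,\|X-Y\|_2^2 \bigr)^{1/2}
\]

% Schematic form of the differential-entropy continuity result
% (regularity conditions and the constant C are as in the paper):
\[
  |h(P) - h(Q)| \;\le\; C \sqrt{n}\, W_2(P,Q)
\]

% Talagrand's transportation-information inequality for the standard
% Gaussian measure \gamma_n = N(0, I_n), which converts relative entropy
% into a W_2 bound:
\[
  W_2(P, \gamma_n)^2 \;\le\; 2\, D(P \,\|\, \gamma_n)
\]

% Marton's inequality for a product measure Q on a discrete alphabet,
% with \bar{d} the normalized-Hamming Wasserstein (Ornstein) distance:
\[
  \bar{d}(P, Q)^2 \;\le\; \frac{1}{2n}\, D(P \,\|\, Q)
\]
```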
| Original language | English (US) |
| --- | --- |
| Article number | 7467523 |
| Pages (from-to) | 3992-4002 |
| Number of pages | 11 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 62 |
| Issue number | 7 |
| DOIs | |
| State | Published - Jul 2016 |
Keywords
- Entropy
- Interference channels
- Transport information inequality
- Wasserstein distance
ASJC Scopus subject areas
- Information Systems
- Computer Science Applications
- Library and Information Sciences