It is shown that, under suitable regularity conditions, differential entropy is an O(n)-Lipschitz function of probability distributions on ℝⁿ with respect to the quadratic Wasserstein distance. Under similar conditions, (discrete) Shannon entropy is shown to be O(n)-Lipschitz in distributions over the product space with respect to Ornstein's d̄-distance (the Wasserstein distance corresponding to the Hamming distance). These results, together with Talagrand's and Marton's transportation-information inequalities, allow one to replace the unknown multi-user interference with its independent and identically distributed (i.i.d.) approximation. As an application, a new outer bound for the two-user Gaussian interference channel is proved, which, in particular, settles the missing corner point problem of Costa (1985).
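The Lipschitz continuity claimed above can be illustrated in the simplest one-dimensional Gaussian case, where both the differential entropy and the quadratic Wasserstein distance have closed forms: h(N(μ, σ²)) = ½ log(2πeσ²) and W₂(N(μ₁, σ₁²), N(μ₂, σ₂²))² = (μ₁ − μ₂)² + (σ₁ − σ₂)². The sketch below (not the paper's proof technique, just a numerical check under these standard formulas) shows that for two nearby Gaussians with variances bounded away from zero, the entropy difference is controlled by the W₂ distance:

```python
import math

def gaussian_entropy(sigma):
    # Differential entropy of N(mu, sigma^2): h = 0.5 * log(2*pi*e*sigma^2)
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

def w2_gaussian(mu1, s1, mu2, s2):
    # Closed-form quadratic Wasserstein distance between 1-D Gaussians:
    # W2^2 = (mu1 - mu2)^2 + (s1 - s2)^2
    return math.sqrt((mu1 - mu2) ** 2 + (s1 - s2) ** 2)

# Two nearby Gaussians with standard deviations >= 1, so the map
# sigma -> h is 1-Lipschitz on this range (|dh/dsigma| = 1/sigma <= 1).
h_diff = abs(gaussian_entropy(1.0) - gaussian_entropy(1.1))
w2 = w2_gaussian(0.0, 1.0, 0.0, 1.1)
print(h_diff, w2)  # entropy gap is bounded by the W2 distance
```

Here the regularity condition of the theorem appears concretely as the lower bound on the standard deviation; without it (σ → 0), the entropy diverges while W₂ stays finite, so some such condition is genuinely needed.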
Keywords
- Interference channels
- Transportation-information inequality
- Wasserstein distance
ASJC Scopus subject areas
- Information Systems
- Computer Science Applications
- Library and Information Sciences