## Abstract

The data processing inequality, one of the basic tenets of information theory, states that the output divergence does not exceed the input divergence for any channel. For channels without input constraints, various estimates of the amount of such contraction are known, Dobrushin's coefficient for total variation being perhaps the best known. This paper investigates channels with an average input cost constraint. It is found that, while the contraction coefficient typically equals one (no contraction), the information nevertheless dissipates. A certain nonlinear function, the Dobrushin curve of the channel, is proposed to quantify the amount of dissipation. Tools for evaluating the Dobrushin curve of additive-noise channels are developed based on coupling arguments. Some basic applications in stochastic control, uniqueness of Gibbs measures, and fundamental limits of noisy circuits are discussed. As an application, it is shown that, in a chain of n power-constrained relays and Gaussian channels, the end-to-end mutual information and maximal squared correlation decay as Θ(log log n / log n), in stark contrast with the exponential decay in chains of discrete channels. Similarly, noisy circuits (composed of gates with bounded fan-in) and broadcasting of information on trees (of bounded degree) do not exhibit threshold behavior in the signal-to-noise ratio (SNR): unlike the case of discrete channels, the probability of bit error stays bounded away from 1/2 regardless of the SNR.
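For a finite-alphabet channel, the total-variation contraction coefficient mentioned above has a simple closed form: it is the maximal total-variation distance between any two rows of the channel's transition matrix. The following minimal sketch (the function name and the binary-symmetric-channel example are illustrative choices, not from the paper) computes it:

```python
import numpy as np

def dobrushin_coefficient(K):
    """Dobrushin's contraction coefficient for total variation:
    eta_TV(K) = max over input pairs (x, x') of
    (1/2) * sum_y |K[x, y] - K[x', y]|,
    where K is a row-stochastic matrix (rows = inputs, cols = outputs)."""
    n = K.shape[0]
    return max(
        0.5 * np.abs(K[i] - K[j]).sum()
        for i in range(n)
        for j in range(n)
    )

# Binary symmetric channel with crossover probability p:
p = 0.1
bsc = np.array([[1 - p, p], [p, 1 - p]])
eta = dobrushin_coefficient(bsc)  # equals |1 - 2p|, i.e. 0.8 here
```

For discrete channels this coefficient is strictly below one whenever no two rows are mutually singular, which drives the exponential decay in chains of discrete channels; the paper's point is that under an average power constraint the analogous coefficient is typically one, so a nonlinear Dobrushin curve is needed instead.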

| Original language | English (US) |
|---|---|
| Article number | 7279116 |
| Pages (from-to) | 35-55 |
| Number of pages | 21 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 62 |
| Issue number | 1 |
| DOIs | |
| State | Published - Jan 1 2016 |

## Keywords

- Broadcasting on trees
- Contraction
- Dobrushin's coefficient
- Gibbs measure
- Noisy circuits
- Stochastic control
- Strong data processing inequalities

## ASJC Scopus subject areas

- Information Systems
- Computer Science Applications
- Library and Information Sciences