TY - JOUR

T1 - Dissipation of information in channels with input constraints

AU - Polyanskiy, Yury

AU - Wu, Yihong

N1 - Funding Information:
Y. Polyanskiy was supported in part by the Center for Science of Information, a National Science Foundation (NSF) Science and Technology Center, under Grant CCF-09-39370 and in part by the NSF CAREER Award under Grant CCF-12-53205. Y. Wu was supported by NSF under Grant IIS-1447879 and Grant CCF-1423088. It is a pleasure to thank Max Raginsky (UIUC) for many helpful discussions and Flavio du Pin Calmon (MIT) for Proposition 12.
Publisher Copyright:
© 2015 IEEE.

PY - 2016/1/1

Y1 - 2016/1/1

N2 - One of the basic tenets of information theory, the data processing inequality, states that the output divergence does not exceed the input divergence for any channel. For channels without input constraints, various estimates on the amount of such contraction are known, Dobrushin's coefficient for the total variation being perhaps the best known. This paper investigates channels with an average input cost constraint. It is found that, while the contraction coefficient typically equals one (no contraction), the information nevertheless dissipates. A certain nonlinear function, the Dobrushin curve of the channel, is proposed to quantify the amount of dissipation. Tools for evaluating the Dobrushin curve of additive-noise channels are developed based on coupling arguments. Some basic applications in stochastic control, uniqueness of Gibbs measures, and fundamental limits of noisy circuits are discussed. As an application, it is shown that, in a chain of n power-constrained relays and Gaussian channels, the end-to-end mutual information and maximal squared correlation decay as Θ(log log n/log n), in stark contrast with the exponential decay in chains of discrete channels. Similarly, the behavior of noisy circuits (composed of gates with bounded fan-in) and broadcasting of information on trees (of bounded degree) exhibits no threshold behavior in the signal-to-noise ratio (SNR): unlike the case of discrete channels, the probability of bit error stays bounded away from 1/2 regardless of the SNR.

AB - One of the basic tenets of information theory, the data processing inequality, states that the output divergence does not exceed the input divergence for any channel. For channels without input constraints, various estimates on the amount of such contraction are known, Dobrushin's coefficient for the total variation being perhaps the best known. This paper investigates channels with an average input cost constraint. It is found that, while the contraction coefficient typically equals one (no contraction), the information nevertheless dissipates. A certain nonlinear function, the Dobrushin curve of the channel, is proposed to quantify the amount of dissipation. Tools for evaluating the Dobrushin curve of additive-noise channels are developed based on coupling arguments. Some basic applications in stochastic control, uniqueness of Gibbs measures, and fundamental limits of noisy circuits are discussed. As an application, it is shown that, in a chain of n power-constrained relays and Gaussian channels, the end-to-end mutual information and maximal squared correlation decay as Θ(log log n/log n), in stark contrast with the exponential decay in chains of discrete channels. Similarly, the behavior of noisy circuits (composed of gates with bounded fan-in) and broadcasting of information on trees (of bounded degree) exhibits no threshold behavior in the signal-to-noise ratio (SNR): unlike the case of discrete channels, the probability of bit error stays bounded away from 1/2 regardless of the SNR.

KW - Broadcasting on trees

KW - Contraction

KW - Dobrushin's coefficient

KW - Gibbs measure

KW - Noisy circuits

KW - Stochastic control

KW - Strong data processing inequalities

UR - http://www.scopus.com/inward/record.url?scp=84959184302&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84959184302&partnerID=8YFLogxK

U2 - 10.1109/TIT.2015.2482978

DO - 10.1109/TIT.2015.2482978

M3 - Article

AN - SCOPUS:84959184302

VL - 62

SP - 35

EP - 55

JO - IEEE Transactions on Information Theory

JF - IEEE Transactions on Information Theory

SN - 0018-9448

IS - 1

M1 - 7279116

ER -