TY - JOUR

T1 - Information-theoretic lower bounds for distributed function computation

AU - Xu, Aolin

AU - Raginsky, Maxim

N1 - Funding Information:
This work was supported in part by the NSF under grant CCF-1017564, in part by the CAREER Award under grant CCF-1254041, in part by the Center for Science of Information, an NSF Science and Technology Center, under grant CCF-0939370, and in part by ONR under grant N00014-12-1-0998.
Publisher Copyright:
© 1963-2012 IEEE.

PY - 2017/4

Y1 - 2017/4

N2 - We derive information-theoretic converses (i.e., lower bounds) for the minimum time required by any algorithm for distributed function computation over a network of point-to-point channels with finite capacity, where each node of the network initially has a random observation and aims to compute a common function of all observations to a given accuracy with a given confidence by exchanging messages with its neighbors. We obtain the lower bounds on computation time by examining the conditional mutual information between the actual function value and its estimate at an arbitrary node, given the observations in an arbitrary subset of nodes containing that node. The main contributions include the following. First, a lower bound on the conditional mutual information via so-called small ball probabilities, which captures the dependence of the computation time on the joint distribution of the observations at the nodes, the structure of the function, and the accuracy requirement. For linear functions, the small ball probability can be expressed by Lévy concentration functions of sums of independent random variables, for which tight estimates are available that lead to strict improvements over existing lower bounds on computation time. Second, an upper bound on the conditional mutual information via strong data processing inequalities, which complements and strengthens existing cutset-capacity upper bounds. Finally, a multi-cutset analysis that quantifies the loss (dissipation) of the information needed for computation as it flows across a succession of cutsets in the network. This analysis is based on reducing a general network to a line network with bidirectional links and self-links, and the results highlight the dependence of the computation time on the diameter of the network, a fundamental parameter that is missing from most of the existing lower bounds on computation time.

AB - We derive information-theoretic converses (i.e., lower bounds) for the minimum time required by any algorithm for distributed function computation over a network of point-to-point channels with finite capacity, where each node of the network initially has a random observation and aims to compute a common function of all observations to a given accuracy with a given confidence by exchanging messages with its neighbors. We obtain the lower bounds on computation time by examining the conditional mutual information between the actual function value and its estimate at an arbitrary node, given the observations in an arbitrary subset of nodes containing that node. The main contributions include the following. First, a lower bound on the conditional mutual information via so-called small ball probabilities, which captures the dependence of the computation time on the joint distribution of the observations at the nodes, the structure of the function, and the accuracy requirement. For linear functions, the small ball probability can be expressed by Lévy concentration functions of sums of independent random variables, for which tight estimates are available that lead to strict improvements over existing lower bounds on computation time. Second, an upper bound on the conditional mutual information via strong data processing inequalities, which complements and strengthens existing cutset-capacity upper bounds. Finally, a multi-cutset analysis that quantifies the loss (dissipation) of the information needed for computation as it flows across a succession of cutsets in the network. This analysis is based on reducing a general network to a line network with bidirectional links and self-links, and the results highlight the dependence of the computation time on the diameter of the network, a fundamental parameter that is missing from most of the existing lower bounds on computation time.

KW - Distributed function computation

KW - Lévy concentration function

KW - computation time

KW - cutset bound

KW - multi-cutset analysis

KW - small ball probability

KW - strong data processing inequality

UR - http://www.scopus.com/inward/record.url?scp=85017621932&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85017621932&partnerID=8YFLogxK

U2 - 10.1109/TIT.2017.2664813

DO - 10.1109/TIT.2017.2664813

M3 - Article

AN - SCOPUS:85017621932

SN - 0018-9448

VL - 63

SP - 2314

EP - 2337

JO - IEEE Transactions on Information Theory

JF - IEEE Transactions on Information Theory

IS - 4

M1 - 7843608

ER -