TY - JOUR
T1 - Communication and Randomness Lower Bounds for Secure Computation
AU - Data, Deepesh
AU - Prabhakaran, Vinod M.
AU - Prabhakaran, Manoj M.
N1 - Funding Information:
NSF under Grant 12-28856.
Publisher Copyright:
© 1963-2012 IEEE.
PY - 2016/7
Y1 - 2016/7
N2 - In secure multiparty computation (MPC), mutually distrusting users collaborate to compute a function of their private data without revealing any additional information about their data to the other users. While it is known that information-theoretically secure MPC is possible among n users who have access to private randomness and are pairwise connected by secure, noiseless, bidirectional links, against the collusion of fewer than n/2 users (in the honest-but-curious model; the threshold is n/3 in the malicious model), relatively little is known about the communication and randomness complexity of secure computation, i.e., the amount of communication and randomness required to compute securely. In this paper, we employ information-theoretic techniques to obtain lower bounds on the communication and randomness complexity of secure MPC. We restrict ourselves to a concrete interactive setting involving three users in which all functions are securely computable against corruption of individual users in the honest-but-curious model. We derive lower bounds for both perfect security (i.e., zero error and no leakage of information) and asymptotic security (where the probability of error and the information leakage vanish as the block length goes to infinity). Our techniques include the use of a data processing inequality for residual information (i.e., the gap between mutual information and Gács-Körner common information), a new information inequality for three-user protocols, and the idea of distribution switching, by which lower bounds computed under certain worst-case scenarios can be shown to apply in general. Our lower bounds are shown to be tight for various functions of interest. In particular, we show concrete functions that have communication-ideal protocols, i.e., protocols that achieve the minimum communication simultaneously on all links in the network. Also, we obtain the first explicit example of a function that incurs a higher communication cost than the input length in the secure computation model of Feige et al. (26th Annual ACM Symposium on Theory of Computing, 1994), who had shown that such functions exist. We also show that our communication bounds imply tight lower bounds on the amount of randomness required by MPC protocols for many interesting functions.
AB - In secure multiparty computation (MPC), mutually distrusting users collaborate to compute a function of their private data without revealing any additional information about their data to the other users. While it is known that information-theoretically secure MPC is possible among n users who have access to private randomness and are pairwise connected by secure, noiseless, bidirectional links, against the collusion of fewer than n/2 users (in the honest-but-curious model; the threshold is n/3 in the malicious model), relatively little is known about the communication and randomness complexity of secure computation, i.e., the amount of communication and randomness required to compute securely. In this paper, we employ information-theoretic techniques to obtain lower bounds on the communication and randomness complexity of secure MPC. We restrict ourselves to a concrete interactive setting involving three users in which all functions are securely computable against corruption of individual users in the honest-but-curious model. We derive lower bounds for both perfect security (i.e., zero error and no leakage of information) and asymptotic security (where the probability of error and the information leakage vanish as the block length goes to infinity). Our techniques include the use of a data processing inequality for residual information (i.e., the gap between mutual information and Gács-Körner common information), a new information inequality for three-user protocols, and the idea of distribution switching, by which lower bounds computed under certain worst-case scenarios can be shown to apply in general. Our lower bounds are shown to be tight for various functions of interest. In particular, we show concrete functions that have communication-ideal protocols, i.e., protocols that achieve the minimum communication simultaneously on all links in the network. Also, we obtain the first explicit example of a function that incurs a higher communication cost than the input length in the secure computation model of Feige et al. (26th Annual ACM Symposium on Theory of Computing, 1994), who had shown that such functions exist. We also show that our communication bounds imply tight lower bounds on the amount of randomness required by MPC protocols for many interesting functions.
UR - http://www.scopus.com/inward/record.url?scp=84976566211&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84976566211&partnerID=8YFLogxK
U2 - 10.1109/TIT.2016.2568207
DO - 10.1109/TIT.2016.2568207
M3 - Article
AN - SCOPUS:84976566211
SN - 0018-9448
VL - 62
SP - 3901
EP - 3929
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 7
M1 - 7469867
ER -