TY - GEN
T1 - Information limits for recovering a hidden community
AU - Hajek, Bruce
AU - Wu, Yihong
AU - Xu, Jiaming
N1 - Publisher Copyright:
© 2016 IEEE.
PY - 2016/8/10
Y1 - 2016/8/10
N2 - We study the problem of recovering a hidden community of cardinality K from an n × n symmetric data matrix A, where for distinct indices i, j, Aij ∼ P if i and j both belong to the community and Aij ∼ Q otherwise, for two known probability distributions P and Q depending on n. We focus on two types of asymptotic recovery guarantees as n → ∞: (1) weak recovery: the expected number of classification errors is o(K); (2) exact recovery: the probability of classifying all indices correctly converges to one. Under mild assumptions on P and Q, and allowing the community size to scale sublinearly with n, we derive a set of sufficient conditions and a set of necessary conditions for recovery, which are asymptotically tight with sharp constants. The results hold in particular for the Gaussian case (P = N(μ, 1) and Q = N(0, 1)), and for the case of bounded log-likelihood ratio, including the Bernoulli case (P = Bern(p) and Q = Bern(q)) whenever p/q and (1-p)/(1-q) are bounded away from zero and infinity. An important algorithmic implication is that, whenever exact recovery is information-theoretically possible, any algorithm that provides weak recovery when the community size is concentrated near K can be upgraded to achieve exact recovery in linear additional time by a simple voting procedure.
AB - We study the problem of recovering a hidden community of cardinality K from an n × n symmetric data matrix A, where for distinct indices i, j, Aij ∼ P if i and j both belong to the community and Aij ∼ Q otherwise, for two known probability distributions P and Q depending on n. We focus on two types of asymptotic recovery guarantees as n → ∞: (1) weak recovery: the expected number of classification errors is o(K); (2) exact recovery: the probability of classifying all indices correctly converges to one. Under mild assumptions on P and Q, and allowing the community size to scale sublinearly with n, we derive a set of sufficient conditions and a set of necessary conditions for recovery, which are asymptotically tight with sharp constants. The results hold in particular for the Gaussian case (P = N(μ, 1) and Q = N(0, 1)), and for the case of bounded log-likelihood ratio, including the Bernoulli case (P = Bern(p) and Q = Bern(q)) whenever p/q and (1-p)/(1-q) are bounded away from zero and infinity. An important algorithmic implication is that, whenever exact recovery is information-theoretically possible, any algorithm that provides weak recovery when the community size is concentrated near K can be upgraded to achieve exact recovery in linear additional time by a simple voting procedure.
UR - http://www.scopus.com/inward/record.url?scp=84986001236&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84986001236&partnerID=8YFLogxK
U2 - 10.1109/ISIT.2016.7541628
DO - 10.1109/ISIT.2016.7541628
M3 - Conference contribution
AN - SCOPUS:84986001236
T3 - IEEE International Symposium on Information Theory - Proceedings
SP - 1894
EP - 1898
BT - Proceedings - ISIT 2016; 2016 IEEE International Symposium on Information Theory
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2016 IEEE International Symposium on Information Theory, ISIT 2016
Y2 - 10 July 2016 through 15 July 2016
ER -