Simplified separation of information and communication

Anup Rao, Makrand Sinha

Research output: Contribution to journal › Article › peer-review


We give an example of a Boolean function whose information complexity is exponentially smaller than its communication complexity. Such a separation was first demonstrated by Ganor, Kol and Raz (J. ACM 2016); we give a simpler proof of the same result. In the course of this simplification, we make several new contributions: we introduce a new communication lower-bound technique, the notion of a fooling distribution, which allows us to separate information and communication complexity, and we give a more direct proof of the information-complexity upper bound. We also prove a generalization of Shearer’s Lemma that may be of independent interest. A version of Shearer’s original lemma bounds the expected mutual information between a subset of random variables and another random variable, when the subset is chosen independently of all the random variables involved. Our generalization allows some dependence between the random subset and the random variables, and still gives similar bounds with an appropriate error term.
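For context, a commonly stated entropy form of Shearer’s Lemma (a sketch of the classical statement; the paper’s generalization concerns mutual information and permits dependence between the subset and the variables) reads:

```latex
% Classical Shearer's Lemma (entropy form).
% Let $X_1,\dots,X_n$ be random variables and let $\mathcal{F}$ be a
% family of subsets of $\{1,\dots,n\}$ in which every index $i$ appears
% in at least $k$ members of $\mathcal{F}$. Then
\[
  H(X_1,\dots,X_n) \;\le\; \frac{1}{k} \sum_{S \in \mathcal{F}} H(X_S),
\]
% where $X_S = (X_i)_{i \in S}$ denotes the variables indexed by $S$.
```

Equivalently, if a uniformly random member $S$ of $\mathcal{F}$ covers each index with probability at least $k/|\mathcal{F}|$, the bound can be read as a statement about the expected entropy of a randomly chosen subset, which is the form the abstract’s generalization relaxes.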

Original language: English (US)
Pages (from-to): 1-29
Number of pages: 29
Journal: Theory of Computing
Issue number: 20
State: Published - 2018
Externally published: Yes


Keywords

  • Communication complexity
  • Fooling distribution
  • Information complexity
  • Separation of complexity measures

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computational Theory and Mathematics


