## Abstract

We give an example of a Boolean function whose information complexity is exponentially smaller than its communication complexity. Such a separation was first demonstrated by Ganor, Kol and Raz (J. ACM 2016); we give a simpler proof of the same result. In the course of this simplification we make several new contributions: we introduce a new communication lower-bound technique, the notion of a *fooling distribution*, which allows us to separate information and communication complexity, and we give a more direct proof of the information-complexity upper bound. We also prove a generalization of Shearer's Lemma that may be of independent interest. A version of Shearer's original lemma bounds the expected mutual information between a subset of random variables and another random variable, when the subset is chosen independently of all the random variables involved. Our generalization allows some dependence between the random subset and the random variables, and still yields similar bounds with an appropriate error term.
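For context, the classical entropy form of Shearer's Lemma (a standard statement from information theory, not the paper's generalized version) reads:

```latex
Let $X_1,\dots,X_n$ be random variables and let $S_1,\dots,S_m \subseteq [n]$
be fixed sets such that every index $i \in [n]$ belongs to at least $k$ of them.
Then
\[
  H(X_1,\dots,X_n) \;\le\; \frac{1}{k} \sum_{j=1}^{m} H\bigl(X_{S_j}\bigr),
\]
where $X_{S_j}$ denotes the tuple $(X_i)_{i \in S_j}$.
```

The paper's contribution, as described above, is a version of such a bound in which the random subset may depend on the variables themselves, at the cost of an error term.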

Original language | English (US)
---|---
Pages (from-to) | 1-29
Number of pages | 29
Journal | Theory of Computing
Volume | 14
Issue number | 20
DOIs | 
State | Published - 2018
Externally published | Yes

## Keywords

- Communication complexity
- Fooling distribution
- Information complexity
- Separation of complexity measures

## ASJC Scopus subject areas

- Theoretical Computer Science
- Computational Theory and Mathematics