Sequential Empirical Coordination under an Output Entropy Constraint

Ehsan Shafieepoorfard, Maxim Raginsky

Research output: Contribution to journal › Article

Abstract

This paper considers the problem of sequential empirical coordination, where the objective is to achieve a given value of the expected uniform deviation between the state-action empirical averages and the corresponding statistical expectations under a given strategic probability measure, with respect to a given universal Glivenko-Cantelli class of test functions. A communication constraint is imposed on the Shannon entropy of the resulting action sequence. It is shown that the fundamental limit on the output entropy is given by the minimum of the mutual information between the state and the action processes over all strategic measures that have the same marginal state process as the target measure and approximate the target measure to the desired accuracy with respect to the underlying Glivenko-Cantelli seminorm. The fundamental limit is shown to be asymptotically achievable by tree-structured codes.
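In illustrative notation (the symbols below are chosen for exposition and are not taken verbatim from the paper), write P for the target strategic measure, P_X for its marginal state process, F for the universal Glivenko-Cantelli class with induced seminorm \|\cdot\|_{\mathcal{F}}, and \varepsilon for the desired accuracy. The fundamental limit described in the abstract can then be sketched as

    H^\star(\varepsilon) \;=\; \inf_{Q}\Big\{\, I_Q(X;A) \;:\; Q_X = P_X,\ \ \|Q - P\|_{\mathcal{F}} \le \varepsilon \,\Big\},

where the infimum runs over strategic measures Q on state-action sequences and I_Q(X;A) denotes the mutual information between the state and action processes under Q.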

Original language: English (US)
Article number: 8387794
Pages (from-to): 6830-6841
Number of pages: 12
Journal: IEEE Transactions on Information Theory
Volume: 64
Issue number: 10
DOIs
State: Published - Oct 2018

Keywords

  • Coordination via communication
  • causal source coding
  • empirical processes
  • sequential rate distortion

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences
