TY - JOUR
T1 - Sequential Empirical Coordination under an Output Entropy Constraint
AU - Shafieepoorfard, Ehsan
AU - Raginsky, Maxim
N1 - Funding Information:
Manuscript received October 27, 2017; revised May 19, 2018; accepted May 24, 2018. Date of publication June 18, 2018; date of current version September 13, 2018. This work was supported in part by NSF under CAREER Award CCF-1254041 and in part by the Center for Science of Information, an NSF Science and Technology Center, under Grant Agreement CCF-0939370. This paper was presented at the 2016 IEEE Conference on Decision and Control.
Publisher Copyright:
© 2018 IEEE.
PY - 2018/10
Y1 - 2018/10
AB - This paper considers the problem of sequential empirical coordination, where the objective is to achieve a given value of the expected uniform deviation between the state-action empirical averages and their statistical expectations under a given strategic probability measure, with respect to a given universal Glivenko-Cantelli class of test functions. A communication constraint is imposed on the Shannon entropy of the resulting action sequence. It is shown that the fundamental limit on the output entropy is given by the minimum of the mutual information between the state and action processes over all strategic measures that have the same marginal state process as the target measure and approximate the target measure to the desired accuracy with respect to the underlying Glivenko-Cantelli seminorm. This fundamental limit is shown to be asymptotically achievable by tree-structured codes.
KW - Coordination via communication
KW - causal source coding
KW - empirical processes
KW - sequential rate distortion
UR - http://www.scopus.com/inward/record.url?scp=85048640255&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85048640255&partnerID=8YFLogxK
U2 - 10.1109/TIT.2018.2848592
DO - 10.1109/TIT.2018.2848592
M3 - Article
AN - SCOPUS:85048640255
VL - 64
SP - 6830
EP - 6841
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
SN - 0018-9448
IS - 10
M1 - 8387794
ER -