Illum information

Ravi Kiran Raman, Haizi Yu, Lav R. Varshney

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Shannon's mutual information measures the degree of mutual dependence between two random variables. Two related information functionals have also been developed in the literature: multiinformation, a multivariate extension of mutual information; and lautum information, the Csiszár conjugate of mutual information. In this work, we define illum information, the multivariate extension of lautum information and the Csiszár conjugate of multiinformation. We provide operational interpretations of this functional, including in the problem of independence testing of a set of random variables. Further, we provide informational characterizations of illum information, such as the data processing inequality and the chain rule for distributions on tree-structured graphical models. Finally, as illustrative examples, we compute the illum information for Ising models and Gauss-Markov random fields.
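
A minimal formal sketch of the quantities named above, assuming the standard definitions of lautum information (Palomar and Verdú) and multiinformation; the closed form for illum information and the symbol Λ are inferred here from the abstract's description and need not match the paper's own notation:

\[
I(X;Y) = D\!\left(P_{X,Y} \,\middle\|\, P_X P_Y\right), \qquad
L(X;Y) = D\!\left(P_X P_Y \,\middle\|\, P_{X,Y}\right),
\]
\[
M(X_1,\ldots,X_n) = D\!\left(P_{X_1,\ldots,X_n} \,\middle\|\, \prod_{i=1}^{n} P_{X_i}\right), \qquad
\Lambda(X_1,\ldots,X_n) = D\!\left(\prod_{i=1}^{n} P_{X_i} \,\middle\|\, P_{X_1,\ldots,X_n}\right),
\]

where $D(\cdot \,\|\, \cdot)$ denotes Kullback-Leibler divergence, $I$ is mutual information, $L$ is lautum information, $M$ is multiinformation, and $\Lambda$ stands for illum information under this reading (the Csiszár conjugate of multiinformation, reducing to lautum information when $n = 2$). On this reading, $\Lambda$ vanishes if and only if $X_1, \ldots, X_n$ are mutually independent, consistent with the independence-testing interpretation mentioned in the abstract.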

Original language: English (US)
Title of host publication: 2017 Information Theory and Applications Workshop, ITA 2017
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781509052936
DOI: 10.1109/ITA.2017.8023479
State: Published - Aug 30, 2017
Event: 2017 Information Theory and Applications Workshop, ITA 2017 - San Diego, United States
Duration: Feb 12, 2017 - Feb 17, 2017

Publication series

Name: 2017 Information Theory and Applications Workshop, ITA 2017

Fingerprint

  • Random variables
  • Ising model
  • Testing

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Software
  • Computational Theory and Mathematics

Cite this

Raman, R. K., Yu, H., & Varshney, L. R. (2017). Illum information. In 2017 Information Theory and Applications Workshop, ITA 2017 [8023479] (2017 Information Theory and Applications Workshop, ITA 2017). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ITA.2017.8023479
