Lower Bounds on Information Requirements for Causal Network Inference

Xiaohan Kang, Bruce Hajek

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution

Abstract

Recovery of the causal structure of dynamic networks from noisy measurements has long been a problem of intense interest across many areas of science and engineering. Many algorithms have been proposed, but no prior work compares their performance against converse bounds in a non-asymptotic setting. As a step toward addressing this gap, this paper gives lower bounds on the error probability for causal network support recovery in a linear Gaussian setting. The bounds are based on the Bhattacharyya coefficient for binary hypothesis testing problems with mixture probability distributions. The bounds are compared with the performance achieved by two representative recovery algorithms on sparse random networks drawn from the Erdős-Rényi model.
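The basic ingredient mentioned in the abstract is the Bhattacharyya coefficient, which yields a generic lower bound on the error probability of any binary hypothesis test. The sketch below is not the paper's construction (the paper works with mixture distributions induced by a random network prior); it only illustrates, under the simplifying assumption of two multivariate Gaussian hypotheses with equal priors, how the coefficient rho = integral of sqrt(p*q) translates into the bound P_e >= (1/2)*(1 - sqrt(1 - rho^2)). The function names and example parameters are hypothetical.

```python
import numpy as np

def bhattacharyya_coefficient_gaussian(mu1, cov1, mu2, cov2):
    """Closed-form Bhattacharyya coefficient between two multivariate Gaussians."""
    cov = 0.5 * (cov1 + cov2)
    diff = mu1 - mu2
    # Bhattacharyya distance D_B, then coefficient rho = exp(-D_B).
    d_b = 0.125 * diff @ np.linalg.solve(cov, diff) + 0.5 * np.log(
        np.linalg.det(cov) / np.sqrt(np.linalg.det(cov1) * np.linalg.det(cov2))
    )
    return np.exp(-d_b)

def error_probability_lower_bound(rho):
    """Lower bound on the equal-prior Bayes error of a binary test:
    P_e >= (1/2) * (1 - sqrt(1 - rho**2)), where rho is the Bhattacharyya coefficient."""
    return 0.5 * (1.0 - np.sqrt(1.0 - rho**2))

# Hypothetical example: two observation distributions differing in one mean coordinate.
mu1, mu2 = np.zeros(3), np.array([0.5, 0.0, 0.0])
cov = np.eye(3)
rho = bhattacharyya_coefficient_gaussian(mu1, cov, mu2, cov)
print(f"Bhattacharyya coefficient: {rho:.4f}")
print(f"Error probability lower bound: {error_probability_lower_bound(rho):.4f}")
```

The bound follows from the standard relation between total variation distance and the Bhattacharyya coefficient, TV <= sqrt(1 - rho^2), combined with P_e = (1/2)*(1 - TV) for equal priors; no closed form exists for general mixtures, which in practice calls for Monte Carlo estimation of rho.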

Original language: English (US)
Title of host publication: 2021 IEEE International Symposium on Information Theory, ISIT 2021 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 754-759
Number of pages: 6
ISBN (Electronic): 9781538682098
DOIs
State: Published - Jul 12 2021
Event: 2021 IEEE International Symposium on Information Theory, ISIT 2021 - Virtual, Melbourne, Australia
Duration: Jul 12 2021 - Jul 20 2021

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings
Volume: 2021-July
ISSN (Print): 2157-8095

Conference

Conference: 2021 IEEE International Symposium on Information Theory, ISIT 2021
Country/Territory: Australia
City: Virtual, Melbourne
Period: 7/12/21 - 7/20/21

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Information Systems
  • Modeling and Simulation
  • Applied Mathematics
