Learning minimal latent directed information polytrees

Jalal Etesami, Negar Kiyavash, Todd Coleman

Research output: Contribution to journal › Article › peer-review


We propose an approach for learning latent directed polytrees whenever an appropriately defined discrepancy measure exists between the observed nodes. Specifically, we use our approach to learn directed information polytrees in settings where samples are available from only a subset of the processes. Directed information trees are a new type of probabilistic graphical model that represents the causal dynamics among a set of random processes in a stochastic system. We prove that the approach is consistent for learning minimal latent directed trees, and we analyze the sample complexity of the learning task when the empirical estimator of mutual information is used as the discrepancy measure.
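As a rough illustration of the kind of discrepancy measure the abstract mentions, the sketch below computes a plug-in (empirical) mutual information estimate from paired discrete samples. This is a generic stand-in, not the paper's directed information estimator; the function name and the toy data are assumptions for illustration only.

```python
import math
from collections import Counter

def empirical_mutual_info(xs, ys):
    """Plug-in estimate of I(X; Y) in nats from paired discrete samples.

    Illustrative sketch only: a generic mutual information estimator,
    not the directed information estimator used in the paper.
    """
    n = len(xs)
    if n == 0 or n != len(ys):
        raise ValueError("xs and ys must be non-empty and of equal length")
    px = Counter(xs)                 # empirical marginal of X
    py = Counter(ys)                 # empirical marginal of Y
    pxy = Counter(zip(xs, ys))       # empirical joint of (X, Y)
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        p_indep = (px[x] / n) * (py[y] / n)
        mi += p_joint * math.log(p_joint / p_indep)
    return mi

# Toy check: a sequence paired with an exact copy of itself has
# mutual information equal to its entropy, here log(2) for a fair binary source.
xs = [0, 0, 1, 1, 0, 1, 0, 1]
print(empirical_mutual_info(xs, xs))  # ≈ 0.693 nats (log 2)
```

Such pairwise estimates could serve as the discrepancy values between observed nodes from which a minimal latent tree is then reconstructed.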

Original language: English (US)
Pages (from-to): 1723-1768
Number of pages: 46
Journal: Neural Computation
Issue number: 9
State: Published - Sep 1 2016

ASJC Scopus subject areas

  • Arts and Humanities (miscellaneous)
  • Cognitive Neuroscience
