Guaranteed scalable learning of latent tree models

Furong Huang, Robert Chen, Niranjan Uma Naresh, Jimeng Sun, Ioakeim Perros, Anima Anandkumar

Research output: Contribution to conference › Paper › peer-review

Abstract

We present an integrated approach to structure and parameter estimation in latent tree graphical models. Our overall approach follows a “divide-and-conquer” strategy: we learn models over small groups of variables and iteratively merge them into a global solution. Structure learning involves combinatorial operations such as minimum spanning tree construction and local recursive grouping; parameter learning is based on the method of moments and on tensor decompositions. Our method is guaranteed to correctly recover the unknown tree structure and the model parameters with low sample complexity for the class of linear multivariate latent tree models, which includes discrete and Gaussian distributions as well as Gaussian mixtures. Our algorithm is implemented with bulk asynchronous parallelism, and its parallel computation complexity increases only logarithmically with the number of variables and linearly with the dimensionality of each variable.
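To make the structure-learning step concrete, the sketch below is a minimal illustration (not the authors' implementation) of building a minimum spanning tree over pairwise information distances between observed variables. The distance d_ij = -log|rho_ij| is a standard choice for linear/Gaussian latent tree models; the helper name information_distances is hypothetical, and the local recursive grouping step that introduces latent nodes is omitted.

    import numpy as np
    from scipy.sparse.csgraph import minimum_spanning_tree

    def information_distances(X):
        # X: (n_samples, p) data matrix. Returns the (p, p) matrix of
        # information distances d_ij = -log|rho_ij|, computed from the
        # empirical Pearson correlations rho_ij.
        rho = np.corrcoef(X, rowvar=False)
        return -np.log(np.clip(np.abs(rho), 1e-12, 1.0))

    rng = np.random.default_rng(0)
    X = rng.standard_normal((500, 6))
    X[:, 1] = 0.9 * X[:, 0] + 0.1 * rng.standard_normal(500)  # one correlated pair

    D = information_distances(X)
    mst = minimum_spanning_tree(D)       # sparse upper-triangular edge weights
    print(np.transpose(mst.nonzero()))   # edges of the MST over the 6 variables

In a full divide-and-conquer pipeline of the kind the abstract describes, such trees would be learned over small groups of variables and then merged, rather than over all variables at once.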
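For the parameter-learning side, the following sketch shows the tensor power iteration commonly used in method-of-moments estimation. It is a hypothetical illustration under the assumption that the third-order moment tensor is symmetric and orthogonally decomposable, T = sum_k w_k a_k ⊗ a_k ⊗ a_k; the function name tensor_power_iteration is not from the paper.

    import numpy as np

    def tensor_power_iteration(T, n_iter=100, seed=0):
        # T: (d, d, d) symmetric tensor. Returns one (eigenvalue,
        # eigenvector) pair via repeated contractions v <- T(I, v, v).
        rng = np.random.default_rng(seed)
        v = rng.standard_normal(T.shape[0])
        v /= np.linalg.norm(v)
        for _ in range(n_iter):
            v = np.einsum('ijk,j,k->i', T, v, v)  # contract modes 2 and 3
            v /= np.linalg.norm(v)
        lam = np.einsum('ijk,i,j,k->', T, v, v, v)
        return lam, v

    # Recover (w, a) from a rank-1 test tensor w * a ⊗ a ⊗ a.
    a = np.array([0.6, 0.8, 0.0])
    T = 2.5 * np.einsum('i,j,k->ijk', a, a, a)
    lam, v = tensor_power_iteration(T)
    print(round(lam, 3), np.round(v, 3))  # approximately 2.5 and a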

Original language: English (US)
State: Published - 2019
Externally published: Yes
Event: 35th Conference on Uncertainty in Artificial Intelligence, UAI 2019 - Tel Aviv, Israel
Duration: Jul 22, 2019 - Jul 25, 2019

Conference

Conference: 35th Conference on Uncertainty in Artificial Intelligence, UAI 2019
Country/Territory: Israel
City: Tel Aviv
Period: 7/22/19 - 7/25/19

ASJC Scopus subject areas

  • Artificial Intelligence
