TY - GEN
T1 - Collapsed variational inference for Sum-Product Networks
AU - Zhao, Han
AU - Adel, Tameem
AU - Gordon, Geoff
AU - Amos, Brandon
N1 - Funding Information:
HZ and GG gratefully acknowledge support from ONR contract N000141512365. TA is partially funded by the Netherlands NWO, project 612.001.119. BA acknowledges support from the National Science Foundation (NSF) grant number CNS-1518865, the Intel Corporation, Google, Vodafone, NVIDIA, and the Conklin Kistler family fund.
PY - 2016
Y1 - 2016
N2 - Sum-Product Networks (SPNs) are probabilistic inference machines that admit exact inference in linear time in the size of the network. Existing parameter learning approaches for SPNs are largely based on the maximum likelihood principle and are subject to overfitting compared with more Bayesian approaches. Exact Bayesian posterior inference for SPNs is computationally intractable. Even approximation techniques such as standard variational inference and posterior sampling for SPNs are computationally infeasible for networks of moderate size due to the large number of local latent variables per instance. In this work, we propose a novel deterministic collapsed variational inference algorithm for SPNs that is computationally efficient, easy to implement, and at the same time allows us to incorporate prior information into the optimization formulation. Extensive experiments show a significant improvement in accuracy compared with a maximum likelihood based approach.
UR - http://www.scopus.com/inward/record.url?scp=84999006645&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84999006645&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:84999006645
T3 - 33rd International Conference on Machine Learning, ICML 2016
SP - 1981
EP - 2000
BT - 33rd International Conference on Machine Learning, ICML 2016
A2 - Weinberger, Kilian Q.
A2 - Balcan, Maria-Florina
PB - International Machine Learning Society (IMLS)
T2 - 33rd International Conference on Machine Learning, ICML 2016
Y2 - 19 June 2016 through 24 June 2016
ER -