TY - CONF
T1 - A recurrent Markov state-space generative model for sequences
AU - Ramachandran, Anand
AU - Lumetta, Steven S.
AU - Klee, Eric
AU - Chen, Deming
N1 - Funding Information:
This material is based upon work supported by the National Science Foundation (NSF) under Grant Nos. CNS 1624790 and CNS 1337732. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation. We thank Tanmay Gangwani, Ashok Vardhan Makkuva, and Professors Mark Hasegawa-Johnson, Saurabh Sinha, and Pramod Viswanath from the University of Illinois at Urbana-Champaign for helpful discussions.
Publisher Copyright:
© 2019 by the author(s).
PY - 2020
Y1 - 2020
N2 - While the Hidden Markov Model (HMM) is a versatile generative model of sequences capable of performing many exact inferences efficiently, it is not suited to capturing complex long-term structure in the data. Advanced state-space models based on Deep Neural Networks (DNNs) overcome this limitation but cannot perform exact inference. In this article, we present a new generative model for sequences that combines both strengths, the ability to perform exact inference and the ability to model long-term structure, by augmenting the HMM with a deterministic, continuous state variable modeled through a Recurrent Neural Network. We empirically study the performance of the model on (i) synthetic data, where we compare it to the HMM; (ii) a supervised learning task in bioinformatics, where it outperforms two DNN-based regressors; and (iii) the generative modeling of music, where it outperforms several prominent DNN-based generative models.
UR - http://www.scopus.com/inward/record.url?scp=85085017443&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85085017443&partnerID=8YFLogxK
M3 - Paper
AN - SCOPUS:85085017443
T2 - 22nd International Conference on Artificial Intelligence and Statistics, AISTATS 2019
Y2 - 16 April 2019 through 18 April 2019
ER -