Abstract
While the Hidden Markov Model (HMM) is a versatile generative model of sequences capable of performing many exact inferences efficiently, it is not well suited for capturing complex long-term structure in the data. Advanced state-space models based on Deep Neural Networks (DNNs) overcome this limitation but cannot perform exact inference. In this article, we present a new generative model for sequences that combines both aspects, the ability to perform exact inference and the ability to model long-term structure, by augmenting the HMM with a deterministic, continuous state variable modeled through a Recurrent Neural Network. We empirically study the performance of the model on (i) synthetic data, where we compare it to the HMM, (ii) a supervised learning task in bioinformatics, where it outperforms two DNN-based regressors, and (iii) generative modeling of music, where it outperforms many prominent DNN-based generative models.
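The construction described above can be illustrated with a minimal sketch: a deterministic RNN state, computed from past observations, modulates the HMM transition matrix at each step, and the forward algorithm still gives an exact likelihood because that state never needs to be marginalised. This is only an illustration under assumed design choices (tanh RNN cell, Gaussian emissions, hypothetical names `W_h`, `W_x`, `W_A`, `mu` and toy dimensions), not the authors' implementation.

```python
# Illustrative sketch (hypothetical parameterisation, not the paper's code):
# an HMM whose transition logits at step t depend on a deterministic RNN state h_t,
# itself a function of x_{<t}; exact inference over the discrete state remains tractable.
import numpy as np

rng = np.random.default_rng(0)

K, H, D = 3, 8, 2                              # discrete states, RNN state size, obs. dim (assumed)
W_h = rng.normal(scale=0.1, size=(H, H))       # RNN recurrence weights
W_x = rng.normal(scale=0.1, size=(H, D))       # RNN input weights
W_A = rng.normal(scale=0.1, size=(K * K, H))   # maps RNN state -> transition logits
mu = rng.normal(size=(K, D))                   # Gaussian emission mean per discrete state

def log_gauss(x, mean):
    # isotropic unit-variance Gaussian log-density, one value per discrete state
    return -0.5 * np.sum((x - mean) ** 2, axis=-1) - 0.5 * D * np.log(2 * np.pi)

def forward_loglik(xs):
    """Exact log-likelihood via the forward algorithm; h_t is deterministic given x_{<t}."""
    h = np.zeros(H)
    log_alpha = np.log(np.full(K, 1.0 / K)) + log_gauss(xs[0], mu)   # uniform initial state
    for t in range(1, len(xs)):
        h = np.tanh(W_h @ h + W_x @ xs[t - 1])                       # deterministic RNN update
        logits = (W_A @ h).reshape(K, K)
        log_A = logits - np.logaddexp.reduce(logits, axis=1, keepdims=True)  # row-normalised
        log_alpha = (np.logaddexp.reduce(log_alpha[:, None] + log_A, axis=0)
                     + log_gauss(xs[t], mu))
    return np.logaddexp.reduce(log_alpha)

xs = rng.normal(size=(20, D))                  # toy observation sequence
print(forward_loglik(xs))
```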
| Original language | English (US) |
| --- | --- |
| State | Published - 2020 |
| Event | 22nd International Conference on Artificial Intelligence and Statistics, AISTATS 2019 - Naha, Japan |
| Duration | Apr 16 2019 → Apr 18 2019 |
Conference
| Conference | 22nd International Conference on Artificial Intelligence and Statistics, AISTATS 2019 |
| --- | --- |
| Country/Territory | Japan |
| City | Naha |
| Period | 4/16/19 → 4/18/19 |
ASJC Scopus subject areas
- Artificial Intelligence
- Statistics and Probability