Abstract
Gaussian state space models have been used for decades as generative models of sequential data. They admit an intuitive probabilistic interpretation, have a simple functional form, and enjoy widespread adoption. We introduce a unified algorithm to efficiently learn a broad class of linear and non-linear state space models, including variants where the emission and transition distributions are modeled by deep neural networks. Our learning algorithm simultaneously learns a compiled inference network and the generative model, leveraging a structured variational approximation parameterized by recurrent neural networks to mimic the posterior distribution. We apply the learning algorithm to both synthetic and real-world datasets, demonstrating its scalability and versatility. We find that using the structured approximation to the posterior results in models with significantly higher held-out likelihood.
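The generative model the abstract describes — a latent Gaussian state space model whose transition and emission means are given by neural networks — can be illustrated with a minimal sketch. This is not the paper's implementation: the network sizes, the use of small random-weight MLPs as stand-ins for learned networks, and the fixed noise scales are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(sizes):
    # Random-weight MLP standing in for a learned network (illustrative only).
    return [(rng.standard_normal((m, n)) * 0.1, np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    # Tanh hidden layers, linear output layer.
    for W, b in params[:-1]:
        x = np.tanh(x @ W + b)
    W, b = params[-1]
    return x @ W + b

latent_dim, obs_dim, T = 2, 3, 10
transition_net = mlp([latent_dim, 16, latent_dim])  # mean of p(z_t | z_{t-1})
emission_net = mlp([latent_dim, 16, obs_dim])       # mean of p(x_t | z_t)

def sample_trajectory(T, trans_noise=0.1, obs_noise=0.1):
    # Ancestral sampling: propagate the latent state, then emit an observation.
    z = np.zeros(latent_dim)
    xs = []
    for _ in range(T):
        z = forward(transition_net, z) + trans_noise * rng.standard_normal(latent_dim)
        x = forward(emission_net, z) + obs_noise * rng.standard_normal(obs_dim)
        xs.append(x)
    return np.stack(xs)

X = sample_trajectory(T)
print(X.shape)  # (10, 3)
```

With linear networks and Gaussian noise this reduces to the classical linear Gaussian state space model; the paper's contribution is learning the nonlinear case jointly with an RNN-parameterized inference network that approximates the posterior over the latent trajectory.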
| Original language | English |
|---|---|
| Pages | 2101-2109 |
| Number of pages | 9 |
| State | Published - 2017 |
| Externally published | Yes |
| Event | 31st AAAI Conference on Artificial Intelligence, AAAI 2017 - San Francisco, United States |
| Duration | 4 Feb 2017 → 10 Feb 2017 |
Conference
| Conference | 31st AAAI Conference on Artificial Intelligence, AAAI 2017 |
|---|---|
| Country/Territory | United States |
| City | San Francisco |
| Period | 4/02/17 → 10/02/17 |
All Science Journal Classification (ASJC) codes
- Artificial Intelligence