Abstract
We present a novel approach to learning an HMM whose outputs are distributed according to a parametric family. This is done by decoupling the learning task into two steps: first estimating the output parameters, and then estimating the hidden state transition probabilities. The first step is accomplished by fitting a mixture model to the output stationary distribution. Given the parameters of this mixture model, the second step is formulated as the solution of an easily solvable convex quadratic program. We provide an error analysis for the estimated transition probabilities and show they are robust to small perturbations in the estimates of the mixture parameters. Finally, we support our analysis with some encouraging empirical results.
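The abstract outlines a two-step decoupling: fit a mixture model to the stationary output distribution, then recover the transition matrix from a convex quadratic program. The sketch below is one illustrative way to realize that pipeline, assuming Gaussian outputs, scikit-learn's `GaussianMixture` for step one, and `cvxpy` for the QP in step two. The specific QP objective used here (matching soft co-occurrence counts of consecutive observations) is an assumption for illustration, not necessarily the paper's exact formulation.

```python
import numpy as np
import cvxpy as cp
from sklearn.mixture import GaussianMixture


def two_step_hmm_fit(x, n_states, seed=0):
    """Illustrative two-step HMM estimation (sketch, not the paper's exact method).

    Step 1: fit a Gaussian mixture to the pooled observations, treating it as
            an estimate of the HMM's stationary output distribution.
    Step 2: estimate the transition matrix by a convex QP that matches the
            empirical co-occurrence of consecutive soft state assignments.
    """
    x = np.asarray(x, dtype=float).reshape(-1, 1)

    # --- Step 1: output parameters from the stationary mixture ---
    gmm = GaussianMixture(n_components=n_states, random_state=seed).fit(x)
    pi_hat = gmm.weights_            # estimated stationary state distribution
    resp = gmm.predict_proba(x)      # soft state assignments, shape (T, K)

    # --- Step 2: convex QP for the transition matrix ---
    # Empirical joint of consecutive (soft) states: C_hat[i, j] ~ P(s_t = i, s_{t+1} = j).
    C_hat = resp[:-1].T @ resp[1:] / (len(x) - 1)

    A = cp.Variable((n_states, n_states), nonneg=True)
    objective = cp.Minimize(cp.sum_squares(np.diag(pi_hat) @ A - C_hat))
    constraints = [cp.sum(A, axis=1) == 1]   # each row of A is a probability vector
    cp.Problem(objective, constraints).solve()
    return gmm, pi_hat, A.value
```

The row-stochasticity constraints keep the problem a small convex QP, in the spirit of the "easily solvable" second step described above; the robustness claim in the abstract corresponds to the estimate `A.value` degrading gracefully when `gmm`'s parameters are slightly off.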
| Original language | American English |
| --- | --- |
| Pages | 1739-1747 |
| Number of pages | 9 |
| State | Published - 1 Jan 2013 |
| Event | 30th International Conference on Machine Learning, ICML 2013 - Atlanta, GA, United States. Duration: 16 Jun 2013 → 21 Jun 2013 |
Conference
| Conference | 30th International Conference on Machine Learning, ICML 2013 |
| --- | --- |
| Country/Territory | United States |
| City | Atlanta, GA |
| Period | 16/06/13 → 21/06/13 |
All Science Journal Classification (ASJC) codes
- Human-Computer Interaction
- Sociology and Political Science